Getting Autocomplete to work in WebStorm

I love my JetBrains IDEs and would like to get autocomplete working in WebStorm. I have tried adding the .js files from the SparkAR install directory to the libraries, but no luck getting meaningful documentation or autocomplete. I’ve seen it work better in VS Code (though not always, for some reason?)
Did anyone else try this, and maybe get it to work?
If no, do you have an idea what I could try?
Thank you!

I’m not familiar with that IDE, but in VSC I use inline JSDoc to get autocomplete to work. I think you can also add your own .ts files with descriptions, but two separate files to maintain and monitor is too much for me.


Hey @Tomas_Pietravallo, thanks for the hint! As far as I understand it, JSDoc is used for generating documentation for my own methods etc., like Javadoc? I’m looking for a way to get even the “standard” autocomplete to work, with the SparkAR-standard types and methods… or am I misunderstanding?

Hi @monogon, yes that’s true, but you can also specify the types of variables. E.g.:

const Scene = require('Scene');

/** @type {SceneObjectBase} */
let something = await Scene.root.findFirst('something');

The code above will get an object from the scene, and you will get working autocomplete and IntelliSense on the something variable (at least in VSC). Note that the JSDoc and the actual type may not match, e.g.:

/** @type { Boolean } */
let a = 'a string'

The code will still run, but VSC will complain. In the { SceneObjectBase } example it doesn’t complain, because the return type is Promise<any>; the JSDoc could claim any type and the editor wouldn’t object, but of course you want SceneObjectBase and not some other type.
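To see concretely what this means, here is a small standalone Node.js sketch (no Spark AR modules involved, so it runs anywhere): the JSDoc annotation is a plain comment at runtime, and only a type-aware editor flags the mismatch.

```javascript
// @ts-check  (opts this file into editor type checking in VS Code)

/** @type {boolean} */
let flag = 'a string'; // editor warning: string is not assignable to boolean

// At runtime the annotation has no effect; the assignment just happens.
console.log(typeof flag); // 'string'
```

This is the same behavior described above: the code runs, the editor complains.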


Very cool, thanks! Do you have any idea how to get the built-in autocomplete to work as well? That’s my original question… or is that not even possible?
How do I get, for example, Diagnostics.log() to autocomplete?
Also, I’m missing hover-over documentation… I can enter a URL for an external library, and I tried that, but so far no luck. It just doesn’t seem to recognize the SparkAR built-in library at all (or framework, or whatever it is…)

I got it to work by defining a “scope” for my project: I clicked Edit Scopes as in the screenshot, created a new “local scope”, and then selected the autogenerated tsconfig and my project file(s). It works absolutely beautifully. I haven’t got the docs to work yet, but I think I’ll figure it out somehow.

I still have the question of how JS and TypeScript interact in SparkAR code; maybe someone knows more? The way I understand it now, the scripts are written in pure JS, and the TS is only there for this config file? (Like Groovy or Kotlin being used to configure Gradle in a Java project.)

Or is there a way to use TypeScript? I don’t know much about TS (as you can maybe tell) but would like to learn more, and if I can do that while making filters that would be even better.

PS: Also, @Tomas_Pietravallo, your posts just became super valuable to me as well, because now creating docs for my own functions finally makes sense! This is great, thanks again! (In your first example there is a stray dot after @type though, which I also copy-pasted… maybe you can edit that out for future generations.)
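For documenting your own functions, @param and @returns are the JSDoc tags that feed hover docs and autocomplete. A minimal runnable sketch (the function is illustrative, not from this thread):

```javascript
/**
 * Linearly interpolate between two numbers.
 * With these tags, VSC/WebStorm show the parameter docs on hover and
 * autocomplete Number methods on the return value.
 * @param {number} from - start value
 * @param {number} to - end value
 * @param {number} t - interpolation factor, usually in [0, 1]
 * @returns {number} the interpolated value
 */
function lerp(from, to, t) {
  return from + (to - from) * t;
}

console.log(lerp(0, 10, 0.5)); // 5
```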

Here is a link on how to format the @type line:


Oops, yes, you are right, that dot shouldn’t be there. I was typing directly on the keyboard rather than copying from actual code, so it slipped in :sweat_smile:.

I don’t think Spark supports TypeScript, although as I understand it, the internals ship any .ts file just like any .js file. I think @josh_beckwith has explored TypeScript in Spark, so maybe he has more knowledge on this topic.


Here’s an example of how to use the TypeScript compiler in Spark, although I couldn’t get the typedefs to actually work in VSCode:


Unfortunately, on some of my projects the typedefs don’t get generated in that folder for some reason… so I have to copy over old, probably incorrect ones, e.g. from v98 to v100…
Does anyone know when they get generated? What did I do last time so that the skylight-typedefs directory had all the stubs in it…?
As for the TypeScript: I had a look at your repo, @josh_beckwith, thanks a bunch for that! I haven’t tried it yet, but the way I understood it, you have an extra directory with your actual TS in it, and that somehow gets transpiled into the “usual” scripts folder in the Spark AR project directory, is that correct? Sounds good! And then the typedefs didn’t work, even though they normally do for you?

EDIT: The missing typedefs might be a v101 bug… I found someone mentioning it in the FB group today. I’ll copy the v100 typedefs for now.
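For reference, a setup like the one described above might use a tsconfig along these lines. This is purely a hypothetical sketch, not taken from the repo; the ts/ source directory and the specific compiler options are assumptions:

```
// tsconfig.json (hypothetical sketch)
{
  "compilerOptions": {
    "target": "ES2017",     // roughly the flavor of JS Spark AR scripts use
    "module": "commonjs",   // Spark AR scripts load modules via require()
    "outDir": "./scripts",  // transpiled JS lands in the usual scripts folder
    "strict": true
  },
  "include": ["ts/**/*"]    // hypothetical TS source directory
}
```

Running tsc (or tsc --watch) would then keep the scripts folder in sync with the TS sources.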

The typedefs worked to a degree, but for some reason a bunch of them are missing, so it falls apart pretty quickly.