It sounds like your browser has cached the larger, unoptimized version of the model, and the allocation meter is still counting it against the cap even though the model has been deleted. Try removing the large model, publishing the now-empty scene to clear the cached asset, and then adding the optimized version to see if the meter resets.
That worked! Thanks!
I was working on an AR scene in the STYLY studio. I had uploaded some of my scanned models. They were quite large (high poly count) and also featured large 8K textures. When I put them into my scene, the memory allocation meter jumped to over 100%, so I deleted the models and opted instead to use optimized versions.
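For context on why a few 8K textures can max out the meter on their own: an uncompressed 8K RGBA texture occupies hundreds of megabytes in memory before any geometry is counted. A rough back-of-the-envelope sketch (the function and its assumptions are illustrative, not part of STYLY; actual usage depends on the engine's texture compression):

```python
# Back-of-the-envelope GPU memory estimate for a texture, to show why
# a single 8K texture can dominate a scene's memory budget.
# Assumes uncompressed RGBA (4 bytes per pixel); engines that compress
# textures will use considerably less.

def texture_memory_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate in-memory size of an uncompressed texture, in MB."""
    base = width * height * bytes_per_pixel
    # A full mipmap chain adds roughly one third on top of the base level.
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 * 1024)

print(round(texture_memory_mb(8192, 8192)))  # 8K texture: ~341 MB
print(round(texture_memory_mb(2048, 2048)))  # 2K texture: ~21 MB
```

Downscaling from 8K to 2K cuts the texture footprint by a factor of sixteen, which is usually the single biggest win when optimizing scanned models.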
However, even after the larger, unoptimized models had been deleted from the scene, the memory allocation meter still reported over 100% when I added the new, optimized versions. The only things in the scene were my optimized models and the default assets (Your Position, directional light, etc.). As expected, this scene crashes periodically and also will not publish properly.
Strangely, if I create a new scene and add the optimized model, the memory allocation meter works correctly and the scene publishes without issue. Any idea what is going on?