Performance issues with large Robot Framework projects #518

Unanswered
petrieh asked this question in Q&A

Hello all,

We’re running into performance and memory issues when using RobotCode on large Robot Framework projects.

I’ve uploaded a dummified version of our real project here:
👉 https://github.com/petrieh/dummified-robot

It’s about 460 MB of .robot and .resource files, structured to follow good design practices (no global resources, modular keywords, clear hierarchy). Even though it contains only simple keywords and calls, RobotCode takes ~30-40 minutes to analyze it and uses around 6 GB of RAM, which isn’t freed afterwards. The real project takes roughly 3 hours and uses up to 12 GB of RAM.

This makes RobotCode hard to use in Dev Containers, since it seems to reanalyze everything each time the container starts, even when nothing has changed. The extension data is mounted into the container and kept intact between restarts.

Component         Version
RobotCode         1.9.0
VS Code           1.105.0
Python            3.9.21
Robot Framework   7.2.2
OS                Rocky Linux 9.6

❓ Questions

  • Is there any way to speed up or optimize analysis for large projects?
  • Can analysis results be cached and reused between container sessions, or could lazy (on-demand) analysis be used instead?
  • How can we profile or debug the analyzer to locate the bottlenecks?

I’m happy to help investigate or fix the issue — I’d just need some guidance on where to start inside the RobotCode codebase.


Replies: 2 comments 1 reply

Comment options

Thank you for sharing such a detailed case! 👏 We also face similar challenges with large Robot Framework projects, so your analysis is valuable.
Regarding your questions:

  • The analysis time and memory usage seem excessive, especially considering that the files are simple and well structured. This suggests the analyzer may be processing more than it should, perhaps re-analyzing dependencies or not using its cache correctly.
  • About caching between container sessions: even with the volume mounted, RobotCode may be ignoring the cache if it detects changes in the environment. It would be worth checking whether there is a flag or setting that forces a full re-analysis.
  • To investigate the bottleneck, I recommend enabling RobotCode's debug mode (if available) or using tools such as py-spy or memory_profiler to identify where the time and RAM are being spent (see the sketch right after this list). Focusing on the files that take the longest to process may reveal patterns.
  • A suggestion for the project: incremental analysis or lazy loading of files might help. If you are willing to contribute that, it would be a great addition for the community!
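For reference, a minimal way to attach py-spy to the already running language-server process; the process lookup and the PID are placeholders, and inside a container py-spy typically needs the SYS_PTRACE capability or root:

```
# Find the RobotCode language-server process (name/arguments vary by version)
ps aux | grep -i robotcode

# Live view of where CPU time is going (replace <PID> with the server's PID)
py-spy top --pid <PID>

# Record a flame graph over five minutes of analysis, then inspect profile.svg
py-spy record -o profile.svg --duration 300 --pid <PID>
```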
1 reply
Comment options

Yes, I could start investigating this in more detail at the beginning of 2026; right now I'm a bit busy. I guess the easiest approach is to monkeypatch Robot Framework's get_tokens and check how many times it is called for each resource or test suite file (a rough sketch of that is below).
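A minimal sketch of that idea, assuming the analyzer ends up in robot.api.get_tokens (which entry point RobotCode actually uses is an assumption, so the patch target may need adjusting):

```python
# count_get_tokens.py - count how often each file is tokenized (patch target is an assumption)
from collections import Counter

import robot.api

_call_counts = Counter()
_original_get_tokens = robot.api.get_tokens


def _counting_get_tokens(source, *args, **kwargs):
    # 'source' can be a path, a string, or an open file; counting per value
    # makes files that get re-tokenized over and over stand out.
    _call_counts[str(source)] += 1
    return _original_get_tokens(source, *args, **kwargs)


# Note: code that already did `from robot.api import get_tokens` keeps the original
# reference, so this patch has to run before the analyzer imports it, or it has to
# target the module RobotCode actually imports from.
robot.api.get_tokens = _counting_get_tokens


def report(top=20):
    """Print the most frequently tokenized resource/suite files."""
    for source, count in _call_counts.most_common(top):
        print(f"{count:6d}  {source}")
```

If it turns out RobotCode parses via a different function (e.g. get_model or its own wrapper), the same counting-wrapper idea applies there instead.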

Comment options

I switched from robocorp/robotframework-lsp to this extension, and overall it seems to work OK.

However, there are some performance issues: Find All References (Shift + F12) does not open, and the process under .vscode-server/extensions/d-biehl.robotcode-2.1.0 uses 100% CPU for 45+ minutes and keeps going. It would be nice to see what it is trying to do, at least in the extension's output channel in VS Code.

0 replies