
feat(ui): add Token Estimator link to footer #337


Draft
HmbleCreator wants to merge 20 commits into coderamp-labs:main from HmbleCreator:main

Conversation


HmbleCreator commented Jun 29, 2025

Summary

This PR adds a "Token Estimator" link to the footer of the Gitingest web frontend. The link points to https://gitingest.com/tokencount, allowing users to easily access a tool for estimating token counts in pasted text.

Details

  • UI Change:

    • Added a new link labeled Token Estimator (with a counter icon) to the left column of the footer, alongside the Extension and Python package links.
    • The link opens in a new tab and is styled consistently with the other resource links.
    • No changes were made to the CLI or Python package.
  • Why:

    • Issue #318 asks for a token estimator utility; this link gives users an easy entry point to it from the web UI.

  • How to test:

    1. Start the server (cd src && uvicorn server.main:app --reload).
    2. Open the app in your browser.
    3. Scroll to the footer and verify the "Token Estimator" link appears and opens https://gitingest.com/tokencount in a new tab.

Screenshot

Screenshot 2025-06-29 205833


Closes #318

- Adds a "Token Estimator" link (with icon) to the left column of the site footer.
- Link points to https://gitingest.com/tokencount and opens in a new tab.
- Styled consistently with other resource links.
- No changes to CLI or Python package.
Closes coderamp-labs#318 
Member

@HmbleCreator Thanks, but this is just the link; it seems to me that #318 would require actually implementing the backend route for this new feature. Do you think you can work on this?

Author

HmbleCreator commented Jun 30, 2025

@HmbleCreator Thanks, but this is just the link; it seems to me that #318 would require actually implementing the backend route for this new feature. Do you think you can work on this?

Thanks for pointing that out! Yes, I’d be happy to implement the backend route for token estimation as part of #318. I can add an endpoint (e.g. /api/estimate-tokens) that accepts raw text, estimates the token count using tiktoken or a similar utility, and returns the result. Let me know if you have a preferred response schema or want it namespaced differently; I'd be happy to get started.
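
For reference, a minimal sketch of what such an endpoint could look like with FastAPI and tiktoken. The path /api/estimate-tokens comes from the comment above; the request/response models, field names, and fallback encoding are illustrative assumptions, not the actual Gitingest implementation.

```python
from fastapi import FastAPI
from pydantic import BaseModel
import tiktoken

app = FastAPI()

class EstimateRequest(BaseModel):
    text: str
    model: str = "gpt-4o"  # assumed default; any model name tiktoken recognizes

class EstimateResponse(BaseModel):
    model: str
    token_count: int

@app.post("/api/estimate-tokens", response_model=EstimateResponse)
async def estimate_tokens(req: EstimateRequest) -> EstimateResponse:
    try:
        # Look up the encoding registered for the requested model.
        encoding = tiktoken.encoding_for_model(req.model)
    except KeyError:
        # Unknown model names fall back to a general-purpose encoding.
        encoding = tiktoken.get_encoding("cl100k_base")
    return EstimateResponse(model=req.model, token_count=len(encoding.encode(req.text)))
```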

Author

HmbleCreator commented Jun 30, 2025

Are we only going to support OpenAI models, or also other models, like the open-source models available via Hugging Face?

... text
- Add GET endpoint for /api/tokencount with interactive documentation
- Include API usage examples and interactive form for testing
Author

The token counting functionality uses two efficient tokenizer libraries:

  • tiktoken for OpenAI models - provides fast and accurate token counting for GPT models
  • autotiktokenizer for non-OpenAI models - enables efficient token counting across a wide range of open-source models

This dual approach ensures we can provide fast and accurate token counting regardless of the model being used, while keeping the implementation lightweight and efficient.
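
For illustration, a sketch of how this dispatch could look. It assumes autotiktokenizer exposes AutoTikTokenizer.from_pretrained(), which loads a Hugging Face tokenizer and returns a tiktoken-compatible encoder; the model list and function name are placeholders, not the code in this PR.

```python
import tiktoken

# Illustrative set of model names routed to tiktoken directly.
OPENAI_MODELS = {"gpt-4o", "gpt-4", "gpt-3.5-turbo"}

def count_tokens(text: str, model: str) -> int:
    if model in OPENAI_MODELS:
        # OpenAI models: tiktoken ships the encodings.
        encoding = tiktoken.encoding_for_model(model)
    else:
        # Open-source models (e.g. Hugging Face IDs): convert the HF tokenizer
        # into a tiktoken-style encoder via autotiktokenizer (assumed API).
        from autotiktokenizer import AutoTikTokenizer
        encoding = AutoTikTokenizer.from_pretrained(model)
    return len(encoding.encode(text))
```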

- Add tokencount_api.jinja for styled token estimator documentation
- Update /api/tokencount GET endpoint to render the new template
- Documentation now matches the look and feel of other API docs
Contributor

ix-56h commented Jun 30, 2025

I'll test it out in a few hours.

I see that both the logic and the routing are defined and implemented within the server entrypoint; maybe move this to the "routers/" folder.
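
For what it's worth, a sketch of that layout, assuming the entrypoint is src/server/main.py and a routers/ package; the module, class, and field names are illustrative, not the PR's actual code.

```python
# src/server/routers/tokencount.py (hypothetical module)
from fastapi import APIRouter
from pydantic import BaseModel
import tiktoken

router = APIRouter(prefix="/api", tags=["tokencount"])

class TokenCountRequest(BaseModel):
    text: str
    model: str = "gpt-4o"

@router.post("/tokencount")
async def tokencount(req: TokenCountRequest) -> dict:
    encoding = tiktoken.encoding_for_model(req.model)
    return {"model": req.model, "token_count": len(encoding.encode(req.text))}

# The server entrypoint then only wires the router in:
#   from server.routers.tokencount import router as tokencount_router
#   app.include_router(tokencount_router)
```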

filipchristiansen and others added 4 commits June 30, 2025 22:11
...count, and backend cleanup
- Removed GET /api/tokencount route that rendered a Jinja template; /api/tokencount is now POST-only and returns JSON.
- Renamed and refactored tokencount_api.jinja to tokencount.jinja, using a select dropdown for model selection and matching git_form.jinja UX.
- Updated footer link to point to /tokencount (user-facing form) instead of /api/tokencount.
- Ensured /tokencount is the user-facing form (GET/POST, Jinja) and /api/tokencount is API-only (POST, JSON).
- Updated dependency management: added autotiktokenizer to requirements.txt and pyproject.toml, and added a guideline to CONTRIBUTING.md.
- Cleaned up unused imports and dead code in backend.
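
With that split, the JSON API could be exercised roughly like this; the field names follow the sketches above and are assumptions, since the PR's actual schema may differ.

```python
import requests

resp = requests.post(
    "http://127.0.0.1:8000/api/tokencount",
    json={"text": "Hello, Gitingest!", "model": "gpt-4o"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # expected shape: {"model": "...", "token_count": <int>}
```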
Author

Hi!
I wanted to ask—does the UI theme/design you use for Gitingest have a name or is it based on a particular design system? I liked the style; it’s clean, modern, and playful!
Additionally, I wanted to express my gratitude for your guidance and patience over the past two to three days. As a newbie, it was a great experience learning about FastAPI, backend/frontend separation, API best practices, and collaborative open-source workflows. I learned a great deal and truly appreciate the opportunity to contribute!
Thanks again!

Contributor

ix-56h commented Jul 1, 2025

Hi! I wanted to ask—does the UI theme/design you use for Gitingest have a name or is it based on a particular design system?

Yes, neobrutalism; you can find it at neobrutalism.com.

I wanted to express my gratitude for your guidance and patience over the past two to three days. As a newbie, it was a great experience learning about FastAPI, backend/frontend separation, API best practices, and collaborative open-source workflows. I learned a great deal and truly appreciate the opportunity to contribute! Thanks again!

You're welcome :)

Author

Hey, is there any issue with merging, or any issue with the code?

Contributor

ix-56h commented Jul 25, 2025

@HmbleCreator I'm sorry, we're late on this one.

We currently have some urgent topics, and we don't know yet if we still want to add a dedicated tiktoken feature to the frontend app.

I'll put this in draft for now.


This pull request has merge conflicts that must be resolved before it can be merged.


github-actions bot commented Aug 1, 2025

This pull request has resolved merge conflicts and is ready for review.


github-actions bot commented Aug 5, 2025

This pull request has merge conflicts that must be resolved before it can be merged.

Reviewers

ix-56h (awaiting requested review; requested changes must be addressed to merge this pull request)

Development

Successfully merging this pull request may close these issues.

feat: include a token estimator utility
