Optimization Fetch() and FetchAsync() #1119


Closed
BFuerchau wants to merge 3 commits into FirebirdSQL:master from BFuerchau:master

Conversation


@BFuerchau commented Jun 28, 2023

Advantages

  • Less object creation during fetch
  • Less GC activity

    In the previous version, n DbValue objects were created for every single row.
    Now only an object[] array is created per row to store the raw values.
    When a row is fetched, its object[] is copied into a single DbValue[] array, so the reader always gets the same DbValue row, just filled with the new values.
    The per-row object[] arrays still have to be collected, as the DbValue[] arrays were before, but in the end only n DbValue objects have to be destroyed.
    With hundreds, thousands or more rows this saves allocation and GC time, which can amount to 10% or more of overall performance (see the sketch after this list).
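A minimal sketch of the pattern described above, using hypothetical names (the actual members in GdsStatement and FbDataReader differ): rows are buffered as plain object[] arrays during fetch, and a single reusable DbValue[] row is filled when the reader advances, so only one set of DbValue objects exists per statement instead of one per row.

```csharp
// Sketch only: illustrates the reuse pattern, not the provider's real classes.
using System.Collections.Generic;

class DbValue
{
    public object Value { get; private set; }
    public void SetValue(object value) => Value = value;   // reuse instead of re-allocating
}

class RowBufferSketch
{
    private readonly Queue<object[]> _rows = new Queue<object[]>(); // raw values only
    private readonly DbValue[] _dbValues;                           // created once per statement

    public RowBufferSketch(int fieldCount)
    {
        _dbValues = new DbValue[fieldCount];
        for (var i = 0; i < fieldCount; i++)
            _dbValues[i] = new DbValue();
    }

    // Fetch side: store only the raw object[] per row (no DbValue allocations).
    public void EnqueueFetchedRow(object[] rawValues) => _rows.Enqueue(rawValues);

    // Reader side: copy the raw values into the single reusable DbValue[] row.
    public DbValue[] GetNextRow()
    {
        if (_rows.Count == 0)
            return null;

        var raw = _rows.Dequeue();
        for (var i = 0; i < _dbValues.Length; i++)
            _dbValues[i].SetValue(raw[i]);

        return _dbValues;   // same array every time, only the values change
    }
}
```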

BFuerchau added 2 commits June 29, 2023 00:16

Performance optimization
- object[] instead of DbValue[] in _rows
- single _DbValues[]
Advantages
- less object creation
- less GC
- faster loading of _rows

Corresponding overrides from version 10 GdsStatement.cs
Author

BFuerchau commented Jun 28, 2023
edited

I hope I didn't introduce any transmission errors.
It seems I can't create a separate pull request for the FbDataReader.
Can you help once more?

Optimized GetValues()
Optimized GetSchemaTable/GetSchemaTableAsync
Instead of requesting each field with a separate server request, all fields from all known tables are loaded in a single request.
Sometimes more is loaded than needed, but one request plus local processing of the fields is faster than n requests.
With auto-commit, each request otherwise runs in its own implicit transaction.
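A rough sketch of that batching idea, using a hypothetical helper over plain ADO.NET types (the provider's real schema query is more involved): instead of one round trip per column, the field names of all tables of interest are read from Firebird's RDB$RELATION_FIELDS system table in a single query.

```csharp
// Sketch only: one server round trip for the columns of many tables,
// instead of one request per column.
using System.Collections.Generic;
using System.Data.Common;

static class SchemaBatchSketch
{
    public static Dictionary<string, List<string>> LoadFields(
        DbConnection connection, IReadOnlyList<string> tableNames)
    {
        var result = new Dictionary<string, List<string>>();
        if (tableNames.Count == 0)
            return result;

        using (var command = connection.CreateCommand())
        {
            // Build one IN (...) list; real code would have to respect
            // parameter-count limits and identifier quoting rules.
            var placeholders = new List<string>();
            for (var i = 0; i < tableNames.Count; i++)
            {
                var parameter = command.CreateParameter();
                parameter.ParameterName = "@t" + i;
                parameter.Value = tableNames[i];
                command.Parameters.Add(parameter);
                placeholders.Add("@t" + i);
            }

            command.CommandText =
                "SELECT RDB$RELATION_NAME, RDB$FIELD_NAME " +
                "FROM RDB$RELATION_FIELDS " +
                "WHERE RDB$RELATION_NAME IN (" + string.Join(", ", placeholders) + ")";

            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    var table = reader.GetString(0).TrimEnd();   // CHAR columns are space padded
                    var field = reader.GetString(1).TrimEnd();
                    if (!result.TryGetValue(table, out var fields))
                        result[table] = fields = new List<string>();
                    fields.Add(field);
                }
            }
        }

        return result;
    }
}
```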
Member

This PR contains unrelated changes in FbDataReader from the other PR.

}
}

[MethodImpl(MethodImplOptions.AggressiveInlining)]
Member

@cincuranet commented Jul 10, 2023


Why aggressive inlining?

Author

I noticed that the call really does take time.
Without inlining the speed goes down a little.

Member

But did you consider other effects of inlining? The JIT does a great job 99% of the time with its heuristics on general code. You can make the numbers look nice in (micro)benchmarks, but that doesn't mean the benefit will be there in a real-world scenario.
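For context, the attribute being discussed is only a hint to the JIT; a minimal illustration (not code from the provider):

```csharp
using System.Runtime.CompilerServices;

class Row
{
    private readonly object[] _values;

    public Row(object[] values) => _values = values;

    // AggressiveInlining asks the JIT to inline this accessor even where its own
    // heuristics would not; the JIT may still refuse, and inlining larger methods
    // can grow code size and hurt instruction-cache behavior.
    [MethodImpl(MethodImplOptions.AggressiveInlining)]
    public object GetValue(int index) => _values[index];
}
```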

Author

I can't manage to put the two changes for GdsStatement versions 10 and 13 into exactly one commit, since version 13 has to provide an override.

Member

Multiple commits to GdsStatement are fine. I'm talking about the changes in FbDataReader.

Author

I test with 1000 and more rows per result.
What's the harm if inlining is active? ;-)

OK, but how can I make the FbDataReader a separate pull request?
Should I fork the project a second time just for the reader?

Member

What's the harm if inlining is active? ;-)

https://en.wikipedia.org/wiki/Inline_expansion

OK, but how can I make the FbDataReader a separate pull request?

Use a different branch.

Author

I have tried a different branch, but the two commits from GdsStatement are then also included.
I didn't know how to do this; as you can see, this is a second try.

I'm just overwhelmed. So far I've only worked with a local git repository with a single branch, where multiple changes are packed into one commit.
I just don't know what to do.
Is it perhaps better to do a separate fork with exactly this one change?

Member

Is it perhaps better to do a separate fork with exactly this one change?

Yes. This is the best practice, @BFuerchau.

From the latest master, create one branch for each of your (independent) changes. Then push each of them and submit a separate PR.

Coming late to the party, but I presume this is what @cincuranet is asking you to do.

Member

So far I've only worked with a local git repository with a single branch, where multiple changes are packed into one commit.

Tip: you are losing a lot of what git can offer you.

Make smaller changes, commit often. Use branches like hell. They are extremely light in git (unlike other-revision-control-systems-of-the-past-who-must-not-be-named).

Also, a good git client may help you.

And Beyond Compare. I simply cannot work without it. Best 70 USD I ever spent in my life.

Author

Thank you for your tips. I'm a single developer with only a few small projects. We have a GitLab server, but I don't like the overhead; I'm using git only as an additional source backup ;-).

So for the pull requests I will try one last time. If that also doesn't work, I have no more ideas what to do.
Then I can only upload my changes for the 3 source files and maybe someone else will pick them up.
For myself, I can live with my own version of the .NET provider ;-) to load result sets quickly.

Author

I will try one last time with a new pull request.


Reviewers

@cincuranet left review comments

