
I am working on a project where I need to transfer thousands of files (each between 50 and 60 MB) every hour from an SFTP server to local storage or AWS S3. I am using Apache Spark 3.5 with Scala 2.12 for distributed processing.

I tried the spark-sftp library, but it appears to be discontinued and is incompatible with Spark 3.x. Currently I transfer the files sequentially over SSH from Scala, but this single-threaded approach is causing delays.
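For reference, this is roughly what my current sequential transfer looks like (simplified; I use JSch for the SFTP session, and the host, credentials, and paths here are placeholders):

```scala
import com.jcraft.jsch.{ChannelSftp, JSch}

// Simplified single-threaded loop: one SFTP session, files copied one by one.
// Host, credentials, and paths are placeholders.
val jsch = new JSch()
val session = jsch.getSession("user", "sftp.example.com", 22)
session.setPassword("password")
session.setConfig("StrictHostKeyChecking", "no")
session.connect()

val channel = session.openChannel("sftp").asInstanceOf[ChannelSftp]
channel.connect()

val remoteFiles: Seq[String] = Seq("/remote/dir/file1.bin", "/remote/dir/file2.bin")
remoteFiles.foreach { remotePath =>
  val localPath = "/local/dir/" + remotePath.split('/').last
  channel.get(remotePath, localPath) // blocks until the whole file is copied
}

channel.disconnect()
session.disconnect()
```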

I want to implement parallel processing for transferring files from SFTP to local storage or AWS S3. Are there any alternative approaches compatible with Spark 3.x that can help achieve this? How can I optimize file transfers for better performance?
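One idea I have is to skip a Spark SFTP connector entirely and just use Spark as a distributed task runner: parallelize the file list across executors and open one SFTP connection per partition. A sketch of what I mean (JSch again; host, credentials, paths, and the connection count are placeholder assumptions, and error handling is omitted):

```scala
import com.jcraft.jsch.{ChannelSftp, JSch}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("sftp-parallel").getOrCreate()

// Remote paths to fetch; in practice I would list the SFTP directory first.
val files: Seq[String] = (1 to 1000).map(i => s"/remote/dir/file_$i.bin")

// One partition per concurrent connection; the SFTP server's connection
// limit caps how far this can be pushed.
val numConnections = 16

spark.sparkContext
  .parallelize(files, numConnections)
  .foreachPartition { paths =>
    // One session per partition, reused for every file in that partition,
    // so the SSH handshake cost is paid once per connection, not per file.
    val jsch = new JSch()
    val session = jsch.getSession("user", "sftp.example.com", 22)
    session.setPassword("password")
    session.setConfig("StrictHostKeyChecking", "no")
    session.connect()
    val channel = session.openChannel("sftp").asInstanceOf[ChannelSftp]
    channel.connect()
    try {
      paths.foreach { remote =>
        val local = "/local/dir/" + remote.split('/').last
        channel.get(remote, local)
      }
    } finally {
      channel.disconnect()
      session.disconnect()
    }
  }
```

From local disk the files could then be pushed to S3 with the AWS SDK or `aws s3 sync`. I'm unsure whether this per-partition-connection pattern is the right approach, or whether streaming directly from SFTP to S3 without touching local disk would be worth the extra complexity.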

Doc Brown
asked Jul 8 at 10:44
  • Asking for 3rd-party tools like libraries is 100% off-topic for this site. I stripped that part out of the question; maybe it helps prevent the question from being closed. It still looks somewhat tool-specific to me, though. – Commented Jul 8 at 10:49

