I am working on a project where I need to transfer thousands of files (each 50–60 MB) every hour from an SFTP server to local storage or AWS S3. I am using Apache Spark 3.5 with Scala 2.12 for distributed processing.
I tried using the spark-sftp library, but it appears to be unmaintained and incompatible with Spark 3.x. Currently I am transferring files sequentially over SSH from Scala, and this single-threaded approach is causing delays.
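To illustrate what I mean by parallelizing the per-file transfers, here is a minimal sketch using a plain JVM thread pool. It copies local files as a stand-in for the actual SFTP download step (the `Files.copy` call is where a real per-file SFTP `get` would go); the object and parameter names are just placeholders for this question:

```scala
import java.nio.file.{Files, Path, StandardCopyOption}
import java.util.concurrent.{Executors, TimeUnit}

object ParallelTransfer {
  // Transfer a batch of files concurrently. `Files.copy` stands in for the
  // per-file SFTP download; in the real job each worker would call an SFTP
  // client's get() here instead.
  def transferAll(files: Seq[Path], destDir: Path, parallelism: Int = 8): Unit = {
    val pool = Executors.newFixedThreadPool(parallelism)
    try {
      val tasks = files.map { src =>
        pool.submit(new Runnable {
          def run(): Unit =
            Files.copy(src, destDir.resolve(src.getFileName),
              StandardCopyOption.REPLACE_EXISTING)
        })
      }
      tasks.foreach(_.get()) // block until done and propagate any failure
    } finally {
      pool.shutdown()
      pool.awaitTermination(1, TimeUnit.MINUTES)
    }
  }
}
```

Even this single-machine version would overlap network round-trips, which is what the sequential loop cannot do.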
I want to implement parallel processing for transferring files from SFTP to local storage or AWS S3. Are there any alternative approaches compatible with Spark 3.x that can help achieve this? How can I optimize file transfers for better performance?
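For context, this is roughly the shape I have in mind for a Spark-based version: distribute the remote file list across executors and open one SFTP connection per partition (here using JSch, since spark-sftp is out). The host, credentials, paths, and partition count below are placeholders, not working values, and the whole thing is only a sketch, not something I have running:

```scala
import com.jcraft.jsch.{ChannelSftp, JSch}
import org.apache.spark.sql.SparkSession

object SftpPull {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("sftp-pull").getOrCreate()
    // Would be populated by listing the remote directory first.
    val remoteFiles: Seq[String] = Seq()

    spark.sparkContext
      .parallelize(remoteFiles, numSlices = 32) // tune against SFTP connection limits
      .foreachPartition { paths =>
        // One SSH session and SFTP channel per partition, reused for all
        // files in that partition.
        val session = new JSch().getSession("user", "sftp.example.com", 22)
        session.setPassword("...") // or JSch.addIdentity(...) for key auth
        session.setConfig("StrictHostKeyChecking", "no")
        session.connect()
        val channel = session.openChannel("sftp").asInstanceOf[ChannelSftp]
        channel.connect()
        try paths.foreach(p => channel.get(p, s"/local/landing/${p.split('/').last}"))
        finally { channel.disconnect(); session.disconnect() }
      }
    spark.stop()
  }
}
```

Is something along these lines a reasonable pattern, or is there a better-supported approach for Spark 3.x?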
-
Asking for 3rd-party tools like libraries is 100% off-topic for this site. I stripped this part out of the question, maybe it helps to prevent the question getting closed. It still looks somewhat tool-specific to me, though. – Doc Brown, Jul 8, 2025 at 10:49