Stack Overflow
0 votes
0 answers
27 views

I have a Cloudera cluster behind a Knox gateway with basic authentication (username/password). I want to access HDFS (SWebHDFS) over SSL (https) using the Java Apache Hadoop client library (Apache Hadoop ...
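For reference, a minimal sketch of what such a call looks like at the REST level. Knox proxies WebHDFS under `/gateway/<topology>/webhdfs/v1` in its default topology; the host, topology name, and credentials below are hypothetical placeholders, not values from the question.

```python
def knox_webhdfs_url(gateway, topology, path, op):
    """Build a WebHDFS URL routed through a Knox gateway over HTTPS.
    Knox's default topology exposes WebHDFS under /gateway/<topology>/webhdfs/v1."""
    return f"https://{gateway}/gateway/{topology}/webhdfs/v1{path}?op={op}"

# Hypothetical gateway host, topology, and path -- replace with your own.
url = knox_webhdfs_url("knox.example.com:8443", "default", "/tmp/sample.txt", "OPEN")

# The actual call would add HTTP basic auth and the gateway's certificate, e.g.:
# resp = requests.get(url, auth=("user", "password"), verify="/path/to/gateway-cert.pem")
```

When using the Java Hadoop client instead of raw HTTP, the filesystem URI scheme is `swebhdfs://`, and the truststore containing the gateway certificate must be visible to the JVM.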
2 votes
0 answers
104 views

I have a WebHDFS API call that successfully CREATEs a file, but when I try to APPEND a byte array of data to the same file I run into issues. I have no issues creating a file, so I know it is not a ...
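For context, APPEND is a two-step operation just like CREATE: a POST to the namenode returns a 307 redirect, and the payload goes in a second POST to the redirect `Location`. A minimal sketch (hostname and path hypothetical; the live calls are commented out because they need a running cluster):

```python
def append_url(namenode, path):
    """Step 1 target: POST here with *no* body; the namenode answers with a
    307 redirect whose Location header names the datanode to POST the bytes to."""
    return f"http://{namenode}/webhdfs/v1{path}?op=APPEND"

# Sketch of the two-step flow:
# r1 = requests.post(append_url("nn.example:9870", "/tmp/f.txt"), allow_redirects=False)
# requests.post(r1.headers["Location"], data=payload)
#
# A common pitfall is sending the byte payload in step 1, or letting the HTTP
# client follow the redirect automatically (which typically drops the body).
```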
0 votes
1 answer
809 views

I want to upload a file from a local server to HDFS via the WebHDFS REST API. Based on the documentation, this operation takes two steps: submit an HTTP PUT request, which returns the location ...
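The two-step CREATE flow described above can be sketched as follows (hostname hypothetical; the network calls are commented out since they need a live cluster):

```python
def create_url(namenode, path, overwrite=False):
    """Step 1 target: PUT here with no data; the namenode replies with a 307
    redirect whose Location header is the datanode URL for step 2."""
    return (f"http://{namenode}/webhdfs/v1{path}"
            f"?op=CREATE&overwrite={'true' if overwrite else 'false'}")

# Step 2 (commented): PUT the file bytes to the Location from step 1.
# loc = requests.put(create_url("nn.example:9870", "/data/f.bin"),
#                    allow_redirects=False).headers["Location"]
# requests.put(loc, data=open("local.bin", "rb"))
```

Note that CREATE uses PUT for both steps, whereas APPEND uses POST.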
0 votes
1 answer
285 views

I have a Hadoop cluster and I want to manipulate data from a Spring Boot microservice: create folders / put data / read data / delete data... There is an API: https://hadoop.apache.org/docs/stable/...
1 vote
1 answer
731 views

I'm trying to install the apache-airflow-providers-apache-hdfs library in my Airflow-Docker 2.5.3. I've installed all the necessary Kerberos libraries, and I got the following error: #0 5.236 Requirement ...
0 votes
1 answer
865 views

I have a Docker stack with an Apache Hadoop (version 3.3.4) cluster, composed of one namenode and two datanodes, and a container with both the Kerberos admin server and the Kerberos KDC. I'm trying to configure ...
1 vote
1 answer
3k views

How to perform HDFS operations in Airflow? Make sure you install the following Python package: pip install apache-airflow-providers-apache-hdfs #Code Snippet #Import packages from airflow import ...
0 votes
1 answer
513 views

I am using single-node Hadoop, version release-3.3.1-RC3. In the Hadoop web UI, under Utilities -> Browse the file system, it is possible to view the contents of a file (beginning and end) directly in ...
0 votes
1 answer
127 views

I was trying to read a file present on the Hadoop cluster through the following code. The default port used is 9000 (since at 50700 it is not connecting). //webhdfs-read-tests.js // Include ...
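One likely source of confusion here is the port: 9000 is a common `fs.defaultFS` RPC port, which speaks Hadoop's binary RPC protocol, not HTTP, so WebHDFS requests to it will not work. A small sketch of the defaults (hedged: any of these can be overridden in the cluster's hdfs-site.xml / core-site.xml):

```python
# Typical NameNode port defaults:
#   9000  - fs.defaultFS RPC port (not HTTP; WebHDFS calls here fail)
#   9870  - dfs.namenode.http-address default in Hadoop 3.x
#   50070 - the old HTTP default in Hadoop 2.x
def webhdfs_base(host, hadoop3=True):
    """Base URL for WebHDFS REST calls against a default-configured NameNode."""
    return f"http://{host}:{9870 if hadoop3 else 50070}/webhdfs/v1"
```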
0 votes
0 answers
703 views

I have set up a WebHDFS server with a self-signed SSL certificate for testing. Now I need some kind of authentication on it, where the user has to pass credentials in the WebHDFS REST call. I am ...
0 votes
1 answer
611 views

vijay@ubuntu:~$ start-all.sh WARNING: Attempting to start all Apache Hadoop daemons as vijay in 10 seconds. WARNING: This is not a recommended production deployment configuration. WARNING: Use CTRL-C ...
0 votes
1 answer
837 views

I am trying to configure Hadoop with WebHDFS enabled, and then I also want to enable SSL on it. My hdfs-site.xml looks like this: <configuration> <property> <name>dfs....
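The excerpt's hdfs-site.xml is cut off, but the properties typically involved in HTTPS-only WebHDFS look roughly like this. This is a hedged sketch, not the asker's actual config: the property names are standard Hadoop ones, while the address value is a default that clusters often override, and the keystore/truststore details live separately in ssl-server.xml.

```xml
<!-- hdfs-site.xml: minimal sketch for WebHDFS over HTTPS only.
     Keystore paths and passwords go in ssl-server.xml (not shown). -->
<property>
  <name>dfs.webhdfs.enabled</name>
  <value>true</value>
</property>
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>
<property>
  <name>dfs.namenode.https-address</name>
  <value>0.0.0.0:9871</value>
</property>
```

With `HTTPS_ONLY`, clients must use the `swebhdfs://` scheme (or plain `https://.../webhdfs/v1` REST calls) and trust the server certificate.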
-1 votes
1 answer
182 views

How can I get the value of one or more keys in HDFS via HTTP or the Java API from a remote client? For example, the file below has a million keys and values. I just want to get the values of the 'phone' and ...
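Worth noting: WebHDFS has no server-side filtering, so the usual pattern is to read the file (op=OPEN, optionally with `offset`/`length` parameters) and filter client-side. A minimal sketch assuming the file holds one JSON record per line (the sample data below is invented for illustration):

```python
import json

def extract_fields(text, wanted):
    """Pull selected keys out of a file of JSON records, one record per line."""
    out = []
    for line in text.splitlines():
        rec = json.loads(line)
        out.append({k: rec[k] for k in wanted if k in rec})
    return out

# Hypothetical records; in practice `text` would come from an op=OPEN response.
sample = '{"name": "a", "phone": "123", "age": 1}\n{"name": "b", "phone": "456"}'
extract_fields(sample, ["phone"])  # -> [{"phone": "123"}, {"phone": "456"}]
```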
0 votes
1 answer
446 views

I am trying to use Python to write to secure HDFS using the following lib (link). Authentication part: def init_kinit(): kinit_args = ['/usr/bin/kinit', '-kt', '/tmp/xx.keytab', '...
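The authentication step in the excerpt is obtaining a Kerberos ticket from a keytab before making the HDFS calls. A sketch of that piece; the principal is a placeholder since the excerpt truncates it, and the actual `kinit` call is commented out because it needs a configured KDC:

```python
import subprocess

def kinit_cmd(keytab, principal):
    """Assemble a kinit invocation that fetches a ticket from a keytab.
    The principal here is a hypothetical placeholder."""
    return ["/usr/bin/kinit", "-kt", keytab, principal]

# subprocess.check_call(kinit_cmd("/tmp/xx.keytab", "user@EXAMPLE.COM"))
# Once the ticket cache is populated, SPNEGO-aware HTTP clients can
# authenticate WebHDFS requests against the secured cluster.
```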
0 votes
0 answers
310 views

Does WebHDFS carry out checksum verification? When I upload a file to my remote Hadoop cluster using WebHDFS, does it carry out checksum verification of the file before upload and after upload to ...
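Two points of context: the datanodes verify block-level CRCs as data is written, but the HTTP upload itself carries no end-to-end file checksum, so end-to-end verification has to be done by the client after the fact. WebHDFS does expose a checksum operation for that, sketched below; note the algorithm it reports is HDFS's composite MD5-of-MD5-of-CRC32C, not a plain MD5 of the bytes, so it cannot be compared directly against a local `md5sum`:

```python
def checksum_url(namenode, path):
    """Target for WebHDFS's op=GETFILECHECKSUM; the response JSON includes an
    'algorithm' field such as MD5-of-...MD5-of-...CRC32C and the hex digest."""
    return f"http://{namenode}/webhdfs/v1{path}?op=GETFILECHECKSUM"

# A practical end-to-end check is to re-download the file and compare hashes
# locally, or to compute the same composite checksum on the client side.
```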

