Security problem extra

Security problem extra

ZongtianHou
This is the log info: org.apache.hadoop.hdfs.server.datanode.DataNode: Failed to read expected encryption handshake from client at /127.0.0.1:53611. Perhaps the client is running an older version of Hadoop which does not support encryption

I have two more questions here.
1. What does the client mean here? Does it mean the application running on HDFS, and how does it have encryption?
2. I have turned off encryption for data transfer, RPC protection and HTTP protection by setting the properties hadoop.rpc.protection, dfs.encrypt.data.transfer and dfs.http.policy to false, so why is there still encryption? (See the sketch below for how I check these settings.)
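
For reference, this is a minimal sketch of how I check what my client-side Configuration actually resolves for these keys (it assumes the Hadoop client jars are on the classpath; as far as I understand, hadoop.rpc.protection and dfs.http.policy are not boolean properties, so I may be setting them incorrectly):

import org.apache.hadoop.conf.Configuration;

public class CheckEncryptionConf {
    public static void main(String[] args) {
        // Picks up core-site.xml / hdfs-site.xml from the classpath
        Configuration conf = new Configuration();

        // Boolean property; the default is false
        System.out.println("dfs.encrypt.data.transfer = "
                + conf.getBoolean("dfs.encrypt.data.transfer", false));

        // Not a boolean: valid values are authentication, integrity, privacy
        System.out.println("hadoop.rpc.protection = "
                + conf.get("hadoop.rpc.protection", "authentication"));

        // Not a boolean: valid values are HTTP_ONLY, HTTPS_ONLY, HTTP_AND_HTTPS
        System.out.println("dfs.http.policy = "
                + conf.get("dfs.http.policy", "HTTP_ONLY"));
    }
}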

Any clue will be appreciated.

Re: Security problem extra

ZongtianHou
Does anyone have a clue about this? I have updated the JDK and still cannot solve the problem. Thanks in advance for any info!

Re: Security problem extra

Wei-Chiu Chuang-3
Hi Zongtian,
This is definitely not a JDK issue. It is a wire-protocol compatibility problem between the client and the server (DataNode).

bq. What does the client mean here? Does it mean the application running on HDFS, and how does it have encryption?
I'm not quite sure what you're asking. HDFS supports at-rest encryption, data transfer encryption, RPC encryption and SSL encryption.

I'd recommend making sure your Hadoop client version is the same as the server version. The log message suggests the DataNode is running Hadoop 2.7.0 or later.
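
As a quick check (a minimal sketch; it only reports the version of whatever Hadoop jars happen to be on the client's classpath), you can print the client-side version and compare it with the version shown in the NameNode web UI:

import org.apache.hadoop.util.VersionInfo;

public class PrintClientVersion {
    public static void main(String[] args) {
        // Version of the Hadoop jars this client is actually running against
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
        System.out.println("Built from revision: " + VersionInfo.getRevision());
    }
}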

--
A very happy Clouderan

RE: Security problem extra

Brahma Reddy Battula
In reply to this post by ZongtianHou

Is “dfs.encrypt.data.transfer” unknowingly configured as true on the DataNode?

Please cross-check the DataNode configuration in the following way:

http://<DN_IP>:<httpport>/conf
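
For example, something along these lines (a minimal sketch; 127.0.0.1:50075 is just the default DataNode HTTP address on Hadoop 2.x, so substitute your own host and port) dumps the configuration the DataNode actually loaded and shows whether the encryption-related keys are set:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class DumpDataNodeConf {
    public static void main(String[] args) throws Exception {
        // The /conf servlet returns the configuration the daemon actually loaded
        URL url = new URL("http://127.0.0.1:50075/conf");
        StringBuilder body = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(url.openStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line).append('\n');
            }
        }
        String conf = body.toString();
        // Look for the keys that control encryption on the data transfer path
        String[] keys = {
                "dfs.encrypt.data.transfer",
                "dfs.data.transfer.protection",
                "hadoop.rpc.protection"};
        for (String key : keys) {
            int i = conf.indexOf(key);
            if (i >= 0) {
                // Print the key plus some surrounding text so the value is visible
                System.out.println(conf.substring(i, Math.min(conf.length(), i + 120)));
                System.out.println("----");
            } else {
                System.out.println(key + " is not present in the DataNode's /conf output");
            }
        }
    }
}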

 

bq. 1 What does the client mean here? Does it mean the application running on HDFS, and how does it have encryption?

Yes, it is the application (client) that connects to the DataNode during the handshake.

 


Re: Security problem extra

ZongtianHou
dfs.encrypt.data.transfer is false, and I have updated the DataNode to 2.7.6. The error info has changed a little; it no longer mentions encryption:
2018-06-28 00:44:47,557 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Failed to read expected SASL data transfer protection handshake from client at /127.0.0.1:51513. Perhaps the client is running an older version of Hadoop which does not support SASL data transfer protection
I use a libhdfs3 client to connect to Hadoop. The function Hdfs::OutputStream::sync() fails, and I get this error log:
TcpSocket.cpp: 79: HdfsEndOfStream: Read 8 bytes failed from "127.0.0.1:61004": End of the stream.
It seems the client cannot receive the handshake info from the DataNode; I am still confused by it.
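
In case it helps narrow this down, I am also going to try the equivalent write from a plain Java client (a minimal sketch; hdfs://localhost:9000 is just my local fs.defaultFS, and hsync() is what I understand to be the Java counterpart of OutputStream::sync() in libhdfs3), to see whether the handshake problem is on the cluster side or specific to libhdfs3:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SyncProbe {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Same cluster the libhdfs3 client talks to
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);
        try (FSDataOutputStream out = fs.create(new Path("/tmp/sync-probe"))) {
            out.writeBytes("hello");
            // Flushing to the DataNode goes through the same data transfer
            // handshake that is failing for the libhdfs3 client
            out.hsync();
        } finally {
            fs.close();
        }
    }
}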
