
Failed to set permissions of path

shlomi java
hi Hadoop and Nutch folks,

I'm trying to run Nutch 1.4 *locally*, on Windows 7, using Hadoop
0.20.203.0.
I run with:
fs.default.name = D:\fs
hadoop.tmp.dir = D:\tmp
dfs.permissions = false
PATH environment variable contains C:\cygwin\bin.

I get the following exception:
Exception in thread "main" java.io.IOException: Failed to set permissions of path: file:/D:/tmp/mapred/staging/username-835169260/.staging to 0700
    at org.apache.hadoop.fs.RawLocalFileSystem.checkReturnValue(RawLocalFileSystem.java:525)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:499)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:318)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:183)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:797)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:791)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Unknown Source)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:791)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:765)
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1200)
    at org.apache.nutch.crawl.Injector.inject(Injector.java:217)
    at org.apache.nutch.crawl.Crawl.run(Crawl.java:127)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.nutch.crawl.Crawl.main(Crawl.java:55)

The call to rv = f.setReadable(group.implies(FsAction.READ), false); in RawLocalFileSystem.setPermission (f is a java.io.File) returns false, and that is what causes checkReturnValue to throw the exception. The .staging folder above DOES get created; only setting the permissions fails.
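For anyone curious why a false return value blows up the whole job: the failing check can be reduced to a few lines of plain JDK code. The class and method below are a simplified, hypothetical stand-in for Hadoop's RawLocalFileSystem.checkReturnValue, not the real source:

```java
import java.io.IOException;

public class CheckReturn {

    // Simplified stand-in for RawLocalFileSystem.checkReturnValue:
    // the java.io.File permission setters report failure via a boolean
    // return value, and Hadoop turns any false into an IOException.
    static void checkReturnValue(boolean rv, String path, String perm)
            throws IOException {
        if (!rv) {
            throw new IOException(
                "Failed to set permissions of path: " + path + " to " + perm);
        }
    }

    public static void main(String[] args) {
        try {
            // On Windows, File.setReadable(..., false) commonly returns
            // false, so the check throws exactly this kind of exception:
            checkReturnValue(false, "file:/D:/tmp/mapred/staging/.staging", "0700");
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

On a POSIX system the setters succeed and the check is a no-op; on Windows the boolean comes back false and job submission dies before any work starts.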

I also tried Hadoop's hadoop.job.ugi property, giving it different values,
with no success.

I'm posting to both lists because I don't know where the problem is.

Do you? :-)

10X
ShlomiJ

Re: Failed to set permissions of path

shlomi java
(sending this email again, because it seems it did not reach the forum)


Re: Failed to set permissions of path

Vladimir Rozov
This error (specific to Windows) is caused by an optimization introduced around 0.20.203, and it is still there in 1.0.0 :(. I don't know how to fix it other than to recompile Hadoop common with the optimization removed from RawLocalFileSystem.java:

/**
 * Use the command chmod to set permission.
 */
@Override
public void setPermission(Path p, FsPermission permission) throws IOException {
  execSetPermission(pathToFile(p), permission);
}

Vlad
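For reference, shelling out to chmod (what the snippet above delegates to via execSetPermission) can be sketched in plain JDK code like this. The class name and error handling are illustrative, not Hadoop's actual implementation, and it assumes a POSIX chmod is on the PATH (e.g. cygwin's on Windows):

```java
import java.io.File;
import java.io.IOException;

public class ChmodExec {

    // Hypothetical stand-in for RawLocalFileSystem.execSetPermission:
    // run the external chmod command instead of the java.io.File
    // permission setters that fail on Windows.
    static void execSetPermission(File f, String octal)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder("chmod", octal, f.getAbsolutePath()).start();
        if (p.waitFor() != 0) {
            throw new IOException("chmod " + octal + " failed for " + f);
        }
    }

    public static void main(String[] args) throws Exception {
        File tmp = File.createTempFile("staging", null);
        tmp.deleteOnExit();
        // 0700: owner-only access, the mode the MapReduce staging dir wants.
        execSetPermission(tmp, "700");
        System.out.println(tmp.canRead() && tmp.canWrite());
    }
}
```

This trades a process fork per setPermission call for correctness, which is presumably the cost the "optimization" was trying to avoid.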



Re: Failed to set permissions of path

Radu
If you just want to test Hadoop on Windows, the actual permissions are not that important. I updated RawLocalFileSystem.java so it just assigns a generous value to all files every time, ignoring the actual value of the 'permission' argument.

  /**
   * Use the command chmod to set permission.
   */
  @Override
  public void setPermission(Path p, FsPermission permission) throws IOException {
    //FileUtil.setPermission(pathToFile(p), permission);
    FileUtil.setPermission(pathToFile(p), permission777);
  }

  private FsPermission permission777 = FsPermission.valueOf("-rwxrwxrwx");
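As a side note, the "-rwxrwxrwx" string passed to FsPermission.valueOf above is just symbolic notation for octal 0777 (the leading '-' is the file-type position). A tiny, hypothetical parser with no Hadoop dependency shows the correspondence between the two notations:

```java
public class SymbolicPerm {

    // Convert a 9-character symbolic permission string like "rwxrwxrwx"
    // (owner, group, other triples) into its octal mode, e.g. 0777.
    // Simplification for illustration: any character other than '-'
    // counts as a set bit; real parsers validate r/w/x per position.
    static int toOctal(String symbolic) {
        if (symbolic.length() != 9) {
            throw new IllegalArgumentException("expected 9 chars, got: " + symbolic);
        }
        int mode = 0;
        for (int i = 0; i < 9; i++) {
            mode = (mode << 1) | (symbolic.charAt(i) == '-' ? 0 : 1);
        }
        return mode;
    }

    public static void main(String[] args) {
        // "rwxrwxrwx" is the world-writable 0777 Radu assigns everywhere.
        System.out.println(Integer.toOctalString(toOctal("rwxrwxrwx")));
        // "rwx------" is the 0700 the staging directory actually asks for.
        System.out.println(Integer.toOctalString(toOctal("rwx------")));
    }
}
```

So the patch effectively answers every 0700 request with 0777, which is harmless for local testing but obviously not something to run in production.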

Then rebuild Hadoop using a command like:

ant -l build.log -Dtest.output=yes test-contrib

See: http://wiki.apache.org/hadoop/FAQ#Building_.2BAC8_Testing_Hadoop_on_Windows

Replace the hadoop-core*.jar under 'share/hadoop' with the hadoop-core*-SNAPSHOT.jar created under 'build'.
