[rudder-users] cf-agent aborted on defined class "could_not_download_uuid"

Olivier Desport olivier.desport at ac-versailles.fr
Tue Dec 3 12:34:17 CET 2013





On 03/12/2013 11:32, Jonathan Clarke wrote:
> On 03/12/13 11:27, Olivier Desport wrote:
>>
>>
>>
>>
>> On 03/12/2013 11:17, Jonathan Clarke wrote:
>>> Hi Olivier,
>>>
>>> On 03/12/13 11:13, Olivier Desport wrote:
>>>>
>>>>
>>>>
>>>>
>>>> On 03/12/2013 10:43, Matthieu CERDA wrote:
>>>>> On 02/12/2013 14:36, Olivier Desport wrote:
>>>>>>
>>>>>>> On 02/12/2013 14:27, Matthieu CERDA wrote:
>>>>>>>> On 02/12/2013 13:28, Olivier Desport wrote:
>>>>>>>> OK, so the invocation looks good manually too.
>>>>>>>>>
>>>>>>>>> Can you please send me the output of "grep <NODE IP ADDRESS> 
>>>>>>>>> /var/log/rudder/apache2/access.log", to be executed on the 
>>>>>>>>> Rudder server? (replacing the node IP address with the IP of 
>>>>>>>>> the node unable to be registered)
>>>>>>>> The output of the grep is empty.
>>>>>>> OK, and if you try with the proxy IP?
>>>>>> /usr/bin/curl  --proxy 'http://proxy-ip:port' 
>>>>>> http://172.31.136.121/uuid :
>>>>>>
>>>>>> it works (output : root)
>>>>> Hi Olivier,
>>>>>
>>>>> OK, I do think it might be a proxy issue. Can you please run these 
>>>>> commands to test if curl obeys the "no proxy" parameter?
>>>>>
>>>>> First, on the client:
>>>>>
>>>>> - /usr/bin/curl --proxy '' -o "/var/rudder/tmp/uuid.txt" 
>>>>> http://172.31.0.61/uuid
>>>>> - unset http_proxy
>>>>> - unset ftp_proxy
>>>>> - unset https_proxy
>>>>> - unset no_proxy
>>>>> - /usr/bin/curl --proxy '' -o "/var/rudder/tmp/uuid.txt" 
>>>>> http://172.31.0.61/uuid
>>>>>
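A quick way to double-check whether the environment is what sends curl through the proxy, and to force a direct connection regardless of the environment, is something along these lines (the env listing and the --noproxy option are not part of the procedure above, just an extra check on curl versions recent enough to support --noproxy):

  # list any proxy-related variables still set in the current shell
  env | grep -i proxy
  # ask curl to bypass the proxy for every host, whatever the environment says
  /usr/bin/curl --noproxy '*' -o "/var/rudder/tmp/uuid.txt" http://172.31.0.61/uuid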
>>>>
>>>> It still doesn't work:
>>>>
>>>>   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
>>>>                                  Dload  Upload   Total   Spent    Left  Speed
>>>> 100  3159  100  3159    0     0   175k      0 --:--:-- --:--:-- --:--:--  192k
>>>
>>> What indicates to you that this doesn't work? This is the expected 
>>> output for the curl command. Can you look at the contents of the 
>>> file /var/rudder/tmp/uuid.txt? It should contain the string "root". 
>>> If it does, then this command is working.
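For example, on the node (paths and URL taken from the thread; the header dump is an extra check not mentioned in the thread, and squid normally adds Via/X-Cache headers, so it hints at whether a proxy answered):

  cat /var/rudder/tmp/uuid.txt
  # fetch again, discarding the body but printing the response headers
  /usr/bin/curl -s -D - -o /dev/null http://172.31.0.61/uuid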
>>
>> This file contains an HTML error page:
>>
>> <!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN" 
>> "http://www.w3.org/TR/html4/strict.dtd">
>> <html><head>
>> <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
>> <title>ERROR: The requested URL could not be retrieved</title>
>> [...]
>> </head><body id=ERR_INVALID_URL>
>> <div id="titles">
>> <h1>ERROR</h1>
>> <h2>The requested URL could not be retrieved</h2>
>> </div>
>> <hr>
>>
>> <div id="content">
>> <p>The following error was encountered while trying to retrieve the 
>> URL: <a href="/uuid">/uuid</a></p>
>>
>> <blockquote id="error">
>> <p><b>Invalid URL</b></p>
>> </blockquote>
>>
>> <p>Some aspect of the requested URL is incorrect.</p>
>>
>> <p>Some possible problems are:</p>
>> <ul>
>> <li><p>Missing or incorrect access protocol (should be <q>http://</q> 
>> or similar)</p></li>
>> <li><p>Missing hostname</p></li>
>> <li><p>Illegal double-escape in the URL-Path</p></li>
>> <li><p>Illegal character in hostname; underscores are not 
>> allowed.</p></li>
>> </ul>
>>
>> <p>Your cache administrator is <a 
>> href="mailto:webmaster?subject=CacheErrorInfo%20-%20ERR_INVALID_URL&body=CacheHost%3A%20localhost%0D%0AErrPage%3A%20ERR_INVALID_URL%0D%0AErr%3A%20%5Bnone%5D%0D%0ATimeStamp%3A%20Tue,%2003%20Dec%202013%2010%3A10%3A36%20GMT%0D%0A%0D%0AClientIP%3A%20172.31.0.107%0D%0A%0D%0AHTTP%20Request%3A%0D%0A%0D%0A%0D%0A">webmaster</a>.</p>
>> <br>
>> </div>
>>
>> <hr>
>> <div id="footer">
>> <p>Generated Tue, 03 Dec 2013 10:10:36 GMT by localhost 
>> (squid/3.1.19)</p>
>> <!-- ERR_INVALID_URL -->
>> </div>
>> </body></html>
>
> Right. This makes it very clear: this is definitely an error message 
> coming from your proxy server (see "squid/3.1.19" at the end).
>
> This means that for some reason curl is using the proxy, despite the 
> "--proxy ''" option, and the $no_proxy environment variable. I suggest 
> you simply "unset http_proxy" before running the agent, and this 
> should then work. Can you confirm that is indeed the case?
>
> We can look into why this is happening in curl as well.
>
> Jonathan
I've unset http_proxy and no_proxy, stopped and started the agent, and 
launched:

  /var/rudder/cfengine-community/bin/cf-agent -KI

The command executed by the agent is still curl -s -f --proxy '' ...
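To reproduce what the agent does outside of CFEngine, the same invocation (taken from the log below; only the echo of the exit code is added) can be run by hand:

  /usr/bin/curl -s -f --proxy '' -o "/var/rudder/tmp/uuid.txt" http://172.31.0.61/uuid; echo "curl exit code: $?"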

Output:

R: @@Common@@log_info@@hasPolicyServer-root@@common-root@@00@@common@@StartRun@@2013-12-03 12:32:13+01:00##7e00f086-0ead-4526-aed5-34490f24808f@#Start execution
Can't stat /var/rudder/share/7e00f086-0ead-4526-aed5-34490f24808f/rules/cfengine-community/rudder_promises_generated in files.copyfrom promise
R: @@Common@@result_error@@hasPolicyServer-root@@common-root@@00@@Update@@None@@2013-12-03 12:32:13+01:00##7e00f086-0ead-4526-aed5-34490f24808f@#Cannot update node's policy or dependencies
R: @@Common@@result_success@@hasPolicyServer-root@@common-root@@00@@Security parameters@@None@@2013-12-03 12:32:13+01:00##7e00f086-0ead-4526-aed5-34490f24808f@#The internal environment security is acceptable
R: @@Common@@result_success@@&TRACKINGKEY&@@Process checking@@None@@2013-12-03 12:32:13+01:00##7e00f086-0ead-4526-aed5-34490f24808f@#There is an acceptable number of cf-execd processes (between 0 and 2) and cf-agent processes (between 0 and 5)
R: @@Common@@result_success@@hasPolicyServer-root@@common-root@@00@@CRON Daemon@@None@@2013-12-03 12:32:13+01:00##7e00f086-0ead-4526-aed5-34490f24808f@#The CRON daemon is running
R: @@Common@@result_success@@hasPolicyServer-root@@common-root@@00@@Binaries update@@None@@2013-12-03 12:32:13+01:00##7e00f086-0ead-4526-aed5-34490f24808f@#The CFengine binaries in /var/rudder/cfengine-community/bin are up to date
  -> Executing '/usr/bin/curl -s -f --proxy '' -o "/var/rudder/tmp/uuid.txt" http://172.31.0.61/uuid' ... (no timeout)
  !! Finished command related to promiser "/usr/bin/curl" -- an error occurred (returned 22)
cf-agent aborted on defined class "could_not_download_uuid"
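For reference, exit code 22 from curl means that -f was given and the server replied with an HTTP error (4xx/5xx); with -f the body is not saved, so by itself it does not say whether the answer came from the proxy or from the policy server. Since this curl is launched by the agent rather than by the interactive shell, a rough way to check whether the running CFEngine processes still carry proxy variables inherited from their parent is a sketch like the following (Linux-specific, not from the thread):

  # dump any proxy-related variables from the environment of each running cf-execd
  for pid in $(pgrep -x cf-execd); do
      echo "=== cf-execd pid $pid ==="
      tr '\0' '\n' < /proc/$pid/environ | grep -i proxy
  done
  # other common places a proxy can be configured outside the shell environment
  grep -i proxy /root/.curlrc /etc/environment 2>/dev/null

If these still show a proxy setting, restarting the service from a shell where the variables are unset (or clearing them in the init script) would be the next thing to try.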