
php process stays running

  • 31-01-2009 10:12am
    #1
    Registered Users, Registered Users 2 Posts: 21,611 ✭✭✭✭


    I'm having a problem with a PHP script that I use for sending webtexts on the O2 site. The script is attached for reference.

    The problem is that after a while the server it's running on gets overloaded because there are dozens of these processes running. What I think is happening is that the O2 website is so slow that the cURL execution gets "stuck", so the script never returns and sits there forever. That, or people get sick of waiting and cut the connection before it's finished, again meaning it never returns.

    Does anyone know how to force the script to exit after a certain amount of time? I'm thinking something involving threads, but I'm not sure what to do.


Comments

  • Registered Users, Registered Users 2 Posts: 68,317 ✭✭✭✭seamus


    You can set parameters for the cURL connection using curl_setopt()

    One of these allows you to set a timeout for the connection, after which curl will return with an error, and you can gracefully kill your script.

    You could even set it to say 15 seconds and then return a message to the user saying that the O2 website isn't responding.

    If your webserver is also running Apache, I would recommend setting a max execution time for your PHP scripts as well (maybe five minutes), so that any PHP script which executes for longer than this will be automatically killed.
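
    For reference, a minimal sketch of the timeout options described above, using curl_setopt(). The URL here is just a placeholder, not the actual O2 webtext endpoint from the attached script:

    ```php
    <?php
    // Placeholder URL -- substitute the actual request from the script.
    $ch = curl_init('https://www.o2online.ie/');

    // Give up if a connection can't even be established within 5 seconds.
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);

    // Abort the whole transfer if it takes longer than 15 seconds in total.
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);

    // Return the response as a string instead of printing it directly.
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

    $result = curl_exec($ch);

    if ($result === false) {
        // On timeout, curl_exec() returns false (CURLE_OPERATION_TIMEDOUT
        // is error 28), so the script can exit gracefully with a message.
        echo "The O2 website isn't responding: " . curl_error($ch);
    }
    curl_close($ch);
    ?>
    ```

    With that in place the script can never sit on a slow request forever; the worst case is roughly 15 seconds before it returns an error to the user.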


  • Registered Users, Registered Users 2 Posts: 21,611 ✭✭✭✭Sam Vimes


    seamus wrote: »
    You can set parameters for the cURL connection using curl_setopt()

    One of these allows you to set a timeout for the connection, after which curl will return with an error, and you can gracefully kill your script.

    You could even set it to say 15 seconds and then return a message to the user saying that the O2 website isn't responding.

    If your webserver is also running Apache, I would recommend setting a max execution time for your PHP scripts as well (maybe five minutes), so that any PHP script which executes for longer than this will be automatically killed.

    Thanks for that, I never knew that was an option. I'll give it a try and see how it works.


  • Closed Accounts Posts: 12,382 ✭✭✭✭AARRRGH


    You could stick something like this at the top of your script:
    set_time_limit(1500);
    


  • Registered Users, Registered Users 2 Posts: 21,611 ✭✭✭✭Sam Vimes


    AARRRGH wrote: »
    You could stick something like this at the top of your script:
    set_time_limit(1500);
    

    I just had a look at the setting for max execution time on the server and it's 300 seconds. That might cause the problem: if someone cuts off before the script has returned, then when it tries to return after 300 seconds it can't, and so it stays running. I'll give that a shot too. Thanks.
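
    If the stuck processes really are caused by users disconnecting mid-request, a related setting worth knowing about is ignore_user_abort(). This is just a sketch of the idea, not taken from the attached script:

    ```php
    <?php
    // Cap how long this request may run, well below the server's
    // 300-second max_execution_time.
    set_time_limit(60);

    // false (the default) means PHP stops the script the next time it
    // tries to send output after the client has disconnected, rather
    // than letting the process run on in the background.
    ignore_user_abort(false);

    // ... do the slow work here, e.g. the cURL call to the O2 site ...

    // PHP only notices a disconnect when it sends output, so emit
    // something and flush it, then check the connection status.
    echo "still working...\n";
    flush();

    if (connection_aborted()) {
        exit; // the user gave up; don't linger on the server
    }
    ?>
    ```

    Between a cURL timeout, set_time_limit(), and a connection_aborted() check, there shouldn't be any path left for a process to hang around indefinitely.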

