
LoadRunner & LightStreamer

  • 20-03-2013 2:52pm
    #1
    Closed Accounts Posts: 3,981 ✭✭✭


    Hi there,

    We have an application which uses LightStreamer to stream data over HTTP.

    I'm trying to performance test this application with LoadRunner.

    I have two URLs:
    URL 1 creates a session id and stays open; this URL will continue to load.
    URL 2 uses this session id to make requests. The response of each request appears on URL 1's page.

    The issue I am having is that when I make the request for URL 1, I am unable to make subsequent requests while keeping this request open. I tried using the web_concurrent wrapper, but that creates issues because, before I can make the request to URL 2, I need the session id which is returned from URL 1.

    Getting the ID isn't the issue; executing a request while the first request is still running is the issue.
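
    For reference, a rough sketch of the web_concurrent attempt is below (the host and paths are placeholders, and the SessionId boundaries are borrowed from the snippet later in this thread). The catch it illustrates is that {p_sessionid} is only populated once the first response completes, so it cannot be fed into the second request inside the same concurrent group.

    web_reg_save_param("p_sessionid",
                       "LB=SessionId:",
                       "RB=\r",
                       LAST);

    web_concurrent_start(NULL);

    /* URL 1: creates the session and then keeps streaming */
    web_url("create_session",
            "URL=http://host/create_session.txt",
            "Resource=0",
            "Mode=HTML",
            LAST);

    /* URL 2: needs {p_sessionid}, but inside the concurrent group the
       parameter has not been populated yet because URL 1 has not finished */
    web_url("request_data",
            "URL=http://host/request_data.txt?session={p_sessionid}",
            "Resource=0",
            "Mode=HTML",
            LAST);

    web_concurrent_end(NULL);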

    Has anyone successfully tested LightStreamer with LoadRunner before?

    This was the only thing I could find online where someone else was doing the same thing: http://www.sqaforums.com/showflat.php?Number=685960

    Unfortunately the poster no longer works with our company. I tried reaching out to him on LinkedIn but have not had much luck.

    Any help would be greatly appreciated!


Comments

  • Closed Accounts Posts: 3,981 ✭✭✭[-0-]


    Progress!
    web_reg_save_param("p_sessionid",
                      "LB=SessionId:",
                      "RB=\r",
                       LAST);
    web_reg_save_param("p_sessionid",
                      "LB=SessionId:",
                      "RB=\r",
                       LAST);
    	web_url("create_session.txt", 
    		"URL=<URL TO CREATE SESSION>", 
    		"TargetFrame=", 
    		"Resource=0",
    		"RecContentType=text/html", 
    		"Referer=", 
    		"Snapshot=t1.inf", 
    		"Mode=HTML",
    		EXTRARES,
    		"URL=<URL TO REQUEST DATA WITH SESSION ID>", 
    		ENDITEM,
    		LAST);
    
    

    My next step is to figure out how to parse the responses while this function continues to run ad infinitum.

    The responses look like this:
    Action.c(7): t=5506ms: 7-byte response body for "URL" (RelFrameId=1, Internal ID=1)
    Action.c(7):     PROBE\r\n
    Action.c(7): t=6607ms: 59-byte response body for "URL" (RelFrameId=1, Internal ID=1)
    Action.c(7):     1,1||03:46:29 PM|||22:$215.4200|$$1.9800||||||2,357,903||\r\n
    Action.c(7): t=7609ms: 49-byte response body for "URL" (RelFrameId=1, Internal ID=1)
    Action.c(7):     1,1||||||||$$215.4100|5|$$215.4500||2,358,003||\r\n
    

    The first response will always be 1,1; if I make subsequent requests in the EXTRARES piece, they will be 2,1 and 3,1, all the way to n,1.
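
    If it helps later, a rough plain-C parsing sketch for a single update line is below. The field meanings are assumptions based on the sample output above, and inside a LoadRunner script the printf calls would typically be lr_output_message instead.

    #include <stdio.h>
    #include <string.h>

    /* Split one streamed update line on '|' while preserving empty fields,
       and read the leading index pair (e.g. "1,1"), which numbers the
       requests as described above. */
    int main(void)
    {
        char line[] = "1,1||03:46:29 PM|||22:$215.4200|$1.9800||||||2,357,903||";
        char *fields[32];
        int nfields = 0, first = 0, second = 0, i;
        char *p = line;

        sscanf(line, "%d,%d", &first, &second);   /* leading "n,1" pair */

        /* split by hand; strtok would collapse the empty "||" fields */
        fields[nfields++] = p;
        while (*p && nfields < 32) {
            if (*p == '|') {
                *p = '\0';
                fields[nfields++] = p + 1;
            }
            p++;
        }

        printf("indices=%d,%d with %d fields\n", first, second, nfields);
        for (i = 0; i < nfields; i++)
            printf("  field[%d] = \"%s\"\n", i, fields[i]);
        return 0;
    }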

    I need some way of measuring the latency between making the initial request and getting the response. Tricky - does anyone have any ideas?
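
    One simple idea, if it helps: write a timestamp marker into the log immediately before opening the stream, so the verbose-log entries for the updates can be lined up against it offline. A minimal sketch (the parameter name is made up):

    /* record wall-clock time just before opening the stream */
    lr_save_datetime("%H:%M:%S", DATE_NOW + TIME_NOW, "p_stream_start");
    lr_output_message("stream opened at %s", lr_eval_string("{p_stream_start}"));

    /* ... streaming web_url(...) follows here ... */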


  • Closed Accounts Posts: 3,981 ✭✭✭[-0-]


    I also need a way to end the web_url call after 6 minutes. Using web_set_timeout("STEP",600) ends it with an error, which is not what I am after.
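
    One partial workaround, assuming it is acceptable for the step to be cut off by the timeout as long as the Vuser itself does not fail, might be to suppress the error around that one step. This does not remove the error; it only stops it from failing the iteration. A rough sketch:

    web_set_timeout("STEP", "360");   /* 6 minutes */
    lr_continue_on_error(1);          /* tolerate the timeout error on the streaming step */

    web_url("create_session.txt",
            "URL=<URL TO CREATE SESSION>",
            "Resource=0",
            "Mode=HTML",
            LAST);

    lr_continue_on_error(0);          /* restore default error handling */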


  • Closed Accounts Posts: 3,981 ✭✭✭[-0-]


    [-0-] wrote: »
    I also need a way to end the web_url call after 6 minutes. Using web_set_timeout("STEP",600) ends it with an error, which is not what I am after.

    This is not possible.

    Turns out I'm just going to leave the streaming sessions open for an extended period of time. I'll turn on verbose logging and parse the response time and updates separately. Better than nothing! Just an FYI in case anyone runs into this issue in the future.
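
    For anyone going down the same road, a rough post-processing sketch is below. It assumes the replay-log line layout shown earlier in the thread ("t=<n>ms: ... response body ...") and simply prints the gap between consecutive streamed updates.

    #include <stdio.h>
    #include <string.h>

    /* Scan a LoadRunner replay/output log for "t=<n>ms ... response body"
       lines and report the time between consecutive streamed updates. */
    int main(int argc, char **argv)
    {
        FILE *f;
        char line[1024];
        long t, prev = -1;
        char *p;

        if (argc < 2) {
            fprintf(stderr, "usage: %s <replay log file>\n", argv[0]);
            return 1;
        }
        f = fopen(argv[1], "r");
        if (!f) {
            perror("fopen");
            return 1;
        }

        while (fgets(line, sizeof(line), f)) {
            if (strstr(line, "response body") == NULL)
                continue;
            p = strstr(line, "t=");
            if (p && sscanf(p, "t=%ldms", &t) == 1) {
                if (prev >= 0)
                    printf("update at t=%ldms, +%ldms since previous\n", t, t - prev);
                else
                    printf("first update at t=%ldms\n", t);
                prev = t;
            }
        }
        fclose(f);
        return 0;
    }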

