James
Member
Registered: 1st Jun 02
Location: Surrey
User status: Offline
|
I am testing the performance of an application.
I am testing with 1 user, 25 users, and 50 users. I am capturing the response time for these.
Is there a way I can use these results to predict what the response time will be for up to 1000 users?
Thanks
|
Whittie
Member
Registered: 11th Aug 06
Location: North Wales Drives: BMW, Corsa & Fiat
User status: Offline
|
Lost me at application
|
Tommy L
Member
Registered: 21st Aug 06
Location: Northampton Drives: Audi wagon
User status: Offline
|
Guesstimation?
|
James
Member
Registered: 1st Jun 02
Location: Surrey
User status: Offline
|
To keep it simple: you know at the bottom of CS where it says 27 database queries in 0.0212212 seconds?
That's what I am capturing.
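For anyone unfamiliar with that footer, here is a minimal sketch of the kind of per-page query counter and timer being captured (hypothetical names; this is not the real CS code, just the idea):

```python
import time

class QueryTimer:
    """Accumulates the count and total elapsed time of database queries for one page."""

    def __init__(self):
        self.query_count = 0
        self.total_seconds = 0.0

    def timed_query(self, run_query, *args):
        # Run one database query and add its elapsed time to the page total.
        start = time.perf_counter()
        result = run_query(*args)
        self.total_seconds += time.perf_counter() - start
        self.query_count += 1
        return result

    def footer(self):
        # Produces a line like "27 database queries in 0.0212212 seconds".
        return f"{self.query_count} database queries in {self.total_seconds:.7f} seconds"
```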
|
Tommy L
Member
Registered: 21st Aug 06
Location: Northampton Drives: Audi wagon
User status: Offline
|
after a while you would see a pattern
|
Whittie
Member
Registered: 11th Aug 06
Location: North Wales Drives: BMW, Corsa & Fiat
User status: Offline
|
Why?
|
James
Member
Registered: 1st Jun 02
Location: Surrey
User status: Offline
|
quote: Originally posted by Whittie
Why?
It's for uni. I'm investigating how caching and the number of users affect the performance of a website, but I don't have the infrastructure to test with more than 50 users at once, so I need to make mathematical predictions.
|
Whittie
Member
Registered: 11th Aug 06
Location: North Wales Drives: BMW, Corsa & Fiat
User status: Offline
|
Ah ok.
|
Steve X16XE
Member
Registered: 31st Dec 06
Location: Barnsley, South Yorkshire
User status: Offline
|
1 user = something, x1000 = X
25 users = something, x40 = Y
50 users = something, x20 = Z
Compare your answers, then add them up and divide by 3...........???
|
James
Member
Registered: 1st Jun 02
Location: Surrey
User status: Offline
|
Don't really understand what you mean TBH.
|
Steve X16XE
Member
Registered: 31st Dec 06
Location: Barnsley, South Yorkshire
User status: Offline
|
For 1 person your answer is 3 (making these numbers up).
So 3 (what that one guy gets) x 1000 = 3000.
For 25 people your answer could be 78 (again making these numbers up).
So 78 x 40 (so it equals 1000 people) = 3120.
For 50 people your answer could be 149 (again making these numbers up).
So 149 x 20 (so it equals 1000 people) = 2980.
Now add them up: 2980 + 3120 + 3000 = 9100.
Now 9100 / 3 (because you have done 3 sums) = 3033 on average for 1000 people.
So ignore the numbers, look at the workings. Easy.
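A minimal sketch of those workings, using the same made-up numbers, so the method is explicit: scale each measurement linearly up to 1000 users, then take the mean of the three estimates.

```python
# Made-up measurements, as in the post: users -> measured response figure.
measurements = {1: 3, 25: 78, 50: 149}
target_users = 1000

# Scale each measurement linearly up to the target user count...
estimates = [value * (target_users / users) for users, value in measurements.items()]

# ...then average the three estimates (the arithmetic mean).
prediction = sum(estimates) / len(estimates)
print(estimates)   # [3000.0, 3120.0, 2980.0]
print(prediction)  # 3033.33... on average for 1000 people
```

Note this assumes response time grows in a straight line with the number of users, which is the assumption Ian questions further down.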
|
James
Member
Registered: 1st Jun 02
Location: Surrey
User status: Offline
|
Oooohhh, I see. Does that technique have a name?
I did A-level maths but I've forgotten everything.
Thanks.
|
Steve X16XE
Member
Registered: 31st Dec 06
Location: Barnsley, South Yorkshire
User status: Offline
|
I got a C after my 2nd resit (GCSE that is) 
It's called "Jam Master Danger Steve"
|
James
Member
Registered: 1st Jun 02
Location: Surrey
User status: Offline
|
Did you make it up?
|
Steve X16XE
Member
Registered: 31st Dec 06
Location: Barnsley, South Yorkshire
User status: Offline
|
No, it's a proper thing. Remember "mean, median and mode"?
Something to do with that. :big smile:
|
John
Member
Registered: 30th Jun 03
User status: Offline
|
Do you not get most of your course based on doing maths like that?
That's all the shit that was in my computing degree.
I don't know how to do it though because I hate maths.
|
Steve X16XE
Member
Registered: 31st Dec 06
Location: Barnsley, South Yorkshire
User status: Offline
|
I only liked Maths and Science. They were the only ones that I got a "C" in.
Everything else was a "D".
|
James
Member
Registered: 1st Jun 02
Location: Surrey
User status: Offline
|
I've not studied it at uni, but I do remember it vaguely from college.
If anyone can remember what this technique is called I will love them forever
|
MikeD
Member
Registered: 18th Aug 02
Location: Whittlesey, Cambridgeshire
User status: Offline
|
It's the mean: add them all up and divide by the number of answers you have, e.g. 3.
|
Ian
Site Administrator
Registered: 28th Aug 99
Location: Liverpool
User status: Offline
|
If it's web stuff, can you not use a batch downloader to hammer it?
wget and a few batch files would do this?
Also depends massively on whether you have linear growth on your response times.
10 users may take 10 times as long to be served.
100 users may take 100 times as long
1000 users may cause the server to queue some requests and start using virtual memory, which would cause it to take much longer.
If you don't want it that complicated then it's just a very simple y = mx + c straight line. Truth is it probably increases gradually at first, then steadies off as you get efficiency through caching, keep-alives and the like, then increases sharply when you start swapping. Maybe some variant of y = x^3.
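A minimal sketch of that point, with made-up response times: fit a straight line (y = mx + c) and a curved model to the same three measurements and see how far apart the predictions end up. Three data points can't pin down anything like y = x^3, so this only contrasts shapes; it isn't a real prediction.

```python
import numpy as np

users = np.array([1, 25, 50])
response_ms = np.array([120, 180, 260])  # hypothetical measured response times

linear = np.poly1d(np.polyfit(users, response_ms, 1))     # y = m*x + c
quadratic = np.poly1d(np.polyfit(users, response_ms, 2))  # exact fit through the 3 points

for n in (100, 500, 1000):
    print(f"{n:>4} users: {linear(n):8.0f} ms (linear) vs {quadratic(n):8.0f} ms (quadratic)")
```

The point is that the shape you assume dominates the answer once you extrapolate well past the data you actually have.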
|
Eck
Premium Member
Registered: 17th Apr 06
Location: Lundin Links, Fife
User status: Offline
|
What he said ^^^
|
Ian
Site Administrator
Registered: 28th Aug 99
Location: Liverpool
User status: Offline
|
quote: Originally posted by Steve X16XE
No, it's a proper thing. Remember "mean, median and mode"?
Something to do with that. :big smile:
Mean is the average of all the data. In a straight line graph, the gradient of the line.
You wouldn't need to faff about with different loadings, as by definition you are saying there that load doesn't actually affect response in anything other than a linear way. So why bother wondering what the effects of lots of load are? It just raises your mean. Plus you don't have data for the high end.
Median is the middle value, which again is no good as you only have data from the lower end, and therefore only quick response times. Even factored up they're probably too quick to scale well. Median is only really useful for getting a feel for the spread of the data, i.e. are tall people or short people more common in a room full of people? Is your median closer to tall or short?
Mode is the most frequently occurring value. Depending on how many decimal places you use, you might not even have one of these. The CS page timer generates identical page times very rarely indeed, given that it's not really rounded, therefore the mode could land basically anywhere it happens to happen. I wouldn't rely on this. You need discrete data for it.
In short, you'll need to try harder than that if you don't want to look like you guessed the answer from what little you remember about year 9.
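For completeness, a minimal sketch of the three averages over a small hypothetical sample of page times (in seconds), showing why the mode falls apart on unrounded timings:

```python
import statistics

page_times = [0.0212212, 0.0234891, 0.0198770, 0.0251003, 0.0219457]

print("mean:  ", statistics.mean(page_times))       # the average of all the data
print("median:", statistics.median(page_times))     # the middle value
print("modes: ", statistics.multimode(page_times))  # every value appears once, so no useful mode
```

(`statistics.multimode` needs Python 3.8 or newer.)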
|
James
Member
Registered: 1st Jun 02
Location: Surrey
User status: Offline
|
Ian, I'm using Microsoft Application Center Test to simulate the users. I can simulate as many users as I like, other than the fact that it's hosted on a local machine and it crashes under even a small load.
|
Ian
Site Administrator
Registered: 28th Aug 99
Location: Liverpool
User status: Offline
|
Which is realistic.
So you need a big site on a big server? 
I've got data here that might be useful for talking about larger-scale stuff, except because it's live we don't generally ever reach the crash bit anyway. We normally spend some money on new gear before the daily average gets too silly.
What type of crash are we talking? Machine death or just big responses like you want?
|
James
Member
Registered: 1st Jun 02
Location: Surrey
User status: Offline
|
Well, I've already developed the application that I'm testing; that was part of the project. Instead of hosting the application, I tested it on a local private network to minimise any extraneous variables.
Unfortunately I could only run up to 50 users before I was getting .NET "server too busy" errors etc.
That's why I was hoping to show the results I captured, and then go on to predict what the results might be for 200, 500 and 1000 users, for example.
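In case a home-made load generator helps (along the lines of Ian's wget suggestion), here is a minimal sketch in Python; the URL and request counts are placeholders, and a real run would need the application reachable on the test network:

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost/testapp/default.aspx"  # hypothetical application URL

def fetch_once(_):
    # Request the page once and return the response time in seconds.
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=30) as response:
        response.read()
    return time.perf_counter() - start

def run_load(concurrent_users, requests_per_user=10):
    # Hammer the URL from `concurrent_users` worker threads and average the timings.
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        times = list(pool.map(fetch_once, range(concurrent_users * requests_per_user)))
    return sum(times) / len(times)

if __name__ == "__main__":
    for users in (1, 25, 50, 100):
        print(users, "users: average", round(run_load(users), 4), "seconds")
```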
|