09-01-2019, 16:13   #1
nacho libre
Registered User
 
Join Date: Aug 2006
Posts: 12,072
IBM

https://www.wunderground.com/cat6/IB...ecasting-Model
nacho libre is offline  
09-01-2019, 16:21   #2
gabeeg
Registered User
 
Join Date: Jan 2013
Posts: 554
Every hour?

That's me sacked
gabeeg is offline  
09-01-2019, 16:53   #3
Reckless Abandonment
Registered User
 
Join Date: Aug 2011
Posts: 636
Get ready for the 11z, 14z, 17z. Just wouldn't be the same.
But if it does what they say it will,
it could be a game changer.. looks very interesting..
Reckless Abandonment is offline  
10-01-2019, 07:35   #4
sryanbruen
Registered User
 
Join Date: Aug 2015
Posts: 9,998
I hope it doesn’t become publicly accessible, if only at the thought of it ending up in the hands of the hypers on forums like Netweather. It’s bad enough as it is.
sryanbruen is offline  
10-01-2019, 10:16   #5
Akrasia
Registered User
 
Join Date: Jan 2004
Posts: 12,636
That's really cool.

What makes it even cooler is that the model uses Quantum Computing on the new IBM Q System One platform.
It operates at 20 qubits. This has been available as a cloud service from IBM since 2017, but now they're selling commercial units to private companies and universities. If the 20-qubit service can see improvements as dramatic as this, remember that the first quantum computers were running at 5 qubits only a couple of years ago; they've already commercialised 20 qubits and are working on 50-qubit prototypes.
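As an aside on why those qubit counts matter: an n-qubit register is described by 2**n complex amplitudes, so each added qubit doubles the state space. A quick sketch (not from the article, just arithmetic):

```python
# Illustration: the state space of an n-qubit register has 2**n complex
# amplitudes, so each added qubit doubles it. The 5 -> 20 -> 50 qubit
# progression is therefore a pair of enormous jumps.
def state_space(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

for n in (5, 20, 50):
    print(n, state_space(n))
# 5  -> 32
# 20 -> 1048576
# 50 -> 1125899906842624
```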

This is another explosion in processing technology that will make ultra-fine-resolution weather models possible in the medium term, and with better resolution for real-time monitoring, longer-range forecasting becomes less inaccurate (although chaos still applies and small perturbations can still have exaggerated consequences).
Akrasia is offline  
10-01-2019, 12:27   #6
Gaoth Laidir
Registered User
 
Join Date: Dec 2015
Posts: 3,474
It seems like a bit of a marketing stunt by IBM, to be honest.

I'd have major problems with including mass data from e.g. smartphones and private stations, as these would be useless in many cases. Take the case of a smartphone barometer. Assuming it is perfectly calibrated (I know mine is a couple of hPa off), the very act of driving in a car causes major fluctuations in the reading depending on the car's speed and whether the window's open or not. But even apart from that, driving along and rising even 4 metres (about the limit of the accuracy of the phone's GPS) will reduce pressure by around 0.5 hPa, therefore the potential error due to GPS altitude and barometer errors is huge compared to a standard station. We already have plenty of those on land, it's over the oceans and in more isolated locations that we need more. The only thing worse than no data is bad data. Garbage in, Garbage out. 4DVAR has greatly improved the assimilation process of the global models.
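That ~0.5 hPa figure is consistent with the hydrostatic approximation; a quick sanity check, assuming standard sea-level density and gravity:

```python
# Sanity check on the "~0.5 hPa per 4 m" figure using the hydrostatic
# approximation dP = rho * g * dh (standard sea-level values assumed).
RHO = 1.225  # air density near sea level, kg/m^3
G = 9.81     # gravitational acceleration, m/s^2

def pressure_drop_hpa(dh_m: float) -> float:
    """Approximate pressure decrease, in hPa, for a rise of dh_m metres."""
    return RHO * G * dh_m / 100.0  # Pa -> hPa

print(round(pressure_drop_hpa(4.0), 2))  # ~0.48 hPa for a 4 m rise
```

So a 4 m GPS altitude error really is on the order of half a hectopascal, larger than the calibration error of many phone barometers.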

The fact that the resolution is non-uniform around the globe is nothing new. The ARPEGE is basically the ECMWF (~9 km) run with the highest resolution (~7.5 km) centred over France, reducing down to ~37 km on the opposite side of the globe (the antipode). It's similar with this new model: up to 3 km over land but much coarser over the oceans. This limits the usefulness of its longer-range forecasts, so a standard hi-res local model will do the same job for shorter-term forecasts of smaller systems.
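For illustration, a stretched grid like ARPEGE's can be sketched with a Schmidt-type mapping factor. The formula below is an assumed form chosen to match the quoted endpoints (~7.5 km over France and ~37 km at the antipode imply a stretch factor of roughly 2.2), not Météo-France's actual implementation:

```python
# Sketch of resolution on a Schmidt-stretched grid (assumed mapping-factor
# form). With stretch c, resolution is base/c at the high-resolution pole
# and base*c at the antipode, varying smoothly in between.
import math

def local_resolution_km(base_km: float, c: float, angle_deg: float) -> float:
    """Grid spacing at angular distance angle_deg from the stretching pole."""
    mu = math.cos(math.radians(angle_deg))
    m = (c * c + 1 + (c * c - 1) * mu) / (2 * c)  # assumed Schmidt factor
    return base_km / m

# ~7.5 km over France, ~36 km at the antipode, with c = 2.2:
print(round(local_resolution_km(16.5, 2.2, 0), 1))    # 7.5
print(round(local_resolution_km(16.5, 2.2, 180), 1))  # 36.3
```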
Gaoth Laidir is offline  
10-01-2019, 13:19   #7
Akrasia
Registered User
 
Join Date: Jan 2004
Posts: 12,636
Quote:
Originally Posted by Gaoth Laidir View Post
It seems like a bit of a marketing stunt by IBM, to be honest.

I'd have major problems with including mass data from e.g. smartphones and private stations, as these would be useless in many cases. […]
Millions of datapoints can be used by machine algorithms to generate a very 'accurate' picture despite inaccuracies on individual recording devices. Think of it like the human brain. Our brain can generate a stable image even when individual photons are phase-shifted to different colours, or reflected, diffracted and diffused off different surfaces, with changing shadows and light pollution from different sources. Each of our eyes gets a different image, which arrives upside down and has to be inverted and collated in our brain to generate a single image. In that image, noise is filtered out and specific parts get amplified based on internal rules that decide which information is more useful or important depending on the context. We do this about 30 times a second for our entire waking lives.

The smartphone data could be used to compute anomalies per geographic datapoint linked to each individual smartphone, so if person X has a phone that consistently reads higher than usual at a certain location, and that user travels this route every day, the algorithm can assign that barometer reading an ID and report only the anomaly for that device in that location. Take the average anomaly for thousands of different devices, exclude the extreme outliers and give weightings to data sources for which there is more consistent data, and you can get a picture out of the noise.
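A minimal sketch of that bias-correct-and-aggregate idea, with all names and numbers hypothetical (in practice the per-device offsets would be learned from each device's history against trusted references):

```python
# Hypothetical sketch: correct each device's reading by its learned bias,
# drop the extreme outliers, then average what's left.
from statistics import mean

def crowd_estimate(readings, biases, trim=1):
    """readings: {device_id: raw value}; biases: {device_id: learned offset
    vs. a trusted reference at this location}. Unknown devices get bias 0."""
    corrected = sorted(raw - biases.get(dev, 0.0)
                       for dev, raw in readings.items())
    if trim and len(corrected) > 2 * trim:
        corrected = corrected[trim:-trim]  # exclude extreme outliers
    return mean(corrected)

# Device 'd' is wildly off; the trimmed mean ignores it.
readings = {'a': 1015.0, 'b': 1012.0, 'c': 1013.0, 'd': 1030.0}
biases = {'a': 2.0, 'b': -1.0, 'c': 0.0}
print(crowd_estimate(readings, biases))  # 1013.0
```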

Of course, that takes enormous amounts of processing power, but that's the point of this machine. It can take trillions of pieces of data each day and use them in its model.

We have already seen how social media data analysts can take users' personal data and build profiles of each user (billions of people) that identify their personality and preferences better than the people they're in daily contact with can.

With smartphone sensor data, it's taking only a couple of basic location and sensor datapoints and logging them. It's a much less complex problem than identifying what people are like based on which posts they share or pictures they like (although still very complex).

Anyway, a system like this would use decentralised, unverified data to augment and support its service, which would be primarily based on verifiable station networks, rather than as a primary data source, at least until there are suitable studies to verify the accuracy of that data.
Akrasia is offline  
10-01-2019, 13:53   #8
Sparks
Moderator
 
Join Date: Apr 2003
Posts: 38,091
Quote:
Originally Posted by Akrasia View Post
Millions of datapoints can be used by machine algorithms to generate a very 'accurate' picture despite inaccuracies on individual recording devices.
But that's (a) not what the new quantum machine they're selling is designed for; (b) not usually true. The norm the vast, vast, vast majority of the time is "garbage in, garbage out".

Quote:
Think of it like the human brain. Our brain can generate a stable image even when individual photons are phase shifted to different colours, or reflected diffracted and diffused off different surfaces etc and there are changing shadows and light pollution from different sources, and each of our eyes gets a different image which arrives upside down and has to be inverted and collated in our brain to generate a single image, and in that image, noise is filtered out and specific parts of the image get amplified based on internal rules that decide which information is more useful or important depending on the context. We do this about 30 times a second for our entire waking lives.
So, that's not how photons work, that's not how human eyes work, that's not how the human visual system works insofar as we currently understand it (at all) and we don't have a single static "refresh rate" on our visual system (it's more distributed than that) but we can detect things that are visible for only 16ms which would give an approximate rate of 60hz for detection (but we can tell the difference between scenes changing at 60hz and scenes changing at 100hz and at higher frequencies, as well as doing a bunch of things that directly imply varying "refresh rates" for different parts of the system, so it's not really a useful number for characterising the human visual system).

Quote:
Of course, that takes enormous amounts of processing power, but that's the point of this machine. it can take trillions of pieces of data each day and use it in it's model
That's not really the point of quantum computing. It's not so much about being able to handle large amounts of data, it's about being able to tackle specific kinds of calculations which with existing machines are prohibitive in cost (where cost is in terms of time rather than money). NP problems, basically. Going from "we can't calculate this within the lifetime of the known universe" to "we can probably calculate this within a finite time".
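As a toy illustration of the kind of speed-up meant here (idealised query counts, nothing specific to IBM's hardware): brute-force search over n-bit inputs costs about 2**n classical trials, while Grover's algorithm needs on the order of the square root of that. It's a quadratic rather than exponential saving, but it shows how the cost model shifts.

```python
# Idealised query counts: exhaustive classical search vs. Grover's
# quadratic speed-up (~sqrt(2**n) oracle queries, constants ignored).
import math

def classical_trials(n_bits: int) -> int:
    return 2 ** n_bits

def grover_queries(n_bits: int) -> int:
    return math.isqrt(2 ** n_bits)

for n in (20, 40, 60):
    print(n, classical_trials(n), grover_queries(n))
```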

In terms of handling large amounts of data, the approaches being developed for monitoring experiments at CERN or the Square Kilometre Array are a lot more advanced, because they're looking at far more data (CERN generates about a petabyte of data per second during LHC runs, and the SKA will be around an exabyte of data per day).
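Quick arithmetic on those two rates (decimal units assumed):

```python
# Back-of-envelope: CERN at ~1 PB/s during runs vs. SKA at ~1 EB/day,
# using decimal units (1 PB = 1e15 B, 1 EB = 1e18 B).
SECONDS_PER_DAY = 86_400
cern_Bps = 1e15                    # ~1 PB/s burst during LHC runs
ska_Bps = 1e18 / SECONDS_PER_DAY   # ~1 EB/day sustained
print(round(ska_Bps / 1e12, 1))    # SKA sustained: ~11.6 TB/s
print(round(cern_Bps / ska_Bps))   # CERN's burst rate is ~86x the SKA's
```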

Quote:
We have already seen how social media data analysts can take users personal data and build profiles of each user (billions of people) that can identify their personality and preferences better than people who they are in daily contact with can.
That is not the same thing and is also not really true, though it was a very popular clickbait headline a while back.
Sparks is offline  
10-01-2019, 14:00   #9
gabeeg
Registered User
 
Join Date: Jan 2013
Posts: 554
Quote:
Originally Posted by Sparks View Post
That is not the same thing and is also not really true, though it was a very popular clickbait headline a while back.
Eh Brexit
gabeeg is offline  
10-01-2019, 14:05   #10
Calibos
Registered User
 
Join Date: Feb 2002
Posts: 7,123
Quote:
Originally Posted by Sparks View Post
But that's (a) not what the new quantum machine they're selling is designed for; (b) not usually true. The norm the vast, vast, vast majority of the time is "garbage in, garbage out". […]
What does literally everyone have now that doesn't move that is guaranteed to be connected to the internet??

A Wifi Router.

Get manufacturers to install the required hardware and have an opt out instead of opt in for sending the data back to base.
Calibos is offline  
10-01-2019, 14:13   #11
gabeeg
Registered User
 
Join Date: Jan 2013
Posts: 554
Quote:
Originally Posted by Calibos View Post
What does literally everyone have now that doesn't move that is guaranteed to be connected to the internet??

A Wifi Router.

Get manufacturers to install the required hardware and have an opt out instead of opt in for sending the data back to base.
I don't have a wifi router. I've been using my phone as a hotspot for the last year or so.
Reason being that I typically get 20 Mbps+, which is faster than your average wifi router.

This is more of a money-saving tip than an argument against what you're suggesting.
gabeeg is offline  
10-01-2019, 14:21   #12
Sparks
Moderator
 
Join Date: Apr 2003
Posts: 38,091
Quote:
Originally Posted by Calibos View Post
What does literally everyone have now that doesn't move that is guaranteed to be connected to the internet??

A Wifi Router.

Get manufacturers to install the required hardware and have an opt out instead of opt in for sending the data back to base.
That would be a tremendously terrible idea from the point of view of computer security. A single back door, in every single wifi router in the world (and I don't know how you'd even arrange that - just look at the current problems Huawei are having in the US in regard to security concerns), shipping data to a single central spot from your devices. That would be a nightmare to try to do securely, and I'm not sure the payoff is even well-defined, let alone proven.

There are things that might look like this (like, say, LoRa sensor nets) but those are different in key aspects (like being opt-in, having dedicated sensors and devices and hubs, having dedicated channels and so on).

Last edited by Sparks; 10-01-2019 at 14:24.
Sparks is offline  
10-01-2019, 23:37   #13
Akrasia
Registered User
 
Join Date: Jan 2004
Posts: 12,636
Quote:
Originally Posted by Sparks View Post
But that's (a) not what the new quantum machine they're selling is designed for; (b) not usually true. The norm the vast, vast, vast majority of the time is "garbage in, garbage out". […]
Well, I got served
Akrasia is offline  
11-01-2019, 00:42   #14
pistolpetes11
Moderator
 
Join Date: Sep 2008
Posts: 6,791
IBM had a stand at CES today, but upon asking a few very basic questions it was clear there were no answers to be had.

All looked great, will be interesting to see the results.

Attached Images
File Type: jpg IMG_1510.JPG (2.43 MB, 327 views)
pistolpetes11 is offline  
11-01-2019, 01:16   #15
Gaoth Laidir
Registered User
 
Join Date: Dec 2015
Posts: 3,474
Quote:
Originally Posted by pistolpetes11 View Post
IBM had a stand at CES today, but upon asking a few very basic questions it was clear there were no answers to be had.

All looked great, will be interesting to see the results.

Without knowing if there were slides before that one, I see they conveniently left out the fact that the ECMWF is down to 9 km resolution. Not able to answer basic questions. It's an idea in its infancy, unproven as yet but still being marketed as the dog's ball locks. I see the same in my line of work: unfounded claims of greatness.

And for that reason, I'm OUT!
Gaoth Laidir is offline  