Hard Light Productions Forums
Off-Topic Discussion => General Discussion => Topic started by: Rick James on October 29, 2008, 05:57:34 pm
-
What the hell. (http://news.bbc.co.uk/2/hi/technology/7693993.stm)
I, personally, can't see this taking off. I like my data right where it is: on my freaking hard drive.
-
Google has similar plans
-
I welcome the ability to easily share things on the web, but if my connection goes down I want to be able to use my computer.
I refuse to store everything on the internet, although I'm sure groups like the RIAA would just love it if everybody's files were on one group of servers.
-
This sort of thing is stupid... Computers should all come with four hard drives:
two for programs and two for data.
-
Was talking about this in University today oddly enough. Digital Data security has proved to be ropey enough already. There's no way on Earth I would willingly choose to store personal or private data/information outside my control.
There's nowhere near enough faith in the ability of the IT sector to keep data secure, and Quantum Encryption is a long way from being a viable option that would allow anything of this sort to happen; for a start, you'd have to replace all the copper networking in the world with optical.
-
Why would I want to store everything on the internet? It sounds like it would be easier to hack, but I could be wrong about that. And what if my internet goes down? Well, I can't use my computer at all, because everything is stored on the internet. It would be just one big paperweight until I get my internet back.
-
Hang on a moment, are they saying you'll be forced to store your data on the 'net, or is it simply an option? If the latter, I see no reason to get all worked up about this tech.
If the former, though, I agree with the general outrage.
-
Hang on a moment, are they saying you'll be forced to store your data on the 'net, or is it simply an option? If the latter, I see no reason to get all worked up about this tech.
The problem is that big business, and in particular consumer electronics, tends to be dictated by whatever the latest OS is. We won't be outright forced to adopt the new technology, but what happens if there comes a point when, for the sake of business or the need to use a specific application, we have no choice but to adopt 'Windows Azure'?
-
Armageddon comes.
EDIT: no, seriously.
-
No way I'm putting everything in that maze. Maybe certain things that you'd put on file hosting sites, but other than that, it's leading to a world where there's one massive leader thingy that monitors everyone when they do the wrong thing... like in those stories...
-
Dear god... imagine how long it'd take to load and save some of my psd files :eek2:
-
Computing in general has become ridiculously dependent on the internet, going as far as having applications you can't legally use without an internet connection (activation-type setups). A computer should be what a computer is supposed to be. I always considered the internet just another add-on, but lately, if you run a computer, the internet is practically a requirement.
This essentially leads me to believe that the software companies want total control over everything we do as far as computing goes. They want to restrict us to certain formats that we cannot rely on, they want to know every program we run, and they want our money. That's what this is about. Of course, all the stupid people will flock to the new system, not knowing that they're being screwed in the ass. It's total geekacide, man.
-
And that sort of super-interconnectedness is a little foolhardy, I think. The Internet is important, but we should never rely on just one technology. It's robust, it's powerful, but it's not totally impervious. Even if the Internet is still up but critical services and routing are down or compromised... then what happens? There need to be good fallback options.
-
With WiMax on the horizon, some possibilities exist, but overall the architecture needs to change.
Magnetic RAM would have to go mainstream, so the OS would pretty much exist the way a BIOS does now. Booting up takes you right back to where you left off, and programs run in their own space without ever writing to the structure of the OS. Energy consumption is cut dramatically, and the size of the system is reduced as well.
New hardware standards leading toward open manufacturing portability would also have to come into play to give it a chance in h377 of succeeding. Too many manufacturers are so reliant on proprietary tech that they are really insulting their customers and killing themselves off. At the same time, the volatility of tech makes it difficult for standards to even be set before obsolescence occurs. Some stability in those standards is the only hope software development has of actually producing stable environments as well.
-
What about sending or receiving files from whatever storage space is in existence? I've got some pretty big files on my computer right now. If they're located on some server and I need to, say, download a DVD image for a project I'm working on, how long will it take to download? The best plan my ISP provides (which I have) offers a max 300 kbps download speed; I can only reliably get half of that. I can't afford to wait nine hours or more.
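Back-of-the-envelope, the nine-hour figure does check out, if we assume the poster's "300 kbps" plan actually means kilobytes per second (common ISP marketing at the time) and that only half of it is reliably available:

```python
def download_hours(size_bytes, rate_bytes_per_sec):
    """Idealized transfer time in hours, ignoring protocol overhead and stalls."""
    return size_bytes / rate_bytes_per_sec / 3600

dvd_image = 4.7e9       # single-layer DVD image, about 4.7 GB
half_rate = 150 * 1000  # half of a nominal 300 kB/s plan, in bytes/s

print(round(download_hours(dvd_image, half_rate), 1))  # 8.7
```

So fetching one DVD image from remote storage eats most of a working day on that connection, and that's before any retries or contention.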
With a hard drive, I can have the disk image right freaking there and I won't have to dick around with some server for hours on end. This technology assumes that we have internet technology a lot faster than what we have now. We do not. I don't imagine businesses would be willing to invest God knows how many millions of dollars to upgrade their internet connection and probably purchase all-new equipment from an untested OEM.
Azure needs to be killed and it needs to be killed now.
-
What about sending or receiving files from whatever storage space is in existence? I've got some pretty big files on my computer right now. If they're located on some server and I need to, say, download a DVD image for a project I'm working on, how long will it take to download? The best plan my ISP provides (which I have) offers a max 300 kbps download speed; I can only reliably get half of that. I can't afford to wait nine hours or more.
With a hard drive, I can have the disk image right freaking there and I won't have to dick around with some server for hours on end. This technology assumes that we have internet technology a lot faster than what we have now. We do not. I don't imagine businesses would be willing to invest God knows how many millions of dollars to upgrade their internet connection and probably purchase all-new equipment from an untested OEM.
Azure needs to be killed and it needs to be killed now.
Precisely.
-
The concept of a "local copy" is not going to disappear. For files that need to be reliably at hand, a local copy on a hard disk drive or data crystal or whatever media is used is never going to be overtaken by remote storage. Same with local processing vs. remote processing... But I can imagine remote storage and processing being useful in some cases; in fact, some things already use a setup like this.
Consider Opera Mini, a relatively popular phone browser. What it actually does is connect to Opera's server and send the URL, which is then processed by Opera's server hardware into a form that is sent to the phone, which then shows the data. Its advantages are that the phone doesn't need to do as much page processing, so loading times are reduced (in theory anyway). The disadvantage is that it adds a third party to the connection, so if you want a secure connection, it might not be appealing to you.
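The Opera Mini idea can be sketched in miniature: a server-side proxy fetches the full page, strips it down, and ships the phone a much smaller payload. This toy `condense` function is just an illustration of the principle, not Opera's actual pipeline (their servers render pages into a proprietary compact format before sending them on):

```python
import re

def condense(html: str) -> str:
    """Toy server-side pass: drop scripts/styles, strip tags, collapse whitespace."""
    html = re.sub(r'(?s)<(script|style)[^>]*>.*?</\1>', '', html)
    text = re.sub(r'<[^>]+>', ' ', html)
    return re.sub(r'\s+', ' ', text).strip()

page = ('<html><head><style>body { margin: 0 }</style></head>'
        '<body><h1>News</h1><p>Phone-sized  payload</p></body></html>')
small = condense(page)
print(small)  # News Phone-sized payload
```

The phone then only has to display plain text instead of parsing and laying out the full page, which is where the (theoretical) speedup comes from.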
Also, remote processing might provide an advantage for cases where the local processing capacity is limited...
But wait, is any of this actually new technology? SSH has existed for a long time already. The only thing that limits the usability of remote console and desktop solutions is transfer speed. Using X11 tunneling with SSH, I can already do this stuff on my university account; I can store data there and use any software installed on the university computers that I have access to. The reason I don't is that with my current internet connection, the GUI gets laggy. The data storage has come in useful on occasion; having a public_html folder is pretty handy for some limited file/image hosting.
But, as said, for confidential, private data and cases where reliability is needed, a local copy will always be the preferred solution. As for remote processing... I can't think of applications where normal users would need more processing power. Current PCs are not going to get any slower either, and professional/scientific simulation runs will be done on dedicated environments anyway.
So I kinda fail to see the point. All they are offering is a way for users to store casual, meaningless crap on their servers, which I can't understand to be honest.
Then again, if it proves reliable and secure (I have my doubts seeing how it's Microsoft, and Google is starting to give me shivers anyway...), it could be utilized as an SVN storage space for projects such as FSUpgrade or any mods/tc's with large enough staff for it to be meaningful.
-
The idea of "subscription applications" is just another way for companies to make more money.
-
+1 to Herra Tohtori for posting the first freakin' sensible post in this topic. The rest of you need to stop being so hysterical. I for one welcome use of internet based services as an option to local.
-
+1 to Herra Tohtori for posting the first freakin' sensible post in this topic. The rest of you need to stop being so hysterical. I for one welcome use of internet based services as an option to local.
As I said earlier, there's no way I would choose to store personal data on an Internet source until there is something like Quantum Encryption to keep it safe. So before you start accusing people of being hysterical, stop jumping to conclusions yourself.
-
Awwww come on I love being hysterical :p
-
Calm it people.
End of the day, we don't have to use it.
-
Agreed, I'm not angry, I just hate it when people generalise and start saying that only the people that they agree with are 'sensible', and that anyone who has a concern is 'hysterical' :p
-
Everyone's got a right to have a point of view.
This topic seems a bit diluted now. My $0.05 (no cent key on my keyboard :p) is that of the above post. I'll give this a day to re-organise into a civil discussion.
-
The first thing I saw was this line:
The aim is to allow developers to build new applications which will live on the internet, rather than on their own computers.
And I could only :mad: and :sigh: with :doubt:. And not in a 'minor' way either - more like that worrisome feeling of an impending apocalypse, etc etc etc and stuff of that sort...
Oh well...
-
Office servers do this already.
Microsoft's just proposing it on a larger scale, eventually. I doubt it will be a forced concept though. Much like .swf over Java.
-
Well at that point in time when I saw the quote, I was thinking about "normal" software (i.e. the kind you pick up from a shelf, pay, and should own).
Yep that's about it :doubt: :sigh:
-
Essentially, creating Thin Client / Mainframe based software is definitely a long-time proven technology.
Task specific applications can always work with this, but not the operating system in its entirety.
-
If you want to actually check out some cool cloud computing, then go here (http://eyeos.org/en/) and check out eyeOS. It's an internet operating system. The cool thing is that you can download it and install it on your own server as you wish.
A different idea from eyeOS: buying tons of computers with no hard drives and one uber-powerful server hosting an OS, and having everyone boot from LAN. It's a one-time setup, with all of the computers set up to boot from LAN; then maintenance would only need to occur on the server hosting the OS. Great for computer labs, net cafes, etc.
With eyeOS, what you could do is create your own server on your own network and connect to it from anywhere in the world from your web browser. Sort of like storing your files on the net, except not: it's on your network. It's very similar to logging onto a computer via SSH and using a remote desktop viewer, just that the web-browser variant would only require a web browser, which all computers have out of the box, as opposed to SSH, VPN, or VNC.
Internet OSes and internet-based software like Google's browser-based word processor are handy stuff, especially if you go out of the country. You can just use any computer and log in to your internet desktop from a browser. But, like everyone says, you don't want your data where you can't control it. I'd only ever store my data on a local computer, or over the internet by VPNing into my own network.
-
I've worked in those thin client OS labs before; the productivity level was crap during class times due to volume.
-
The thing I'm not keen on with these is that (if I understand correctly) doesn't the network speed determine how fast programs load, effectively limiting processor potential?
-
Also the server's capacity for connection pooling, data caching, and client memory for the data volumes, and again the real key element: bandwidth. Driving in reverse down a freeway is possible. Is it practical?
-
^No. but pretty cool!
-
The thing I'm not keen on with these is that (if I understand correctly) doesn't the network speed determine how fast programs load, effectively limiting processor potential?
No. How about the reverse of this: does your fast connection to the net limit your processor speed and potential?
I've worked in those thin client OS labs before; the productivity level was crap during class times due to volume.
I guess a much more powerful server should have been used.
This type of OS is part of the future of cloud computing and is already popular and in use, mainly with Google putting this stuff to good use.
It's very funny how, in this discussion, because people don't like the idea of an internet OS, the explanation people reach for is that servers = the suck.
-
The thing I'm not keen on with these is that (if I understand correctly) doesn't the network speed determine how fast programs load, effectively limiting processor potential?
No. How about the reverse of this: does your fast connection to the net limit your processor speed and potential?
I've worked in those thin client OS labs before; the productivity level was crap during class times due to volume.
I guess a much more powerful server should have been used.
This type of OS is part of the future of cloud computing and is already popular and in use, mainly with Google putting this stuff to good use.
It's very funny how, in this discussion, because people don't like the idea of an internet OS, the explanation people reach for is that servers = the suck.
It should be working only as fast as the data's coming in. If the connection is slow, you're screwed; you can't calculate data you don't have.
Oh yeah, Remote Desktop Viewers are for pussehs. Terminals are teh win. :P
-
My guess is that you would get all the data first, then calculate it.
-
I guess a much more powerful server should have been used.
This type of OS is part of the future of cloud computing and is already popular and in use, mainly with Google putting this stuff to good use.
It's very funny how, in this discussion, because people don't like the idea of an internet OS, the explanation people reach for is that servers = the suck.
It depends on where the load is passed, the application's scope of use, and the volume of data that can be passed.
If everyone is on a T-1 but the load is 10,000 users, what happens to the trunk?
-
Ummm... You guys realize that the Azure project is for Global Intranet purposes, right? As in PRIVATE network?
It's sad that so many reliable news sources seem to always get things like this wrong. What's even sadder is that their subscribers actually believe it.
Now excuse me, I have to become paranoid at the idea behind patenting page up and page down keys :rolleyes:.
Seriously. Get a clue.
-
LOL, I'll have to read more about it, I'm more excited about magnetic RAM and WiMax though.
-
Azure? Pfft, they've already got the Sky (http://skydrive.live.com) and I'm using it to store my FreeSpace stuff.
Besides, they've got a lot of Lives running parallel to each other already (http://en.wikipedia.org/wiki/Windows_Live#Windows_Live_services). :rolleyes:
-
The thing I'm not keen on with these is that (if I understand correctly) doesn't the network speed determine how fast programs load, effectively limiting processor potential?
No. How about the reverse of this: does your fast connection to the net limit your processor speed and potential?
If I'm playing FS on a low-speed connection and it has to load all the sounds, textures, .pofs, etc., it'll take a damn age for the game to load, is what I meant :p <example>
-
If all of the game loading happens on your computer and the other person's computer, then the only thing that would be slow is a high ping, on dialup or something. The game would load and run fine; you'd just be afflicted with a high ping due to not having a big enough pipe on your internet. Any pausing and stuttering would also be from ping on a slow connection, or rather from any game server you connect to that has a high ping.
-
If all of the game loading happens on your computer and the other person's computer, then the only thing that would be slow is a high ping, on dialup or something. The game would load and run fine; you'd just be afflicted with a high ping due to not having a big enough pipe on your internet. Any pausing and stuttering would also be from ping on a slow connection, or rather from any game server you connect to that has a high ping.
We were assuming you'd have to load the data from the 'master' server.