Author Topic: 2010 will not be the "GNU/Linux on the desktop" year  (Read 9472 times)


Offline CP5670

  • Dr. Evil
  • Global Moderator
  • 212
Re: 2010 will not be the "GNU/Linux on the desktop" year
I use Windows because it works with my programs. I only care about the OS in terms of how it runs programs, and I don't want the OS to be something that I "use" as such. :p Even if there was any advantage to Linux, I probably wouldn't switch because of the inferior program compatibility.

I have only used Linux on various university and work computers and they all had a variety of annoyances (chopped up fonts, permission settings and sudo needed for simple things, mouse cursor has to be in window to scroll with keyboard, etc.), but it's possible that they were just badly configured. My brother recently tried out Ubuntu, more as a fun exercise than anything else, and says that the basic OS install is fairly smooth but getting small programs working (he was trying Hamachi) is a huge waste of time, especially if you need to compile them.

From my point of view, the one thing Linux might be good for is running Wine. Wine can run some old games that don't work with modern video card drivers or DirectX versions, and there isn't really a working port of Wine for Windows right now.

Quote
Linux is perfectly fine for what it is. An OS for computer techies.

For all they say about this, my impression is that support for modern hardware is lacking, both from hardware companies and from third-party developers. As far as I know, there are no Linux versions or equivalents of Rivatuner, nHancer, the Logitech mouse and keyboard profile utilities, and many other small system tools that I rely on heavily. I believe things like SLI and CF are also spotty on Linux.

Quote
Speaking of the command line, while it's not the thing people like to learn, it is a very powerful tool.

I honestly don't see what CLI is good for in regular usage, given an always-open file manager program, a high sensitivity mouse and disabled GUI animations. I occasionally use it on the school machines because you seemingly have to in order to get simple things done. In Windows, I have only ever found it useful for special batch processing jobs (such as with nvDXT), which I might need to do once in 6 months. If I need to change some obscure OS or program setting, it's going to be somewhere in the registry or in some ini file.

 

Offline blackhole

  • Still not over the rainbow
  • 29
  • Destiny can suck it
    • Black Sphere Studios
Re: 2010 will not be the "GNU/Linux on the desktop" year
Quote
All realistic options left have been considered because they pay people to consider that

And sometimes those people are wrong.

Quote
Linux is perfectly fine for what it is. An OS for computer techies.

No, it's a piece of sh!t that's on fire. And it's starting to stink. The requirement of a command line is one of the most idiotic things in Linux, and it is also the one thing that braindead community will not give up. I AM A PROGRAMMER AND I HATE THE COMMAND LINE. Linux isn't being accepted by the masses because the driver support is bull****, the GUI is crap, the whole thing doesn't work out of the box, and Ubuntu yells at you for using proprietary software, which is just as annoying as Vista's security features! Not only that, but WINE isn't perfect and there are practically no games that work on Linux, no thanks to the fact that OpenGL is a piece of crap in its own regard. The entire open-source community right now is full of crap, crap, and more crap, and that's why it's not getting anywhere. Does Windows suck? Sure, Windows sucks too, but it sucks less and in a friendlier way, so people will default to the least-sucky piece of software.

For that matter, all software in existence, except for possibly Notepad++, OneNote and a few others, sucks horrendously in its own lovely way. All we're doing right now is arguing about which version of Suck we want to use.
« Last Edit: November 30, 2009, 11:57:07 pm by blackhole »

 

Offline Pyro MX

  • 29
  • Frenadian
    • Pyro MX's Art
Re: 2010 will not be the "GNU/Linux on the desktop" year
Thank you blackhole.

I had a paragraph written here to reply. But I erased it all. You know why?

I am braindead, an asshole, and I'm full of crap. Since that has been established, I am therefore not qualified to program anything, and certainly not to discuss such matters. I'll just stop all contribution to whatever is remotely open-source, since there is apparently no way it is ever going to get better. All we do is bull****. All our efforts are worthless. End of discussion.

Now go figure why things progress slowly and why things don't work the way you want while you continue to suck the motivation out of the developers who make the effort to make things better.

 

Offline blackhole

  • Still not over the rainbow
  • 29
  • Destiny can suck it
    • Black Sphere Studios
Re: 2010 will not be the "GNU/Linux on the desktop" year
Quote
I am braindead, an asshole, and I'm full of crap. Since that has been established, I am therefore not qualified to program anything, and certainly not to discuss such matters. I'll just stop all contribution to whatever is remotely open-source, since there is apparently no way it is ever going to get better. All we do is bull****. All our efforts are worthless. End of discussion.

Now go figure why things progress slowly and why things don't work the way you want while you continue to suck the motivation out of the developers who make the effort to make things better.

You guys wanted to know why Linux wasn't working. I told you. If that makes you depressed and emo, I'm sorry, but that's none of my concern. The issue of "how do we make open-source software not suck" is not the topic of this conversation. You can hate my guts, I don't care; I'm trying to point out what's wrong, not make everyone like me. I will leave the compassion and love for people with souls.

 

Offline General Battuta

  • Poe's Law In Action
  • 214
  • i wonder when my postcount will exceed my iq
Re: 2010 will not be the "GNU/Linux on the desktop" year
Well, I think blackhole did a nice job of taking this thread out of adult discussion territory. I'll leave it open for now, but unless the adults start behaving like adults (i.e. blackhole), it's headed for a lock.

You have a responsibility to present your points in a civilized and respectful manner.

 

Offline Mars

  • I have no originality
  • 211
  • Attempting unreasonable levels of reasonable
Re: 2010 will not be the "GNU/Linux on the desktop" year
 As far as Linux becoming a standard desktop:

Does Chrome OS count?

 

Offline Spicious

  • Master Chief John-158
  • 210
Re: 2010 will not be the "GNU/Linux on the desktop" year
Quote
For that matter, all software in existence, except for possibly Notepad++, OneNote and a few others, sucks horrendously in its own lovely way. All we're doing right now is arguing about which version of Suck we want to use.

I like how you missed a big part of open-source software: being open source means you can improve the things that make you unhappy. But clearly you're too busy being a "PROGRAMMER".

 

Offline Pyro MX

  • 29
  • Frenadian
    • Pyro MX's Art
Re: 2010 will not be the "GNU/Linux on the desktop" year
I think I'll rephrase what I said earlier, since I forgot to put up the [sarcasm] tags. I wasn't acting depressed or emo; I was merely citing your (blackhole's) words and interpreting your argument.

What I was pointing out is that support doesn't come without users, and that is valid for both commercial and non-commercial software. And you certainly won't see it get better by acting that way. Since there are not a lot of us, commercial support is certainly slower to come by (in terms of hardware drivers or proprietary applications), if it ever comes at all. Is that the fault of the community or the OS itself? I doubt it. I think it's more a question of market share.

Now, for the CLI thing and how it can be more than an "open file manager" :P, well, it really depends on how you use your computer. You can use it or not use it at all; it's pretty much up to the user. For myself, I use it to manage pretty much everything system-wide: installation, startup scripts, processes, configuration, etc. For example, it is faster for me to type "eix firefox" to search for a program than to launch a UI like Synaptic or YaST or whatever software you use to manage your programs. That's more a question of taste, but the CLI is very useful for some people.
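For readers who haven't seen `eix` (Gentoo's indexed package search), the whole interaction is one line of shell. A rough stand-in using ordinary tools, where `pkglist.txt` is a made-up file for illustration, not a real Gentoo index:

```shell
#!/bin/sh
# Stand-in for `eix firefox`: search a plain-text package list.
# pkglist.txt is a hypothetical file with one package name per line.
cat > pkglist.txt <<'EOF'
firefox
thunderbird
synaptic
EOF

grep -i 'firefox' pkglist.txt   # prints the matching line: firefox

rm -f pkglist.txt               # clean up the temporary list
```

The real `eix` searches a pre-built binary index, which is why it feels instant compared to opening a GUI front-end.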

 

Offline blackhole

  • Still not over the rainbow
  • 29
  • Destiny can suck it
    • Black Sphere Studios
Re: 2010 will not be the "GNU/Linux on the desktop" year
Quote
I think I'll rephrase what I said earlier, since I forgot to put up the [sarcasm] tags. I wasn't acting depressed or emo; I was merely citing your (blackhole's) words and interpreting your argument.

I know that. I was simply giving an equally absurd sarcastic response to your sarcastic response. But hey, you know what, if we're all just completely misinterpreting one another, how about I just run off before everything actually does explode.

 

Offline Mars

  • I have no originality
  • 211
  • Attempting unreasonable levels of reasonable
Re: 2010 will not be the "GNU/Linux on the desktop" year
As far as Linux becoming a standard desktop:

Does Chrome OS count?

Or we can keep fighting. I personally am curious.

  

Offline Mika

  • 28
Re: 2010 will not be the "GNU/Linux on the desktop" year
Quote
Just because you don't understand the way they work doesn't mean they don't have any direction. You have to take into account that people have been using Windows for years; having a different thing in front of you is certainly uncomfortable for a certain period of time. And it is definitely hard to make the effort.

What I think has actually made Windows programs so successful is that there are cases where one doesn't even need to know much about a program before using it. In the best cases, I have never needed to read the manual or the help file. Compare that to CLI interfaces in the Linux world; no, I don't think the CLI's "ease of use" counts. I used to be quite good at DOS, and I'm wondering what might be so wonderful that I would need to relearn the same thing all over again?
Code: [Select]
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE /NOEMS
DOS=HIGH,UMB
DEVICEHIGH=C:\DOS\MOUSE.COM
DEVICEHIGH=C:\DOS\KEYBOARD.SYS
...
And what a joy that all was!

I'm always a little perplexed about why getting programs running quickly by minimizing human time is so important to programmers (this especially plagues Linux groups). In the long run, it saves maybe one second over a carefully iconized desktop. I can easily run two different ray-tracing programs, CAD software and MATLAB in parallel on the same monitor and access all the necessary stuff from shortcuts and the taskbar. However, I'm all for getting those programs to start fast once I have instructed the computer to do so. Two seconds would be a good time, but the starting time of OpenOffice (and other software like CAD) is counted in minutes rather than seconds. That is what I consider a waste of the user's time.

The bottom line is that the CLI is good for server use, but not very useful anywhere else. It is odd how rarely I have needed to do something like searching for a file from the command line (not at all, in fact) once I got to a GUI. I admit that I have sometimes considered some kind of batch job to convert all the .BMP files in a directory to .PNG, but because there are only a couple of images it hasn't been worth the effort. There the CLI could do it, but then again, a little more energy focused on it would yield an even more useful tool in a GUI.
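For what it's worth, that BMP-to-PNG batch is about the smallest possible CLI job. A minimal sketch, assuming ImageMagick's `convert` is installed (any converter could be dropped in); the filename rewriting is plain POSIX parameter expansion:

```shell
#!/bin/sh
# Convert every .bmp file in the current directory to .png.
for f in *.bmp; do
    [ -e "$f" ] || continue      # no .bmp files present: do nothing
    out="${f%.bmp}.png"          # shot.bmp -> shot.png
    convert "$f" "$out"          # assumes ImageMagick's convert is available
done
```

Whether three lines of shell beats right-clicking two files in a GUI is, as the post says, a matter of how many files there are.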

The bad side of CLI thinking ("no need to do that in the GUI, 'cause we have the CLI!") will swim up and bite the programmer in the ass once he starts to use specialized peripheral devices.

Quote
As for dependencies, you know there's an invention called package managers. While they are not unified in format, they sure manage dependencies well, whether it's a source or binary package manager. And you tend to forget the advantages. First of all, all the applications are maintained in a central place: you remove, add, update and reinstall every component in the same place. No need to have around 5 update managers for every single application. Secondly, if one or more programs share a library, it doesn't have to be reinstalled for each application. Let's take the example of Firefox and Thunderbird. Both rely on xulrunner and probably a couple more libraries. When you install these two programs on a GNU/Whatever distro, both programs share the same library. Install these two on Windows, and you'll install all the libraries twice.

A-ha! I knew it! I think placing all the libraries in the same place amounts to a design fault (worse, in this case it is done on purpose). The system will not be foolproof. One cannot assume the user already has the required library, or that the same library version has not been tampered with or updated. I don't care if 20 KB of extra space is used if it ensures that the program I need to use works. If I wanted to make sure that programs work and that there are no problems because of libraries, I'd do exactly as Microsoft has done.

However, updaters could really utilize Windows' own updater, but for some reason everybody seems to insist on writing their own ridiculously bloated and slow updaters, like the Java Virtual Machine's. Why can't the programs themselves check if there are updates and leave it at that?

Quote
Finally, it's not like Windows is entirely free of "dependencies". While it certainly makes less... visible use of them, applications sometimes require different versions of the .NET Framework. As mentioned above, Freespace relies on OpenAL for sound. Some OSS also installs components like GTK+ or Python (when they're not directly bundled in the app itself). Also note that some software is available in distribution-independent packages (I'm not necessarily talking about compiling the source; some offer binaries with the dependencies inside).

The Windows .NET Framework is really annoying. I wish they had never done that, but this kinda makes my point. I never end up having the right version for that particular software, and this actually led me to delete the ****ing Framework from my HDD and ban all programs that require it from entering my computer. Note that this is only one framework / package / module, whatever you call it. In Linux it usually turns out to be ten times worse.

Was that all I wanted to reply? Guess so for the moment.
« Last Edit: December 01, 2009, 02:09:49 pm by Mika »
Relaxed movement is always more effective than forced movement.

 

Offline castor

  • 29
    • http://www.ffighters.co.uk./home/
Re: 2010 will not be the "GNU/Linux on the desktop" year
Well, I think package management and dependency tracking is one of the strongest points in Linux.
I mean, when I was still using Windows, I just had to accept that sooner or later the install would rot to the point of requiring a reinstall, and not much could be done about it. I hated that; it meant I couldn't actually *use* all the possibilities the Windows GUI offers for personal adjustments and tweaks, not without wasting my time, since I'd need to redo them all again after a while, and then again. Same thing with performance tuning: it takes too much time, considering the tweaks will be lost anyway.

 

Offline CP5670

  • Dr. Evil
  • Global Moderator
  • 212
Re: 2010 will not be the "GNU/Linux on the desktop" year
Quote
Now, for the CLI thing and how it can be used more than an "open file manager" :p, well it really depends on how you use your computer. You can use it or not use it at all, it's pretty much up to the user. For myself, I use it to manage pretty much everything system-wide. Installation, startup scripts, processes, configuration, etc. etc. For example, it will be faster for me to type "eix firefox" to search for a program instead of launching an UI like Synaptic or Yast or whatever software you use to manage your programs. That's more a question of taste, but the CLI is very useful for some people.

I was simply describing what I do myself. I always have a file manager open (the Windows explorer is useless, but that's what we have programs for), which is effectively my desktop, and I can zip around the Windows GUI quickly because of a fairly high mouse dpi setting and the fact that all interface animations are disabled. With that setup, all of the things you mention can be done just as quickly through GUI-based utilities.

I basically agree with what Mika said about this. You might like CLI out of personal preference, but there is very little actual advantage to it these days.

Quote
I mean, when I was still using Windows, I just had to accept that sooner or later the install will rot to the point requiring a reinstall, and not much could be done about it.

I'm on a 5 year old XP install and have no problems. It can work indefinitely as long as you maintain it the right way.

 

Offline Pyro MX

  • 29
  • Frenadian
    • Pyro MX's Art
Re: 2010 will not be the "GNU/Linux on the desktop" year
I don't think I actually said that GUIs should be replaced or dumbed down in favor of the CLI (if that came across, it's not what I've been trying to point out, believe me). I said they shouldn't drop it, that's all. Using GUIs myself most of the time, I don't know what I'd do without them. I still like to manage stuff on the CLI. Is that how it should be done by everybody? No, since as has been pointed out many times, there's a certain paranoia about it.

About dependencies: no, you cannot determine which dependencies each user has. This is why package managers exist: to pull down the necessary dependencies. And yes, it is possible to have more than one version of the same library installed on the same system. How the API changes across versions and how fast the changes are implemented depends on the library you use. But should you use any, keeping up with the APIs is a necessity. Don't like the libraries? Well, build your own; nothing stops you from doing it. Packaging the program well is the job of both the developer and the package maintainer. If you don't like that whole system, well, bundle the dependencies within your application and distribute it that way (I know Firefox, Songbird, Flash and Skype do it that way). Is it more complicated? Well, in my experience it has been easier for me to set up all my stuff using the same centralized service. Developer-side, I have access to way more libraries "out of the box" than I'd have on Windows if I had to develop stuff. Is that the case for all developers, however? Probably not. So which system is better? It depends what you're working with. A bad system in itself? I don't believe so.

I'm not quite sure everybody can use Windows' updater (enlighten me if I'm wrong here). I think the Microsoft Update service is available only to Microsoft and some others who have the right to distribute updates on it (I have sometimes seen some peripheral drivers pop up in it). I'd be really surprised to see Firefox come up in MS Update :lol:. But is Windows' update mechanism itself available to all applications? It would sure help clean up all these updaters.

There's something I'd like to point out too. When you buy a computer from your everyday computer store, it is not designed or built to support OSes other than Windows, so of course there's a risk that the hardware in it might not work. That situation is nearly absent if you buy a Mac, because Apple controls the hardware their OS runs on; it's practically impossible to buy a Mac that doesn't work out of the box. But ask for a computer setup with hardware known to work on Linux? Unless you choose the parts yourself, good luck. And that is a problem. And it's not like there's no hardware out there that is supported by the operating system itself.

Speaking of Skype, I stopped using it recently and switched to Ekiga and a real phone. I don't know what happened at Skype, but they literally left the program without any updates, not even bugfixes, for about a year and a half. I don't know how many people are working on the Linux version of the client or how they do their development, but they certainly did not manage it very well. With any program, especially if you pay for the service, you expect the company to deliver the goods, and Skype simply didn't. They also picked a pretty bad time to leave the app for dead: PulseAudio seemed to make their stuff crash on some distros. Was it the fault of the distributions or of Skype? Difficult to say (I certainly would have waited longer before integrating PulseAudio into distributions), but they didn't seem to put in the effort to work with the community either. As for myself, running on a 64-bit system, I had to use 32-bit versions of the ALSA libraries, which worked, but what annoyed me was the apparent lack of interest from Skype in rolling up their sleeves (which, I think, they have finally done now). It worked, but for what I paid, I got less than the Windows or Mac counterparts. That drove me away from the service. What did I do? I went with a service using the SIP protocol and chose the programs and hardware myself. It works perfectly well, I can communicate with everybody, and I have all the features I need regardless of the operating system I use. Did I ask anybody to make their software open? Did I need to contact support to set up my stuff? No.

I run a couple of commercial programs on my GNU/Linux box: Flash, the Nvidia driver and VP UML (which I no longer use, because I didn't quite like the way it worked regardless of the OS). I also used to run Adobe Reader, which also worked (but I had a better equivalent). The Nvidia driver runs flawlessly, and while I won't brag about the performance of Flash, it also works. When I was using Ubuntu, I got Wine working and played some of my games with it, but when you have a Windows partition for that, well, no need to duplicate the environment :lol:. I also need the Windows platform to test whether my stuff works on both OSes. The point is, it is possible to make good cross-platform software, whether it's a library or a program. WebKit, for example, is used by Apple, Google, KDE and other Web-related applications. OpenGL is used by id Software (and our very own SCP), and they make pretty decent games. More recently, World of Goo was released on the Wii and on PC for Windows/Mac/Linux.

Is commercial support lacking? Sure, there's no denying that. Having Adobe's Creative Suite run on Linux would sure be a blast. Having more games developed on it? Anytime. Having more 3D modeling tools, better drivers, better support? Of course! But that doesn't come by magic, and certainly not by leaving the boat and continuously hammering on the assumption that developing for the GNU/Linux platform is impossible. People complain there's no commercial support of any sort, but it seems way easier to point fingers at the OS itself instead of making an effort. Is it easy to make such a leap? Heck no, it can be freaking hard, especially if what you previously developed relies on libraries that are not cross-platform and you don't have the expertise to do such a thing. But the community cannot do it all by itself. And I certainly won't blame it or the distributions for the generalized lack of commercial support. Are the distributions flawless? No. But there is a need for cooperation between the two, and I don't think that's happening very much.

And finally, I can work indefinitely on my GNU/Linux install as long as I maintain it the right way. Just like I can on Windows XP.

 

Offline Woolie Wool

  • 211
  • Fire main batteries
Re: 2010 will not be the "GNU/Linux on the desktop" year
Package managers would work much better if (a) they were standardized across all distributions (Windows does not have two competing installer formats, each compatible only with certain versions of Windows), and (b) independent developers actually used the damn things instead of releasing everything as source. A lot of people making software for Linux still think it is reasonable to expect users to compile it, and until that is no longer the case, Linux will not have a chance of breaking MS's domination.

Only tinkerers and developers need or care about source releases. The average user does not want to compile your software! He expects to download it, click on it, and watch it go.
« Last Edit: December 01, 2009, 06:07:46 pm by Woolie Wool »
16:46   Quanto   ****, a mosquito somehow managed to bite the side of my palm
16:46   Quanto   it itches like hell
16:46   Woolie   !8ball does Quanto have malaria
16:46   BotenAnna   Woolie: The outlook is good.
16:47   Quanto   D:

"did they use anesthetic when they removed your sense of humor or did you have to weep and struggle like a tiny baby"
--General Battuta

 

Offline Pyro MX

  • 29
  • Frenadian
    • Pyro MX's Art
Re: 2010 will not be the "GNU/Linux on the desktop" year
And just out of curiosity, did you have to rely on compiling almost everything you needed in order to run your system? Or you stumbled upon exceptions?

 

Offline Woolie Wool

  • 211
  • Fire main batteries
Re: 2010 will not be the "GNU/Linux on the desktop" year
Who said anything about simply "running my system"? Who cares if my system "runs" if it can't do what I want it to do without having to compile software, and its dependencies, and its dependencies' dependencies? There's a lot more to a computer than the core system components. I guess if all you want to do is surf the web you won't ever have to touch gcc but I do a lot more than that with my computer.

 

Offline Pyro MX

  • 29
  • Frenadian
    • Pyro MX's Art
Re: 2010 will not be the "GNU/Linux on the desktop" year
That's not what I meant. I asked whether you had to compile a rather large number of applications, or essential stuff without which your system would be non-operational. The reason I ask is simply that I hear this often, often from people who have never touched the operating system, or who used a distro from 5 years ago. So you'll understand that I have a tendency to believe things are often over-generalized on this subject. Pardon my skepticism.

I've been running Linux for about 5 years, and except that I now run a source distribution (which means everything is compiled via a source package manager, nearly the same as with a binary distribution, but from sources), my last 2 years with Ubuntu and my short experience with Fedora required little or no compilation at all. I, too, do much more than browse the Web; it'd be easier to say the only thing I don't do with my box is gaming. Right now on my system, the only thing "compiled by hand" is fs2 open.

 

Offline castor

  • 29
    • http://www.ffighters.co.uk./home/
Re: 2010 will not be the "GNU/Linux on the desktop" year
Quote
I'm on a 5 year old XP install and have no problems. It can work indefinitely as long as you maintain it the right way.
I always thought that software installs on Windows are scary, because you never know what the installer will actually do before trying it out.
And the more software you have installed, the scarier it becomes: if a particular install has a side effect on something already installed, you won't find out until you run the affected software (maybe months after the offending install was done, so you can't even guess what broke it anymore).

I don't know how XP manages these issues, though; maybe it's saner than its predecessors.

 

Offline Mika

  • 28
Re: 2010 will not be the "GNU/Linux on the desktop" year
Quote
I'm on a 5 year old XP install and have no problems. It can work indefinitely as long as you maintain it the right way.

Seconded^3. I have had three computers running Windows XP for five years, two at work and one at home. At work, user support continuously updates stuff (even stuff that is totally unnecessary, like MovieMaker, since I never use it), but the machines still work for years. I don't know what it is that causes the whole thing to break apart for some people and not for others. I would have thought continuous updates would break the thing apart, but that doesn't appear to be the case.

Quote
I'm not quite sure everybody can use Windows' updater (enlighten me if I'm wrong here). I think the Microsoft Update service is available only to Microsoft and some others who have the right to distribute updates on it (I have sometimes seen some peripheral drivers pop up in it). I'd be really surprised to see Firefox come up in MS Update. But is Windows' update mechanism itself available to all applications? It would sure help clean up all these updaters.

Don't bet on it: http://www.microsoft.com/presspass/features/2009/jul09/07-20linuxqa.mspx
I recall having read that Microsoft originally intended the Windows updater to be used this way, as is the case with Windows Installer. I don't know at which point the functionality changed, or whether it has changed at all. I have found that the least annoying updaters in the Windows world are those bundled into the program's own startup, given that updates happen four or five times a year at most. No extra updaters are needed for that stuff.

Quote
Don't like the libraries? Well build your own, nothing stops you from doing it. To package the program well is both the job of the developer and the package maintainer.

Yeah, as I said before, the effect of Linux tends to be that the responsibility of the hardware manufacturer shifts to the software developer. We were not expecting to need to write our own libraries (nor was there money or time allocated for that in the budget), as the hardware vendor was supposed to do that. We expected to be able to work with and further develop the hardware that the library and driver controlled.

In our world, the small saving in hard disk space means nothing if it causes the program not to work.

Quote
Having more 3D modeling tools, better drivers, better support, of course!


Call it a hunch, but I don't expect to see many 3D modelling tools launched for Linux. This is reiterating my first post, but my feeling is that the programmers of such software usually tend to think their software is worth some money, and will not work for free. The General Public License is not always a good thing. This is especially the case in my field: none of the software related to my work is available for Linux.