
Thoughts on (k)ubuntu

  • 06-02-2008 1:11am
    #1
    Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭


    Well having spent a solid week experimenting with kubuntu I have the following thoughts:

    I debated installing linux for a while but was eventually prompted by using my Win2k install for a while and being wholly impressed at how much more responsive it was than XP - it reminded me exactly why I had put off installing XP for so many years. Given that Windows is only getting more bloated with time, and 2k is EOL, I thought I'd give linux a bash and see if it allowed me to ditch windows altogether, since I really don't do enough gaming any more to justify it.

    My expectations of the OS were simple:
    a) It should be stable
    b) it should be faster and leaner than my XP install.
    c) It should be able to give me web, torrents, media playback, and DVD creation without wasting a lot of my time.

    Not a big bunch of demands.


Comments

  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    The Live CD was full of promise. The graphical install was painless, easy to follow and promised good things. Not everything worked, but I didn't expect it to. Enough things worked well for things to have a very encouraging start. I could see myself using Linux every day.

    I'll compare my experience of clean "out of the box" installs in both cases - this means that all I'm comparing is the OSes themselves and not the third-party addons that "complete" the OS and make it do what you want. The pros in this case are that it's pure developer vs. developer and I'm not accepting any excuses for something being below par. If it's below par, it shouldn't be part of the default install. The devs control their OS releases, they have nobody to blame but themselves. The cons are that there'll be whinging that I'm not seeing how "really powerful" OS <blah> is because I'm not using the obvious improvements.

    First Install experience:

    Linux: Nice, simple, clever graphical installer. The installer knew what needed to be done, asked me the questions it needed and got on with its job. Great start. A.

    XP: Text based installer, followed by a graphical installer, followed by a reboot and hardware detection. Not as clever as the linux installer. Tedious start. D

    First usage impressions:

    Linux: First things first - open up the file manager. Causes a crash. Not auspicious. Third party drivers crashing, I can accept. 64bit code I can give allowances for. The built-in file manager crashing mysteriously on a clean install? That's just sloppy. When I do get it up and running, it turns out to be the single worst file manager I've ever had the misfortune of using. Looks pretty and shiny, but fails in the "letting me manage my files" and "finding out where the **** stuff is" aspect of life which I find more of a pressing concern. F.

    XP: Tedious "welcome to windows" crap, but at least it's bypassable, and if I was clueless it at least lets me know how to work the thing. Things seem to do what you expect them to do, and there are no crashes. Boring, but obvious and isn't wasting my time. B+.

    Second usage impressions:
    Linux: No Wifi detected, my USB adaptor is dead to the world. No access to my existing drives. I expected this, so no big deal. Being told that I have to enter "administrator mode" to change the display resolution (clearly a massive security concern) makes me wonder how paranoid the devs really are. Finding that the display resolution doesn't change after I do enter the highly privileged and secure administrator mode (by clicking a button) makes me think someone is taking the piss. Since the OS is even suggesting that it has an alternative closed-source driver I can use for the hardware, I don't buy the explanation that it doesn't know how to work my graphics card and monitor. It knows the model of my wireless mouse, and could even tell me the battery level - if I had sufficient access. Which apparently I don't have, and it can't tell me why, since it seems to think I should. I could look it up online, but I have no Wifi and the built-in help doesn't have anything useful like a troubleshooting section. Since the info I need is in the "Official Docs" on the website, I don't buy that this is because they don't have it written up and I'm pretty sure the text could _just_ about be crammed on a full DVD image.
    D (loses major points for being unnecessarily stupid and wasting my time when it doesn't have any excuse to.)

    XP: No Wifi again, but the drivers are right in front of me so no problem there. (Now that linux is installed, no access to Linux drive, but I expected that too.) Security Center isn't happy I don't have anti-virus installed, but hasn't got any suggestions what might be done about it. Nothing is going majorly wrong so far, but on the other hand everything's still slower than I'd like. B.

    Third Usage Impressions (Fiddling):
    Linux: Let's get installing that third party ATI driver that's part of the official distro to improve performance. Easy install, just click the "make it go" button. Installing again impresses. Unfortunately it Black Screen Of Death's after it reboots and I wipe the partition. At least with a Blue Screen of Death you get a memory dump so Windows can pretend it knows what happened. On the one hand, third party driver, so it might not be the OS's fault. On the other hand, it's included in the standard install and if it's that buggy it shouldn't be there. E.

    XP: Installing the various drivers goes smoothly for the Wifi but again not for the graphics. At least it doesn't trash my whole install, but I do need to remove the driver and make a second attempt at installing. Still not working properly as motherboard drivers aren't properly installed - have to reinstall those first but don't have to redo the whole Windows install. Found some AV software after a bit of googling and forum help. Not obvious if I didn't already know where I could look and MS still isn't about to make any suggestions. D-.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    Follow on:
    Linux:
    GRUB decides that since it can't find the linux partition, it will just freeze up and not let me boot anything else or hand over control to the windows boot loader. Since it knew when it installed, and when it ran normally, that there are two windows installs and another boot loader in the MBR, this is either lazy coding or just pigheadedness - don't take control from the existing OS if you're not prepared to hand it back when something goes wrong. Either way, I call it what it is and give it the finger. NG for being a total cnut.

    2nd installation gets me back to square one. Follow the docs for enabling NTFS and Wifi support, which involve me installing ntfs-3g (easy enough) and going through a very picky command line system for using ndiswrapper. NTFS drives are now showing up, but it isn't mounting them. It also won't mount the USB pen drive I've downloaded all the drivers and wrappers etc onto, even though it knows the mounting points and gives me the option of automatically mounting them. Still no Wifi. A different doc tells me I need a clean install, with the adaptor unplugged, to get that working properly. Off to the re-installer again. C-.
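
    For the record, the "picky command line system" boils down to roughly the following (package names and the .inf filename here are only examples from memory, so the exact steps in the docs may differ a bit):

        sudo apt-get install ndiswrapper-utils ntfs-3g       # tools for wrapping the Windows wifi driver, plus NTFS support
        sudo ndiswrapper -i /media/usbpen/drivers/rt73.inf   # point it at the Windows .inf for the adaptor (example path)
        sudo ndiswrapper -l                                  # check the driver and hardware are recognised
        sudo modprobe ndiswrapper                            # load the kernel module
        sudo ndiswrapper -m                                  # write the modprobe alias so it loads at boot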

    XP: Still too slow. Switching off some of the graphical effects solves that. I decide to leave them on, though, since switching them off makes it look like Win2k and I already have that installed. Everything pootling along fine and while it's not interesting I'm getting what I want done. Decent tradeoff. B.

    Linux 3rd install:
    I go for the text installer this time and while it's not pretty and shiny, it's still simple, clever, and does everything it allegedly needs to first time. The text installer seems to produce a more stable install (no file manager crashes this time, and NTFS drives are working ok with no further effort on my part). I vow to use this rather than the graphical installer in future. It loses points for being unnecessarily confusing about partitioning, and demanding a swap partition rather than just reserving space on the root partition for reasons best known to some mountain dew addict. I don't care where it swaps its files, and if I did, I don't see why it can't give me a choice rather than demanding a new partition. B+

    I go through the longwinded process of getting wifi up and running (and finding that some of the files it wants me to edit don't actually exist in this install), only to find that 64bit support in 3rd party software isn't what I need it to be. I chalk this up to over-ambition and download the 32bit install instead. B.

    Linux 4th install:
    Install goes smoothly and things are looking up. Until I find that Wifi DOES work out of the box if you hit the "make it go" button again and again and ignore what the official docs tell you. To describe me as pissed off would be an understatement. I start downloading all my third party apps from its genius "trusted install" system. Only the "add and remove programs" system tells me it's already running, can't fix itself, and can't tell me what's wrong, other than maybe I've run out of space, permissions, or other unknowns. According to the forums this is because Adept is widely regarded as **** and they should be using Synaptic instead. I try Synaptic and while it works, it's also a very bad UI experience. I can see why the devs want to go with Adept instead, but it would be nice if they "fixed" it first. When your "add/remove software" system is fundamentally broken, then there's something wrong with your focus. E.

    First few 3rd party apps install fine, but I'm not thrilled that I don't get a choice where things are installed. Evidently the OS knows best, although when I run out of space on this partition I'd like to be sure that it won't just tell me "yah boo sucks to be you I ain't installing that. I don't care if you've got another 300 gigs free over there". More importantly, having installed apps (and their officially-approved backend addons) for DVD and media file support, I find that while the apps have DVD playback enabled, they don't actually play DVD's, ever. Or most media files, at least not without at least a couple of re-installs. I decide that the whole "trusted source" system is evidently not a "quality or tested software" resource at all, but more of an "Eh, those guys told us it was fine, whatever." system. This makes it no more useful (in fact less useful) than download.com.

    I unpackage, repackage, compile, install, reboot, reboot again to get Radeon support up and running. Although the NTFS drives are mounting, it still won't show them on the desktop - until it feels like it, which means sometimes they're there, sometimes they're not. Since that's where I want to see them I expect the OS to obey my commands (not a very complicated wish seeing as it's a checkbox in a right-click menu). E.

    At this point I sat back and took stock. While I fully expected to have to do a lot more work to get Linux up and running, and I didn't spend nearly as much time in the terminal compiling **** as I expected, I'm still not happy.
    I quite like using the OS, but I don't appreciate the amount of my time the OS has wasted to no good purpose. (emphasis important)
    I won't accept "Official Docs" where the makers of the OS aren't even sure what's supported in the standard install and what isn't, or where the info is contradictory or hopelessly outdated. They do commercial support, so the info is there somewhere.
    I also won't accept that the helpful faq's they do produce on the website can't be slapped into the help system on the DVD image, given that they know they have problems getting network access up and running - if the hideous bloatfest that is Vista will fit on a DVD, I'm pretty sure a few html files can be squeezed in somewhere. I didn't expect the teary-eyed hand holding you get from a Mac. I don't see any reason why the useful help that could be given out isn't.
    There appears to be no truth, in my experience, that "open source" == "quality source". Maybe it has the potential to be, but potential isn't the same as actuality. The levels of bugginess I encountered were really of the sort that this shouldn't have qualified as RC, never mind Stable Release.

    So the experiment ends tomorrow. I didn't dislike it at all. But it just didn't provide anywhere near the level of payoff, in terms of advantages over what I was using before, to make it worth tolerating even half the buggy, time-wasting crap I did end up dealing with. Maybe in another couple of revisions I'll come back to it. Assuming they fix Adept. And a lot of other things.


  • Registered Users, Registered Users 2 Posts: 590 ✭✭✭bman


    The pros in this case are that it's pure developer vs. developer and I'm not accepting any excuses for something being below par. If it's below par, it shouldn't be part of the default install. The devs control their OS releases, they have nobody to blame but themselves. The cons are that there'll be whinging that I'm not seeing how "really powerful" OS <blah> is because I'm not using the obvious improvements.

    You're not accepting any excuses for something below par!! Well, you'll have to accept them unless you want to go write the code yourself. Ubuntu (or any Linux) developers have no obligation to you at all. People with this attitude shouldn't even be given the privilege of installing the OS, never mind using it.

    By the way, at this stage I've almost given up on reading your "review" but decide to amble on for a bit longer.
    Unfortunately it Black Screen Of Death's after it reboots and I wipe the partition.

    You're a lost cause. Give up now and go back to Windows (not that there's anything wrong with Windows. It's fine for people that cannot think for themselves).
    At least with a Blue Screen of Death you get a memory dump so Windows can pretend it knows what happened.

    There was some problem with the third party driver and you could easily have had a look at the log files by going to a terminal. You don't need a GUI to get to the terminal. So in fact you probably had a very in-depth log file indicating where the problem might be, rather than a load of hex numbers that give you no indication of what's wrong (what you get in Windows).
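
    For example, something along these lines would have shown the errors (Xorg.0.log is the usual log name, though it can vary):

        # Ctrl+Alt+F1 drops you to a text console even when the GUI won't start
        less /var/log/Xorg.0.log     # look for lines marked (EE) - the X errors
        dmesg | tail -n 50           # recent kernel messages, e.g. a module failing to load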

    I didn't read your third post so there's probably loads more crap in there that someone else can address.


  • Registered Users, Registered Users 2 Posts: 2,534 ✭✭✭FruitLover


    You must have some pretty interesting hardware, as I've yet to see Ubuntu crash :confused: Might be a KDE vs Gnome thing.
    I'm not thrilled that I don't get a choice where things are installed

    You do if you install manually.


  • Registered Users, Registered Users 2 Posts: 37,485 ✭✭✭✭Khannie


    Wow. Sounds like you had a ****ty experience. For what it's worth I did this reinstall buzz a few times when I started out with linux. That's just because I didn't know enough.

    The important thing is though:
    If you're installing ubuntu / kubuntu it should work to some basic level of functionality out of the box. Windows is like this. A base install works, but is more or less useless until you install the proper video drivers etc.

    I'll offer you this advice: After installing, get yourself automatix. It installs all the good stuff for you. video drivers, flash, mp3 support, dvd support, java for your web browser, skype, etc. etc. etc.

    For a file manager, I use "thunar". It's actually from the xfce window manager. It's closer to windows explorer and very lightweight. I love it. I've attached a screenshot.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    bman wrote: »
    You're not accepting any excuses for something below par!! Well, you'll have to accept them unless you want to go write the code yourself. Ubuntu (or any Linux) developers have no obligation to you at all. People with this attitude shouldn't even be given the privilege of installing the OS, never mind using it.

    A great attitude to have. Are you a comp. sci student?

    There's a very important thing that I think gets missed out on by nerd culture. Using a computer is not a privilege, or something that anyone should feel grateful for. The computer is a tool. At the fundamental levels of logic, it is a very very brainless tool. People should no more feel "grateful" for the opportunity to use this tool than they should feel grateful for the opportunity to use a hammer, or a power saw. The factories that produce computers are filled with people no more intelligent than the people in the factories that produce power saws.

    As to whether Linux devs have an obligation to me, the end user, I fail to see where you get the idea that they don't. This culture of "Hey, use at your own risk" as far as software development goes is not going to last forever. If they release code, I expect it to do what it says it can do. If it can't, I'm not going to apologise to them, I'm going to tell them their code doesn't work. If they've been lazy, or shoddy, then I'm not under any obligation to spare their blushes. They're not under any obligation to release an OS. I'm not under any obligation to use it. If they want me to use it (and the devs of kubuntu certainly do) then yes, they have an obligation to see that their OS is actually functional.
    By the way, at this stage I've almost given up on reading your "review" but decide to amble on for a bit longer.

    That puts my mind at ease.
    You're a lost cause. Give up now and go back to Windows (not that there's anything wrong with Windows. It's fine for people that cannot think for themselves).

    Thank you for the pointless abuse. Post reported.
    Whether the partition goes missing as a result of a user deleting it, or a hardware error, the result would be the same: If the partition can't be found, GRUB packs up and refuses to let me use one of my other installs. That's simple, fundamental, bad coding. There's no reason why it couldn't pass control of the boot process over to the other loader it knows is there. Why wasn't it coded to do that?
    There was some problem with the third party driver and you could easily have had a look at the log files by going to a terminal. You don't need a GUI to get to the terminal. So in fact you probably had a very in-depth log file indicating where the problem might be, rather than a load of hex numbers that give you no indication of what's wrong (what you get in Windows).

    You assume I didn't know there would be a log file?
    My point was very clearly made - the driver was at fault (exactly the same hardware performed fine under the same OS with a different driver). Windows' BSOD has been a geek joke for a very long time - yet here's a 2007 OS which can't even throw up a BSOD exception screen. And this isn't a random driver I pulled off the web - it's a driver that is part of the default install. If the driver is in that bad a state it shouldn't be part of the install in the first place - the devs had a choice as to whether it was there or not.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    Khannie wrote: »
    Wow. Sounds like you had a ****ty experience. For what it's worth I did this reinstall buzz a few times when I started out with linux. That's just because I didn't know enough.

    I wouldn't describe it as a ****ty experience. I expected it to be a lot more high-maintenance than it was, and I didn't have to do nearly as much work as I expected. My problems with it were all along the lines of poor quality control rather than architectural issues.
    The important thing is though:
    If you're installing ubuntu / kubuntu it should work to some basic level of functionality out of the box. Windows is like this. A base install works, but is more or less useless until you install the proper video drivers etc.

    My point exactly, thank you. There are plenty of distros I could have used that aren't expected to work out of the box. This is one that is presenting itself as a viable, "functional" alternative, and that simply wasn't the case. Neither OS will perform at its best until you fiddle with the install and streamline everything. But both should work out of the box.
    I'll offer you this advice: After installing, get yourself automatix. It installs all the good stuff for you. video drivers, flash, mp3 support, dvd support, java for your web browser, skype, etc. etc. etc.

    For a file manager, I use "thunar". It's actually from the xfce window manager. It's closer to windows explorer and very lightweight. I love it. I've attached a screenshot.

    This is true, but I deliberately steered away from that for the reason I stated above:
    The OS is either "ready to go", or it isn't. While I could well spend another 2 weeks researching and fiddling to get good functionality out of it, the simple truth is that XP was giving me good functionality within 1 day. Maybe in future the ubuntu devs will include automatix and thunar, or synaptic, but the OS has to stand on its own two feet to be judged properly. If anything, the quality of third-party "replacement" apps only serves to show up the shoddiness of the default ones, and that sort of choice reflects badly on the OS.

    I'll repeat: I didn't have a crap experience, and I wouldn't slate Linux as a platform as a result. What I did find was that the base quality of the install was simply not RTM status. I'd regard it as Beta code. That's not to dismiss the work that the coders put into it - it's just an honest reflection of what state the code was in.


  • Registered Users, Registered Users 2 Posts: 2,082 ✭✭✭Tobias Greeshman


    Third Usage Impressions (Fiddling):
    Linux: Let's get installing that third party ATI driver that's part of the official distro to improve performance. Easy install, just click the "make it go" button. Installing again impresses. Unfortunately it Black Screen Of Death's after it reboots and I wipe the partition. At least with a Blue Screen of Death you get a memory dump so Windows can pretend it knows what happened. On the one hand, third party driver, so it might not be the OS's fault. On the other hand, it's included in the standard install and if it's that buggy it shouldn't be there.

    Well ATI support under Linux has always been somewhat troublesome; NVIDIA support has always been much better. What you installed was probably the closed source, third party driver provided by ATI. I do believe it states that it is NOT guaranteed to work and no support is given. Not ubuntu's fault, and it expects the users who install it to have the competence to decide whether to install it or not. The choice is not forced upon you.

    As for the BSOD in windows giving you the memory dump at the time of the crash, well you can see the problem in either the system log or the xorg log file on ubuntu. Both are normally located under /var/log/. Assuming you have the knowledge to make sense of the data?


  • Registered Users, Registered Users 2 Posts: 590 ✭✭✭bman


    A great attitude to have. Are you a comp. sci student?

    No.
    There's a very important thing that I think gets missed out on by nerd culture. Using a computer is not a privilege, or something that anyone should feel grateful for. The computer is a tool. At the fundamental levels of logic, it is a very very brainless tool. People should no more feel "grateful" for the opportunity to use this tool than they should feel grateful for the opportunity to use a hammer, or a power saw. The factories that produce computers are filled with people no more intelligent than the factories that produce power saws.

    Yes, a computer is a tool. But it is a pretty useless tool without the software that developers write to run on it! They are doing it for free so that anyone that wants to use it can. And the fact that you don't feel grateful to them for letting you use the "tool" (which, let's face it, would be of no use to you without an OS) without charging you anything astounds me. Even if you hated the OS and never looked at it again, you still have no reason to blame the people that make it possible.
    As to whether Linux devs have an obligation to me, the end user, I fail to see where you get the idea that they don't. This culture of "Hey, use at your own risk" as far as software development goes is not going to last forever. If they release code, I expect it to do what it says it can do. If it can't, I'm not going to apologise to them, I'm going to tell them their code doesn't work. If they've been lazy, or shoddy, then I'm not under any obligation to spare their blushes. they're not under any obligation to release an OS. I'm not under any obligation to use it. If they want me to use it (and the devs of kubuntu certainly do) then yes they have an obligation to see that their OS is actually functional.

    The stuff they put in there is obviously working for someone or else it wouldn't be in there in the first place, so the problem is that you can't get the software to work.

    Thank you for the pointless abuse. Post reported.
    Whether the partition goes missing as a result of a user deleting it, or a hardware error, the result would be the same:

    The partition very likely did not "go missing". I doubt very much that installing a piece of graphics software messed up your partition tables. Also, in your original post you said you wiped the partition. You did not say the OS wiped it (which I would find very hard to believe anyway).
    If the partition can't be found, GRUB packs up and refuses to let me use one of my other installs. That's simple, fundamental, bad coding. There's no reason why it couldn't pass control of the boot process over to the other loader it knows is there. Why wasn't it coded to do that?

    Also, at this stage there is no "other loader". GRUB is on the MBR, not the Windows boot loader. You should only need one boot loader. At this stage, if you've deleted the Linux partition and you haven't made a separate /boot partition (which you should have done if you had researched what you were at) then GRUB cannot boot because it has no file telling it what to do.
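
    To illustrate: the Windows entry you want GRUB to fall back to is just a stanza in /boot/grub/menu.lst, roughly like the sketch below (the disk/partition numbers are only an example) - and that file lives on the Linux or /boot partition, so once that partition is gone there is nothing left for GRUB to read.

        # /boot/grub/menu.lst (GRUB legacy) - illustrative stanza only
        title        Windows XP
        rootnoverify (hd0,1)        # the partition holding the Windows boot files
        chainloader  +1             # hand control over to the Windows boot sector
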
    You assume I didn't know there would be a log file?
    My point was very clearly made - the driver was at fault (exactly the same hardware performed fine under the same OS with a different driver). Windows' BSOD has been a geek joke for a very long time - yet here's a 2007 OS which can't even throw up a BSOD exception screen. And this isn't a random driver I pulled off the web - it's a driver that is part of the default install. If the driver is in that bad a state it shouldn't be part of the install in the first place - the devs had a choice as to whether it was there or not.

    I am not making a joke about the BSOD. I am simply stating that if you had looked around you would have found the log file. If a similar situation happened in Windows you would have a load of hex to figure out. It is not a joke, it is a fact. I know which I'd rather try to sort out.

    You probably had to configure the driver's config files for your machine. The devs cannot predict what machine you are going to use; you can't be handed everything on a plate.


    Maybe you just had a bad time with Ubuntu. All these issues are solvable. Try something else if you weren't happy with the experience. I suggest PCLinuxOS. I've had two laptops run that and it picked up all hardware first time around on both. Absolutely no setting up to be done on either (bar screen res and simple things like that).


  • Registered Users, Registered Users 2 Posts: 1,227 ✭✭✭stereo_steve


    Khannie wrote: »
    I'll offer you this advice: After installing, get yourself automatix. It installs all the good stuff for you. video drivers, flash, mp3 support, dvd support, java for your web browser, skype, etc. etc. etc.

    Originally Posted by Slutmonkey57b:
    The OS is either "ready to go", or it isn't. While I could well spend another 2 weeks researching and fiddling to get good functionality out of it, the simple truth is that XP was giving me good functionality within 1 day. Maybe in future the ubuntu devs will include automatix and thunar, or synaptic, but the OS has to stand on its own two feet to be judged properly. If anything, the quality of third-party "replacement" apps only serves to show up the shoddiness of the default ones, and that sort of choice reflects badly on the OS.

    Windows doesn't have native support for any of those things listed either. :confused::confused::confused:

    Seriously though, I really dislike KDE and it would have put me off linux if it was the first desktop environment I used. I do like GNOME a lot though and because of this I find ubuntu much much better than Kubuntu. I have gone from a windows-only user to exclusive ubuntu, laptop and PC. Even my mother and sister are using it on the home PC with no complaints.

    In short, give it a bit longer, you will really love it.


  • Closed Accounts Posts: 413 ✭✭sobriquet


    I'll start with the main point I want to make. Do you intend informing the Ubuntu/Kubuntu people of your experience? You don't appear to be looking for specific solutions, and after spending so much time informing random boardsies about it, I can't imagine you'd be opposed to relaying the experience to the relevant people who can do something about it. (Include your hardware configuration if you do.)
    pure developer vs. developer and I'm not accepting any excuses for something being below par. If it's below par, it shouldn't be part of the default install.
    I can see why, and you're perfectly justified in doing so, but I don't think it qualifies as pure developer vs. developer. When it comes to Windows, hardware manufacturers go out of their way to ensure support and compatibility. Linux distributors have to take on the burden of trying as much as they can to make any drivers supplied compatible. As it happens, since AMD bought ATI, they've committed to releasing their specs and allowing complete development of open source drivers. Intel just recently began the same process. The driver situation for both of those will improve considerably as a result.
    Linux: First things first - open up the file manager. Causes a crash. Not auspicious. [...] single worst file manager I've ever had the misfortune of using.
    On this point, why did you go with Kubuntu? It's a 'flavour' of Ubuntu, and I don't know but it might not get as much widespread testing as regular Ubuntu does. You mention later on that Synaptic should come as standard, being more stable than Adept (which I've never heard of) - it does, on regular Ubuntu.
    Linux: Let's get installing that third party ATI driver that's part of the official distro to improve performance. Easy install, just click the "make it go" button. Installing again impresses. Unfortunately it Black Screen Of Death's after it reboots and I wipe the partition.
    Ok, black screen of death? My NVIDIA card, since installing the proprietary drivers, causes a strange problem where on booting it powers off the monitor (i.e., no signal from the GFX card) - it's fixed when I modify a boot option in Grub. (It's been documented and apparently a fix is in the works.) Hideous, yes, but a complete wipe and reinstall isn't called for. (I'm playing STALKER on Windows these days; if I reinstalled every time it BSODed on me...!) I'll point out here that one of the things I like more about using Linux is the recoverability. When something used to go badly wrong on Windows (pre-2k), it meant wholesale reinstalling; Linux invariably offered a less drastic approach. This means terminal usage though, so ymmv.
    unnecessarily confusing about partitioning, and demanding a swap partition rather than just reserving space on the root partition for reasons best known to some mountain dew addict.
    Linux uses a separate partition for its swap space, whereas Windows uses a pagefile on its root. That's the way it's designed, and as far as I can see can't be described as anything other than 'different.' You might want to consider that Linux developers, from corporate-employed kernel hackers to weekend hobbyists, might actually have sound reasons for some of these decisions, before throwing out the ad hominems.
    I start downloading all my third party apps from its genius "trusted install" system. [...] evidently not a "quality or tested software" resource at all
    I think you misunderstand the point of centralized package management. The point is not inherently about trustworthiness or stability. They are obviously going to be high on any distribution's priority list, but the reason for the existence of the repositories and tools like apt is quite simply dependency management.

    As used to be the case on windows with 'DLL hell', trying to resolve interdependencies between packages (programs, libraries, header files, etc) and different versions of packages is troublesome. Systems like apt (or yum, or portage, or the BSD ports trees) are schemes to overcome those problems.
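
    As a rough illustration of what that buys you (the package name here is just an example):

        sudo apt-get update          # refresh the package lists from the repositories
        sudo apt-get install vlc     # apt works out and fetches every library the package needs
        apt-cache depends vlc        # shows the dependency tree apt is resolving for you
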
    First few 3rd party apps install fine, but I'm not thrilled that I don't get a choice where things are installed. Evidently the OS knows best
    As with the swap space issue, Linux is different. Packages are split across different parts of the file system hierarchy - executables, libraries, header files, and content go into different places. This is for historical reasons. It has its problems but I can't say it's ever caused me any trouble. BSD, Solaris and other UNIX-alikes share this historical baggage; Apple ditched it for OSX. Each has their problems and benefits.
    Maybe in future the ubuntu devs will include automatix and thunar, or synaptic, but the OS has to stand on its own two feet to be judged properly. If anything, the quality of third-party "replacement" apps only serves to show up the shoddiness of the default ones, and that sort of choice reflects badly on the OS.
    As stated above, Synaptic is default, though obviously not on Kubuntu. The reason Kubuntu (and Xubuntu, Edubuntu, Medibuntu...) exist is because one man's 'replacement' package is another's sane default.

    You're correct though, it should be judged on its merits. You obviously had a pretty bad experience (certainly an awful lot worse than the last time I installed Ubuntu).

    A friend of mine tried installing it recently on his MacBook, and it went not perfectly but better than yours, to my surprise. Then on his desktop, it broke in ways I didn't think possible. He's given up, and I don't blame him. Yours and his situations are real, and there's a real place for criticism in the Ubuntu/Linux world (though again, it's best directed at the relevant Ubuntu folk). Plenty of us don't want to exist in an echo-chamber.

    Nevertheless, I think you'd be better served by laying off the ad hominem attacks and acknowledging that it's actually pretty unlikely that Linux developers in general are mountain dew addicted paranoiacs who don't care that their software doesn't work with your machine.


  • Registered Users, Registered Users 2 Posts: 1,421 ✭✭✭Steveire


    Are these kinds of threads getting more common?

    The executive summary is that you might fare better if you try again in a week or two with the ubuntu cd, not the kubuntu one. It is better tested (though it could do with auto-installing drivers/codecs/etc.) and has more fallback features (you get a safe mode instead of a Black Screen).

    Maybe even try Linux Mint. I think it's not distributed from the US, so it's not as legally restricted.

    Also you need to ask for help from people. Use IRC for real-time help if you don't have someone who can sit beside you and show you what to do when you hit issues.
    If it's below par, it shouldn't be part of the default install. The devs control their OS releases, they have nobody to blame but themselves.

    This is only true to a point. There are legal reasons that some useful things can't be put on the ubuntu cd.
    On the other hand, it's included in the standard install and if it's that buggy it shouldn't be there.

    It's not in the standard install. It's an additional unsupported driver with a warning.

    I recommend you give up for now and try again with a ubuntu cd in a week or two. I prefer kubuntu myself, but it's not the primary focus of *ubuntu development and testing. I'd give a ubuntu cd to someone instead.

    I also went through a few re-installations after hitting issues because I didn't know how to fix or describe the problem, and it was the easy way out. I've done the same with Windows too. Real-time help will get around that though. In reality there are few reasons you'd actually need to reinstall the OS to get around issues if someone knows what they're doing.

    I experienced the same problem as you with the black screen of death. Luckily I had enough experience already to do ctrl+alt+f1, log in, and edit the driver section of /etc/X11/xorg.conf to use the driver that came with the cd (I don't remember the names of which is which). The advantage of using the ubuntu flavour is that you'd have some kind of fallback (https://wiki.ubuntu.com/BulletProofX) which you could use to fix it instead of a black screen. It's probably like windows safe-mode, though I haven't used it to know.
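
    For what it's worth, the edit in question is just the Driver line in the Device section of /etc/X11/xorg.conf - roughly like this (the Identifier string and driver names below are only examples; "ati" is the open driver, "fglrx" the proprietary one):

        Section "Device"
            Identifier  "ATI Radeon 9800 Pro"   # whatever identifier your install generated
            Driver      "ati"                   # swap back from "fglrx" if the proprietary driver black-screens
        EndSection

    Then restart X (on Kubuntu something like sudo /etc/init.d/kdm restart) rather than rebooting.
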
    D (loses major points for being unnecessarily stupid and wasting my time when it doesn't have any excuse to.)

    I don't really know what the comment refers to exactly. What is stupid?

    Regarding clearing the partition and expecting it still to work, that's not an OS issue either. It's down to how hard drives work as far as I know. You could have installed ubuntu without grub, and used the Windows boot.ini file instead. Then if you removed Windows you wouldn't have been able to use ubuntu. A part of the hard drive points to either the grub files or the boot.ini. Remove either and things can't boot.


    So, once you're over the frustration of it, try again with the ubuntu cd. It gets far far more testing. Even wait until April and try the 8.04 version. Also get help from people who know how to troubleshoot issues that come up. This is also true of Windows. You've probably got a Windows savvy friend that helped out with Windows problems. Or maybe you're that guy already and keeping other people from having to buy a new computer.

    If no one in the know is around and you're using a windows computer, try using firefox with the chatzilla add-on. Connect to the freenode server and join the ubuntu channel and get help there.


  • Registered Users, Registered Users 2 Posts: 354 ✭✭AndrewMc


    Whether the partition goes missing as a result of a user deleting it, or a hardware error, the result would be the same: If the partition can't be found, GRUB packs up and refuses to let me use one of my other installs. That's simple, fundamental, bad coding. There's no reason why it couldn't pass control of the boot process over to the other loader it knows is there. Why wasn't it coded to do that?

    The boot sector's pretty tiny. It just about fits enough code to load GRUB from the boot partition, at which point GRUB then retrieves its OS list from the boot partition also. If you delete the partition, there's nowhere to go. I'm fairly sure the Windows boot loader follows the same basic plan, and it won't get very far if you nuke your C: drive, either.


  • Registered Users, Registered Users 2 Posts: 354 ✭✭AndrewMc


    Khannie wrote: »
    After installing, get yourself automatix.

    Nearly all Ubuntu developers would say not to: http://mjg59.livejournal.com/77440.html


  • Registered Users, Registered Users 2 Posts: 354 ✭✭AndrewMc


    Well having spent a solid week experimenting with kubuntu I have the following thoughts

    Your experience was clearly less than stellar. I'd be genuinely interested if you could take the time to compare kubuntu and “regular” (Gnome) Ubuntu. I'm curious whether you find stuff like the Application Installer, the screen resolution switcher, or the DVD codecs work better (automatically) than those in Kubuntu.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    bman wrote: »
    Yes, a computer is a tool. But it is a pretty useless tool without the software that developers write to run on it! They are doing it for free so that anyone that wants to use it can. And the fact that you don't feel grateful to them for letting you use the "tool" (which, let's face it, would be of no use to you without an OS) without charging you anything astounds me. Even if you hated the OS and never looked at it again, you still have no reason to blame the people that make it possible.

    It isn't a question of "blame". It's a question of "does what they've provided work?" If it doesn't work, then providing it serves no purpose - a broken tool is a broken tool, regardless of whether it's free or not. You shouldn't be expected to thank someone who's handing out hammer handles on street corners on the basis that it's a free hammer. Again this is a cultural issue with software development, where "good enough most of the time" has always been an acceptable state of being. Software is an esoteric art and as such has been able to get away with standards of provision that simply wouldn't be accepted from their colleagues in the hardware world.
    sobriquet wrote: »
    I'll start with the main point I want to make. Do you intend informing the Ubuntu/Kubuntu people of your experience? You don't appear to be looking for specific solutions, and after spending so much time informing random boardsies about it, I can't imagine you'd be opposed to relaying the experience to the relevant people who can do something about it. (Include your hardware configuration if you do.)

    I'm sure they're pretty tied up with coding Hardy, what? :)
    As with the swap space issue, Linux is different. Packages are split across different parts of the file system hierarchy - executables, libraries, header files, and content go into different places. This is for historical reasons. It has its problems but I can't say it's ever caused me any trouble. BSD, Solaris and other UNIX-alikes share this historical baggage; Apple ditched it for OSX. Each has their problems and benefits.

    Oh, I have no doubt the installer is doing its job the way it needs to be done. My interest in that case was more along the lines of wondering why the recommended, or default install system is so closed off to the user. While apt-get and whatever worked nicely for me, I'm not about to start (for example) partitioning a 500 gig drive into separate little sections for /boot /etc /home to suit the filesystem, and that's more a reflection again on me expecting the OS to do the tedious work for me while I use the computer. The install process on kubuntu I found uniformly great - apart from adept problems - but that one issue just caught my interest towards the end, and then more as a "what happens when that 3 gig partition runs out? Will it cop on there's another 3 disks it could be saving stuff on?" sort of question.

    A friend of mine tried installing it recently on his MacBook, and it went not perfectly but better than yours, to my surprise. Then on his desktop, it broke in ways I didn't think possible. He's given up, and I don't blame him. Yours and his situations are real, and there's a real place for criticism in the Ubuntu/Linux world (though again, it's best directed at the relevant Ubuntu folk). Plenty of us don't want to exist in an echo-chamber.

    Nevertheless, I think you'd be better served by laying off the ad hominem attacks and acknowledging that it's actually pretty unlikely that Linux developers in general are mountain dew addicted paranoiacs who don't care that their software doesn't work with your machine.

    Heh. :) I exaggerate for comic effect. On a serious note though, I'm not intending to attack the devs themselves, or the work they've put in. But I do think it's necessary to call something for how it operates, rather than what its intentions are. Good intentions are nice, but at the end of it, Linux has a job to do and it needs to do it well. Very well if it wants to increase its uptake, because it doesn't just have to be "good enough" or "as good as the competition" - it has to be so much better than the competition that going through the hassle of changing is worth it.

    Steveire wrote: »
    Are these kind of threads getting more common?

    I thought the topic deserved a more considered reaction than the "why I'm sticking with XP" thread started off with. :)
    Also you need to ask for help from people. Use IRC for real-time help if you don't have someone who can sit beside you and show you what to do when you hit issues.

    I would have to say my biggest frustration with the install, and the usage experience was simply this:
    A lot of the info, and the fixes I needed, were readily and helpfully available (some of them were unhelpful admittedly!). However, they were all online. Given that the devs know they have problems with Wifi, I honestly cannot make sense of the decision not to take those html files, and slap them on the DVD. They're in a format where they don't even have to write a hand-holding Mac Help Centre app to do it. It's as though they were so unsure of their own OS that they assumed that everyone would be dual-booting and switching OS's to find out what's going wrong. That's probably true, but I found it to be the single worst offence of the whole thing. It was a glaring, shining example of how the devs could have made the whole thing simpler for users, newbies, and themselves, but they chose not to, for no reason I can fathom.
    I don't really know what the comment refers to exactly. What is stupid?

    The above. :) I didn't mind the fixes I had to implement in that case. I minded that they were unnecessary and easily avoided.

    Anyway, all the feedback is appreciated and I'm sure I'll try linux again. In about another two releases' time. :)


  • Registered Users, Registered Users 2 Posts: 590 ✭✭✭bman


    If it doesn't work, then providing it serves no purpose - a broken tool is a broken tool, regardless of whether it's free or not.

    But it does work, you just didn't know how to get it to work, or didn't look up the solution and put in a bit of effort. The outcome you get is affected by the amount of effort you put in. Simply wiping a partition when the graphics driver doesn't work is not effort imo.
    I'm not about to start (for example) partitioning a 500 gig drive into separate little sections for /boot /etc /home to suit the filesystem, and that's more a reflection again on me expecting the OS to do the tedious work for me while I use the computer.

    You only need three separate partitions if you want a more usable system: /boot, / and /home (swap as well, but that isn't the same). And the installer doesn't do this because it is giving you the choice. Some people are happy with just one partition and others want their system to be more failsafe. This is not exactly tedious work. It takes less than a minute.
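
    As a rough example, that kind of layout on a single drive might look like this (sizes and device names are purely illustrative):

        /dev/sda1     200 MB    ext3   /boot    # kernels and the GRUB files
        /dev/sda2       1 GB    swap            # swap space
        /dev/sda3      15 GB    ext3   /        # the OS itself
        /dev/sda4   the rest    ext3   /home    # your data - survives a reinstall or a change of distro
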
    Linux has a job to do and it needs to do it well.

    And it does for many, many people. But you have to extend yourself a bit as well (mostly due to hardware manufacturers not playing nice).
    I thought the topic deserved a more considered reaction than the "why I'm sticking with XP" thread started off with. :)

    Yep, at least you put in the effort of giving an explanation of what went wrong.
    I would have to say my biggest frustration with the install, and the usage experience was simply this:
    A lot of the info, and the fixes I needed, were readily and helpfully available (some of them were unhelpful admittedly!). However, they were all online. Given that the devs know they have problems with Wifi, I honestly cannot make sense of the decision not to take those html files, and slap them on the DVD.

    They could not put help on the DVD for all the different types of Wifi cards. What would be next, all the different types of graphics cards? They are trying to fit as much software on the DVD as they can. I want that software, not help files for all sorts of combinations of hardware, as I'm sure most other users do.


  • Closed Accounts Posts: 12,807 ✭✭✭✭Orion


    Khannie wrote: »
    A base install works, but is more or less useless until you install the proper video drivers etc.
    Use Envy to install video drivers. It even has a text console if X gets screwed.
    Khannie wrote:
    I'll offer you this advice: After installing, get yourself automatix. It installs all the good stuff for you. video drivers, flash, mp3 support, dvd support, java for your web browser, skype, etc. etc. etc.
    I tried automatix myself and didn't like it at all. apt-get or even the gui Synaptic does the job for me.
    Khannie wrote:
    For a file manager, I use "thunar". It's actually from the xfce window manager. It's closer to windows explorer and very lightweight. I love it. I've attached a screenshot.
    Ugggglleeeeeee :p I'll stick to Nautilus thanks :D


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    bman wrote: »
    But it does work, you just didn't know how to get it to work, or didn't look up the solution and put in a bit of effort. The outcome you get is affected by the amount of effort you put in. Simply wiping a partition when the graphics driver doesn't work is not effort imo.

    If showing a Black Screen and making the system unresponsive is your idea of a "working" piece of software, then I despair. For the record, I tried 3 drivers on exactly the same hardware:
    1: Built-in kubuntu X driver.
    2: Standard install fglrx driver.
    3. ATI catalyst 8.1 driver.

    Of these three, Driver #2 did not work. All other parameters were the same. This means that Driver #2 is broken and does not work. This is not a case of PEBKAC. As regards "effort", it takes significantly less effort for the guy who makes the DVD images to replace a broken fglrx driver with a working updated one (swift revision times are supposed to be one of the cornerstones of Open Source), than it does for me to reboot into a working OS, look up the solution online, then spend time fiddling in the boot console to fix the problem.
    You only need three separate partitions if you want a more usable system: /boot, / and /home (swap as well, but that isn't the same). This is not exactly tedious work. It takes less than a minute.

    On the contrary, I feel it is tedious work. Why should I repartition all my existing (dual boot) drives just to suit the OS's file system? The OS is supposed to be working the hardware for me, the drives are formatted in a system it understands, so if it can't bring itself to install onto empty space without me holding its hand, that's not my problem. :) But I did say that this issue was more a case of me not being prepared to do it than a fundamental design flaw.
    They could not put help on the DVD for all the different types of Wifi cards. What would be next, all the different types of graphics cards? They are trying to fit as much software on the DVD as they can. I want that software, not help files for all sorts of combinations of hardware, as I'm sure most other users do.

    If MS can fit the hideous bloated mess that is Vista on a DVD, then I'm pretty sure Ubuntu can stick a couple of megs of HTML on theirs. If it installs fine off a CD, then they've got a spare 4 gigs of space on a DVD to fill. They don't even need docs for every piece of hardware out there - they just need the official help docs and the troubleshooting stickies - that would solve a hell of a lot of newbie questions, and install problems.


  • Registered Users, Registered Users 2 Posts: 37,485 ✭✭✭✭Khannie


    Macros42 wrote: »
    Envy to install video drivers. It even has a text console if X gets screwed.

    Hi. My name is ctrl + alt + F1. :p
    Macros42 wrote: »
    I tried automatix myself and didn't like it at all. apt-get or even the gui Synaptic does the job for me.

    Base install + automatix = functional ubuntu IMO. It's the first thing I do after an install.
    Macros42 wrote: »
    Ugggglleeeeeee :p I'll stick to Nautilus thanks :D

    To each their own. I think Nautilus is lacking. Thunar has some _lovely_ features. Chief among them is multiple file rename (it even allows regex-based renaming). It also allows your own scripts in the right-click menu (I use an rm -rf script for avoiding the trashcan, etc.). Not sure what you find so ugly about it. I actually think that the "classic" Windows Explorer is the best file manager I've used to date. :) Maybe that's where we differ.
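
    The script side of that is tiny - something like the sketch below, hooked into Thunar as a custom action with the selected files passed as arguments (the file name and the %F placeholder are from memory, so check Thunar's custom-actions dialog for the exact syntax):

        #!/bin/sh
        # really-delete.sh - bypass the trash and remove the selected files outright
        # (added via Thunar's "Configure custom actions" dialog with %F as the parameter)
        rm -rf "$@"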


  • Registered Users, Registered Users 2 Posts: 37,485 ✭✭✭✭Khannie


    I haven't read all of the thread, but I did want to add this: there's a growing trend of people trying linux and being a bit less than happy with it, and when they come on here to express their dissatisfaction, there's an equally growing trend of lynching them.

    Slutmonkey: Your experience was a sh1tty one. Plenty of people have had similar ones. I seem to find fixes for things a lot faster these days than I used to and I put it down to experience rather than my being some super reader / super sleuth. I've had a lot of experience using linux now, and a lot of it has been negative (nearly all in the early days), but the last year or so has been superb. My advice: Give 32bit ubuntu (not kubuntu) a lash. Install automatix. Expect trouble with your wifi. Expect trouble with other things. Expect to read a lot. Expect a better computing experience after all of that.

    Good luck.


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    Slutmonkey57b, you say XP didn't run as fast as you'd like - what are the actual specs of the PC?

    I've installed ubuntu on a few of my PCs and didn't really have many problems; I had way more issues when I installed Vista for the first time, but I'm willing to put in the effort.
    Give Ubuntu a go, I'm sure you'll have more success.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    PC specs should be fine for both XP and ubuntu - Athlon64 2800, Radeon 9800 Pro. Not cutting edge hardware, not obsolete, just old enough that driver problems should have been ironed out. In fairness XP isn't slow per se. It's just not nearly as responsive as my 2k install - until I used that again recently I really wouldn't have noticed a slowdown on XP.


  • Closed Accounts Posts: 12,807 ✭✭✭✭Orion


    Slutmonkey57b:
    All I can say is that you have approached this with the completely wrong attitude. You are attempting to compare Windows to Linux when there is no comparison. You are persisting in thinking in MS terms towards an OS that is completely alien. No linux user will reformat/reinstall just because something fails. That's purely an MS user thing. I'm relatively new to Linux myself to put this in perspective.
    If showing a Black Screen and making the system unresponsive is your idea of a "working" piece of software, then I despair.
    I've had the Black screen that you referred to - I simply installed the correct video drivers and all was good again. The system was actually fully responsive - it was just X that wouldn't load. If you'd actually tried to fix it you could have. But your negative approach was to format/reinstall - the age old MS Mantra.
    On the contrary, I feel it is tedious work. Why should I repartition all my existing (dual boot) drives just to suit the OS's file system?
    The tedious work that you refer to actually makes things a hell of a lot simpler when you want to change distro or do a reinstall for whatever reason. You install a new edition of Linux and all your /home data is untouched. Again you're thinking in MS terms where a full reinstall means everything is gone. Separate partitions are just good practice.
    If MS can fit the hideous bloated mess that is Vista on a DVD, then I'm pretty sure Ubuntu can stick a couple of megs of HTML on theirs. If it installs fine off a CD, then they've got a spare 4 gigs of space on a DVD to fill.
    Ubuntu is there for everybody not just to suit you. Have you any idea how much it would cost to download this bloated DVD that you are suggesting in South Africa, for example? Try one quarter of the average wage! If you want a 'bloated' download of Linux then get Debian - 3 DVDs of love for you if you choose that particular download - there's 13Gb of packages to choose from. And seeing as Ubuntu is a fork of Debian I'm sure you'll find something else to complain about.
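
    To illustrate that separate-partitions point: with a layout along these lines in /etc/fstab, a reinstall can reformat / while leaving /home (and everything in it) alone. The device names are examples only, not a recommendation for anyone's actual disks:

        # /etc/fstab - illustrative layout, example devices
        /dev/sda1   /       ext3    defaults   0  1   # root - wiped and reinstalled
        /dev/sda2   /home   ext3    defaults   0  2   # user data - left untouched
        /dev/sda3   none    swap    sw         0  0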


  • Closed Accounts Posts: 12,807 ✭✭✭✭Orion


    Khannie wrote: »
    Hi. My name is ctrl + alt + F1. :p
    Yadda Yadda Yadda :p

    Actually, the time I needed the -t was after an X server update - I didn't uninstall the drivers first. TBH I was pissed when I just hit the update button. The upshot was that X was ****ed. Envy -t + a reinstall of the nVidia drivers = working again.
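
    For anyone following along at home, that recovery route amounts to roughly the following. It's a sketch only - the display manager is gdm on Ubuntu and kdm on Kubuntu, and Envy's text mode is assumed to reinstall the nVidia driver as described above:

        # Ctrl+Alt+F1 drops you to a text console even when X is dead, then:
        sudo /etc/init.d/gdm stop      # stop the display manager (kdm on Kubuntu)
        sudo envy -t                   # Envy's text-mode installer
        sudo /etc/init.d/gdm start     # bring X back up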


    Khannie wrote: »
    Base install + automatix = functional ubuntu IMO. It's the first thing I do after an install.
    Base install + install what I need using apt-get = functional ubuntu. :p And for a Gentoo geek I'd expect you to understand :D
    Khannie wrote: »
    To each their own. I think Nautilus is lacking.
    Nautilus is as basic as hell tbh. But it suits some purposes. Krusader FTW tho. fc put me on to it today - love it already - even in Gnome.
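
    For illustration, the "base install + apt-get" route above amounts to something like the following after a fresh install. The package list is only an example of what "what I need" might look like for the OP's web/torrents/media/DVD wishlist, not a prescription:

        sudo apt-get update
        sudo apt-get install vlc ktorrent k3b krusader build-essential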


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    Macros42 wrote: »
    Slutmonkey57b:
    All I can say is that you have approached this with the completely wrong attitude. You are attempting to compare Windows to Linux when there is no comparison. You are persisting in thinking in MS terms towards an OS that is completely alien. No linux user will reformat/reinstall just because something fails. That's purely an MS user thing. I'm relatively new to Linux myself to put this in perspective.

    For the record, my reformats/reinstalls were done as follows:
    1: Initial install.
    2: Wiped the partition due to not having the time to debug the system, figuring I'd try it again when I did have the time.
    3: Reinstall due to not being able to get into Windows again and attempts to get Wifi working. Helpful docs found in the official docs section of the website recommend a clean install to get wifi working properly due to it being "flaky" in all versions of Linux.
    4: Reinstall of the 32bit version as 64bit support isn't up to standard.

    Reformatting and re-installing is not an "MS user" thing. I've never had to redo a Windows install just to get a USB peripheral working. In fact the only time I've ever had to re-install Windows is when I've been deliberately fiddling with it and breaking it.

    Macros42 wrote: »
    I've had the Black screen that you referred to - I simply installed the correct video drivers and all was good again. The system was actually fully responsive - it was just X that wouldn't load. If you'd actually tried to fix it you could have. But your negative approach was to format/reinstall - the age-old MS mantra.

    I already responded to this. I tried 3 drivers. One of the three didn't work and produced a black screen.

    This means that the driver in question is broken.

    Stop blaming the user just because the driver isn't up to standard.

    You are now telling me that the system was responsive. Were you in the room at the time? I didn't see you. Maybe you were hiding under the desk? I am telling you the system did not respond. I was there. I should fuc.king know.
    Macros42 wrote: »
    The tedious work that you refer to actually makes things a hell of a lot simpler when you want to change distro or do a reinstall for whatever reason. You install a new edition of Linux and all your /home data is untouched. Again you're thinking in MS terms, where a full reinstall means everything is gone. Separate partitions are just good practice.

    I am not thinking in MS terms. I am thinking in "dual boot" terms. What you are proposing is that I set some of my existing drives (which happen to be full of data I need) as Linux partitions to suit its filesystem. The other OS I am booting is perfectly capable of seeing disks as disks - and using whatever space it finds, regardless of whether it is in NTFS or ext3 format. If I want to install something from Windows, I can use spare space on the Linux partition.

    I'll quote you again:
    when you want to change distro or do a reinstall for whatever reason.
    I thought needless re-installing all the time was an "MS User" thing, not a Linux thing.

    Macros42 wrote: »
    Ubuntu is there for everybody, not just to suit you. Have you any idea how much it would cost to download this bloated DVD that you are suggesting in South Africa, for example? Try one quarter of the average wage! If you want a 'bloated' download of Linux then get Debian - 3 DVDs of love for you if you choose that particular download - there's 13GB of packages to choose from. And seeing as Ubuntu is a fork of Debian I'm sure you'll find something else to complain about.

    The Ubuntu CD/DVD image is available free - all over the world. It wouldn't cost a user in South Africa anything to get a copy in the post. That's the point of free, open-source software, isn't it? Note I called Vista bloated, not Ubuntu. My point was that Ubuntu actually installs off a CD. Given the extra 4 gigs of space on a DVD, including a couple of help files isn't a big deal. It's certainly more useful than, for example, including a broken fglrx driver on the DVD image.


  • Registered Users, Registered Users 2 Posts: 37,485 ✭✭✭✭Khannie


    Slutmonkey57b wrote: »
    3: Reinstall due to not being able to get into Windows again and attempts to get Wifi working. Helpful docs found in the official docs section of the website recommend a clean install to get wifi working properly due to it being "flaky" in all versions of Linux.

    For the record: That's just bad documentation.
    Slutmonkey57b wrote: »
    You are now telling me that the system was responsive. Were you in the room at the time? I didn't see you. Maybe you were hiding under the desk? I am telling you the system did not respond. I was there. I should fuc.king know.

    Take it easy there, big fella. Linux is not like Windows. A black screen does not mean that the system is unresponsive. Not in the slightest. No response to ping, no way in over ssh or one of the TTYs, etc. - that's when I would class the box as unresponsive. Many, many times I've had a black screen but a perfectly functional machine. Most of the Linux boxes I use don't have a monitor, keyboard or mouse attached to them at all.
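
    By way of example, the sort of check being described - from a second machine, or from a text console on the box itself. The IP address and username are made up, and the ssh check assumes openssh-server happens to be installed:

        ping -c 3 192.168.1.10       # does the kernel still answer on the network?
        ssh user@192.168.1.10        # can you still log in remotely?
        # ...or locally: Ctrl+Alt+F1 for a text console. If you get a login
        # prompt there, the machine is fine and it's only X that has died.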


  • Closed Accounts Posts: 413 ✭✭sobriquet


    Slutmonkey57b wrote: »
    Again this is a cultural issue with software development, where "good enough most of the time" has always been an acceptable state of being. Software is an esoteric art and as such has been able to get away with standards of provision that simply don't apply for their colleagues in the hardware world.
    You're right, it is, and it's unfortunate, but I honestly don't see what can be done about it. The number of devices out there is huge, and the interactions between them much larger again. If every software producer, be it MS, Sun, Apple or open source developers, waited until they had a failure rate comparable to hardware vendors, little software would ever get released. It's possible, but it's incredibly expensive. Is the only option to not release it?

    The issue is further complicated when, in the case of ATI's drivers for Linux, you're dealing with a binary object that you can't support or debug yourself provided by a company that has no apparent interest in ensuring compatibility. As the distro developer, you have a choice to include or omit that driver with known issues. Omitting it means that people will have graphics, sure, but no acceleration. Including it means putting up with possible major issues for the end user if they opt to install it.
    Slutmonkey57b wrote: »
    I'm sure they're pretty tied up with coding Hardy, what? :)
    Which they're doing in order to improve it. That takes feedback.
    Slutmonkey57b wrote: »
    a reflection again on me expecting the OS to do the tedious work for me while I use the computer. The install process on kubuntu I found uniformly great - apart from adept problems - but that one issue just caught my interest towards the end, and then more as a "what happens when that 3 gig partition runs out? Will it cop on there's another 3 disks it could be saving stuff on?" sort of question.
    That's fair enough. Personally I like to specify what goes where myself, for reasons bman outlined above. But doesn't the Ubuntu install include a 'Just click ok' option that automatically configures your partitions for you? I'm pretty sure it does. As to automatically expanding, it doesn't do that, though many partition formats support growth. It'd be nice though certainly. Does windows do it?
    Slutmonkey57b wrote: »
    A lot of the info, and the fixes I needed, were readily and helpfully available (some of them were unhelpful admittedly!). However, they were all online. Given that the devs know they have problems with Wifi, I honestly cannot make sense of the decision not to take those html files, and slap them on the DVD.
    Again a good point. Not one that would occur to me personally, but then I'm the sort to hit Ctrl-Alt-F1 when I get a video card error. I would say though that if you feel you might do one thing to improve the situation, contact the Ubuntu folk about this point and add your voice to the idea that these docs should be included as standard. (Albeit with a warning that the online docs will be more up to date and correct.)
    Slutmonkey57b wrote: »
    What you are proposing is that I set some of my existing drives (which happen to be full of data I need) as Linux partitions to suit its filesystem. The other OS I am booting is perfectly capable of seeing disks as disks - and using whatever space it finds, regardless of whether it is in NTFS or ext3 format. If I want to install something from Windows, I can use spare space on the Linux partition.
    To clarify, you don't need multiple drives to use multiple partitions. As to Windows using whatever space it finds regardless of format - really? Since when? I've never seen Windows recognize anything other than FAT or NTFS partitions. By 'share space on the Linux partition' do you mean FAT/NTFS space on a disk primarily given over to Linux?
    Khannie wrote: »
    I haven't read all of the thread, but I did want to add this: There's a growing trend of people trying linux and being a bit less than happy with it. When they come on here to express their dissatisfaction there's a growing trend of lynching them.
    I don't quite see it as a growing trend (wouldn't know tbh) but likewise I don't like it. By way of a caveat though, the OP mentioned that his original jabs at the developers were meant as humour but they didn't come across that way. I've found much of it combative.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    sobriquet wrote: »
    You're right, it is, and it's unfortunate, but I honestly don't see what can be done about it. The number of devices out there is huge, and the interactions between them much larger again. If every software producer, be it MS, Sun, Apple or open source developers, waited until they had a failure rate comparable to hardware vendors, little software would ever get released. It's possible, but it's incredibly expensive. Is the only option to not release it?

    Absolutely, the levels of accuracy cannot be as high for software as they are for hardware. But that doesn't mean that the only choice is "buggy and released" or "nothing at all". Think of comparisons between two software packages you've used on any platform to perform the same function (Azureus vs uTorrent is my favourite for obvious reasons) - in every case the superior software is the one which shows the evidence that it has been worked on until it's ready, rather than until the release deadline comes up and the devs go "eh, we'll fix it next time" - only to be overcome with feature creep. I found it telling while using kubuntu that the stable, quality apps uniformly had a horrible UI experience. There seemed to be a law of inverse proportion between the polish of the UI and the stability of the underlying code.
    sobriquet wrote: »
    The issue is further complicated when, in the case of ATI's drivers for Linux, you're dealing with a binary object that you can't support or debug yourself provided by a company that has no apparent interest in ensuring compatibility. As the distro developer, you have a choice to include or omit that driver with known issues. Omitting it means that people will have graphics, sure, but no acceleration. Including it means putting up with possible major issues for the end user if they opt to install it.

    An excellent point. However, both drivers #2 and #3 were closed source and provided by ATI. Therefore a working, closed source, proprietary driver *could* have been provided instead of a broken one.
    sobriquet wrote: »
    To clarify, you don't need multiple drives to use multiple partitions. As to Windows using whatever space it finds regardless of format - really? Since when? I've never seen Windows recognize anything other than FAT or NTFS partitions. By 'share space on the Linux partition' do you mean FAT/NTFS space on a disk primarily given over to Linux?

    Neither XP nor kubuntu would recognise its rival's format "out of the box". Having installed (in both cases free) drivers, both could read and write each other's partitions. What I mean in this case is best described by example.

    Say you have three physical drives. You use drive A as your install drive for both XP and Linux in separate partitions, and you've installed support for both OSs to read each other's file formats. Drives B & C house data that you've brought over from another machine. Drive A runs out of space; drives B & C still have space you can use, but repartitioning them would potentially destroy the mission-critical data they house. You need to download and install Azureus, for all the legitimate Linux ISOs that torrenting was designed for. :) XP will realise there is no space on drive A and will ask you where you want to install it instead. XP does not care which physical device you're using, and isn't limited by its hereditary file system structure. Will Linux expect me to create a partition on one of the other drives and assign it a part of the Linux file structure before it will install? Given that I can't do that, what are the options?
    sobriquet wrote: »
    I don't quite see it as a growing trend (wouldn't know tbh) but likewise I don't like it. By way of a caveat though, the OP mentioned that his original jabs at the developers were meant as humour but they didn't come across that way. I've found much of it combative.

    It's a metaphor for the unending struggle of sudo apt-get install module-assistant build-essential fakeroot dh-make debhelper debconf libstdc++5 linux-headers-$(uname -r) versus the new initiate to the linux experience. :)


  • Closed Accounts Posts: 1,467 ✭✭✭bushy...


    Macros42 wrote: »
    Slutmonkey57b:
    .......... I'm relatively new to Linux myself to put this in perspective.

    If you want a 'bloated' download of Linux then get Debian - 3 DVDs of love for you if you choose that particular download - there's 13Gb of packages to choose from. And seeing as Ubuntu is a fork of Debian I'm sure you'll find something else to complain about.

    Why would you download 13GB of stuff?
    If you have a net connection you just need a netinstall CD (about 170MB). Boot that and the installer will start and get what it needs from the net.
    If you have a slow connection at home , just the first CD will get you a long way.

    If you want to replace a (working) Windows install, just go here and click the button:

    http://goodbye-microsoft.com/


  • Closed Accounts Posts: 413 ✭✭sobriquet


    Slutmonkey57b wrote: »
    But that doesn't mean that the only choice is "buggy and released" or "nothing at all".
    Perfectly true; but the thing is, your "buggy and released" is my "hey, it just works". I know this is going to sound flippant and dismissive, but the fact is that a distribution wants to release the software that people want to see released. When their bug tracker is telling them that, say, few desktop machines but a larger number of laptops are having problems with a specific package, do they withhold it until they've reached some critical point? Given the heterogeneity of hardware out there, where do you pick the cut-off point?

    Of course, this could simply come down to the fact it works on my hardware (and I'd be arguing your side if it didn't ;)) - but I think it holds on its own: picking that cut-off point is hard, and you'll invariably end up leaving (a few|some|a lot of) people out in the cold. The Debian project, for instance, had glacial release cycles for exactly this reason, and ended up frustrating their users. (Then again, they've got about 15,000 packages for each of about 12 architectures, so it's another kettle of fish.)
    Slutmonkey57b wrote: »
    Therefore a working, closed source, proprietary driver *could* have been provided instead of a broken one.
    I don't follow this; how? (I'm predicating this on the 'could' being on the part of the Ubuntu folk, is that right?) This is only possible if the driver provider is bothered to fix and release in a timely manner, and that's not always been the case. So it comes down to my point that it's a hard choice for the distro maintainers - release a driver they can't fix with known problems on some configurations, or omit it and annoy anyone who wants it and may have the correct hardware.

    This is changing, mind you: with the full release of ATI and Intel specs, we'll see fully open source and more maintainable drivers going into the mainline kernel tree over the next few years. And in fairness, I've found NV at least to be solid and stable, so fair play to them for providing drivers for an unpopular platform. Hopefully they'll join the ATI/Intel bandwagon too.
    Slutmonkey57b wrote: »
    Will Linux expect me to create a partition on one of the other drives and assign it a part of the Linux file structure before it will install? Given that I can't do that, what are the options?
    Ok, that's clearer. Short answer: yes, it will expect space to be given over to it. In theory you could install software anywhere so long as there's a driver that has read/write support, so it could be possible with ntfs-3g. You'd force the packages to be installed to wherever that partition was mounted. Whether you could trust it to the extent you could trust the ext3 or reiser etc. drivers, I don't know - I've not used it.
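
    A rough sketch of what that might look like in practice - the device name and mount point are examples, and whether you'd trust ntfs-3g with anything important is the open question above:

        sudo apt-get install ntfs-3g
        sudo mkdir /mnt/windata
        sudo mount -t ntfs-3g /dev/sdb1 /mnt/windata   # read/write NTFS mount
        # then point a directory the big packages live under at it, e.g.
        sudo mkdir -p /mnt/windata/opt
        sudo mount --bind /mnt/windata/opt /opt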

    Thing is though, even if your primary disk is crammed full, and your secondary disk with important data is, say, 75% full, you could repartition it to a 75%/25% NTFS/ext3 split without losing data - it's certainly possible if you wanted to. Obviously not ideal (and formatting gives me the heebie-jeebies), but then under-budgeting for disk space if you're doing anything non-trivial is a losing proposition.
    Slutmonkey57b wrote: »
    sudo apt-get install module-assistant build-essential fakeroot dh-make debhelper debconf libstdc++5 linux-headers-$(uname -r)
    It's funny y'know, I used to be able to rattle out stuff like that, but Ubuntu has domesticated me, I can honestly say I've done little in the way of that over the past couple of years, other than the odd 'apt-cache search foo' 'apt-get install foo'. I mustn't be a Real Linux User.


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 93,582 Mod ✭✭✭✭Capt'n Midnight


    sobriquet wrote:
    Thing is though, even if your primary disk is crammed full, and your secondary disk with important data is, say, 75% full, you could repartition it to a 75%/25% NTFS/ext3 split without losing data - it's certainly possible if you wanted to. Obviously not ideal (and formatting gives me the heebie-jeebies), but then under-budgeting for disk space if you're doing anything non-trivial is a losing proposition.
    Don't forget you can use gparted or qtparted to resize partitions and, to a certain extent, move them - less hassle than format and restore. (You can skip the backup stage if you're brave.)

