Whuut?!

Friday, May 15, 2009

Unified 3rd-party Packaging for Linux

While this topic has been, and still is, debated over and over, I think a few aspects are fairly often missed in the debate.

While the discussions often boil down to dpkg vs. rpm, as a systems administrator and Linux user I have a few practical problems that are really frustrating.

Problem #1 - The Big Fat Lie: "Linux gives you a Single Point of Software Management and Updates"

This is often touted as one of the many advantages of Linux over both Windows and proprietary Unices. And fair enough: in theory, if you manage to stick ONLY to the packages provided by your distro, any popular distro does a decent job of providing a unified software management system.

The problem comes with the cold, hard "real world". In practice, every systems administrator I know is forced to think outside the box now and then, either for proprietary software, or for free software that for some reason is not yet packaged, or is packaged in an old/broken version (QBzr in Jaunty *mumble*). As soon as that happens (usually within 5 minutes of install), the whole software management system starts to crack: there is no longer ONE system in control, but several, and often rogue software outside all control systems.

Problem #2 - Beta-testing is reserved for the "elite"
Again, in theory anyone can let beta testers download beta versions or svn snapshots of their software, and have them install, test, and give feedback.

In practice, this ability is reserved for the top 5% of users with enough skill and time to download, configure, make and install the software (including all dependencies). The other 95% of users, with less time or hacking skills, could probably give valuable feedback and wider testing, but since they can't easily install the software, they can't really test or comment. They get the software roughly 6 months later, when the mainstream distros have caught on, and by then the upstream developers are already working on something else. This creates huge latencies in the feedback-release cycle, which reinforces one of the few bad effects of open source: many programs get written for the 5% elite, while the other 95% of the user base is more or less incidental.

Problem #3 - Gaming on Linux is not for Joe Average (a symptomatic case, not really a problem itself)
It has often been stated that "gaming on Linux sucks". While I mostly disagree with that statement, it holds some merit, but not due to the usual argument that the selection of games for Linux is so limited. While yes, there are more games for "that other platform", the real problem with games on Linux is that for a newbie user they are really hard to find, install and keep updated.

The first step (once you've found a game you want to try out) is to check whether your distro has it. But since none of the big ones come with a really good selection of games, the user will probably have to go to the website of the game and use the game's own, often strange and tedious, install process. This almost always requires (or at least encourages) opening the console at some point for some "terminal magic" (you know: chmod +x, sudo install to /opt, etc.). Then comes the weekly game-balancing patch, which is first discovered when trying to connect to some server (assuming multiplayer), which sometimes requires a new download, and often an even more tedious patch-application process. (Or even worse: auto-download requires all gaming users to have write access to the game's install dir, something the installer certainly did not care about.) This, I believe, more than the selection of games itself (especially if you count Wine), makes Linux games just a bit too challenging to even test for many users.

The problem for the game creators is the same as for all software (except that some games perhaps get even more patching, with new content, rule adjustments etc.).
The difference between games and much other software is that since they change so much, since there are so many of them, since they are rarely prioritized (by, say, Red Hat, Novell or Canonical), and especially since they often require 0-day updates for multiplayer to work, regular distro packaging simply can't keep up.

Requirements of a Solution
The above problems are in no way a complete list of software management issues under Linux, far from it, but they are the ones that hit me most often as a user and administrator, at home and at work.

These problems would not be technically difficult to solve for any particular distro, but for a solution to be effective, it would require a great deal of communication and agreement between distros.

The problem is not really standardized package formats in the usual sense; that whole debate is largely overrated. No one sane would try to take KDE4 from Fedora and jam it into Ubuntu anyway (and I don't think it should be encouraged), but there is a real need for better handling of 3rd-party applications in all distros. These packages include proprietary applications, but also open-source software such as niche applications, games, and other apps where the user simply wants to try the bleeding edge.

Such a 3rd-party package system should (see the sketch after this list):
  1. Be able to safely install, upgrade, revert and remove packages
  2. Support crypto-signatures and trust-networking such that the user can assess the quality and risk of installing a certain package beforehand
  3. Support automatic checking for updates, and notify the user of them in a single, subtle manner.
  4. Be linked to the native package system in such a way that dependencies can be resolved and the GUI can be integrated
  5. Support peer-to-peer (torrents) for large downloads
  6. Work with the LSB for dependency control
  7. Work for multiple architectures, as well as arch-independent packages.
  8. (Critically important, and ultimately where it's likely to fail) Be well-supported by major distros for the network effect to work positively
  9. Preferably be distro-neutral, to avoid distro lock-in.
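To make a few of these requirements concrete, here is a purely hypothetical sketch (as a Python dict) of what a distro-neutral package descriptor might carry. No such format exists today; every field name and URL below is invented for illustration:

```python
# Purely hypothetical descriptor for a distro-neutral 3rd-party package.
# Every field name and URL here is invented for illustration only.
package = {
    "name": "somegame",
    "version": "1.4.2",
    "architectures": ["i386", "amd64", "noarch"],        # requirement 7
    "signature": "-----BEGIN PGP SIGNATURE-----...",     # requirement 2
    "signer_trust": "community/games",                   # trust-network hint
    "update_feed": "http://example.org/somegame/updates.xml",  # requirement 3
    "download": {
        "http": "http://example.org/somegame-1.4.2.pkg",
        "torrent": "http://example.org/somegame-1.4.2.pkg.torrent",  # req. 5
    },
    "depends": ["lsb>=3.2"],   # requirement 6: depend on the LSB, not a distro
}
```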


Right now, the thing that comes closest to this is probably Ubuntu's Personal Package Archives (PPAs), which roughly solve requirements 1, 2, 3, 4, 6 and 7. And since 3rd-party software matters more on the desktop (it is less of a problem for server admins than for average desktop users), the growing popularity of Ubuntu itself may end up solving the last, critical requirement.

So basically, unless everyone sits down and agrees soon, apt/dpkg is likely to become the de-facto standard for 3rd-party software. While this would practically solve the current problems, I would personally be sad if that happens, as such a distro-tied de-facto standard risks some stagnation, and could hinder the diversity that makes Linux so great.

Monday, April 20, 2009

How low can mem go?

I just read How Slow Can Linux Go over at Computer World.

One thing I never get when people discuss underpowered systems and lightweight setups is why RAM usage is so horribly overlooked. Since when is 1 GiB of RAM not plenty? Even many Linux apps completely suck at RAM usage.

Pidgin is an excellent example. What is its purpose? Displaying a list of buddies you can contact, showing availability, history, and some other miscellaneous functions. How much RAM should we expect such an app to use?

Let's break it down:
First: Data
Pixmaps
Pidgin would probably want to RAM-cache your buddy icons. Let's assume we want to support a decent buddy-icon size, 64x64 pixels. Times a 32-bit colour depth (with alpha), we get 16 KiB/icon. Assume we've got 512 buddies (yeah, we're popular), and we get 8 MB of RAM for this.
History Index
We of course want quick access to our history without chewing up the disk. I'm not sure exactly how to assess the requirements here, but let's assume you've got 128K messages in your history, that each index entry takes up 8 bytes, and that the structure consumes twice as much space as needed = 2 MB.
Other
Pidgin probably also needs to map some fonts, some internal structures for availability, some text with your buddies' details, some icons for its own buttons etc., so let's guess that everything else takes as much as the pixmaps and history combined, i.e. 10 MB.
Summary
With the naive guesstimates above, we would expect Pidgin to hold some 20 MB of data in memory. My Pidgin right now consumes 77 MB, almost 4 times that figure. The complete resident set is 33 MB, well over 50% more than the realistic need estimated above.
Also, remember that the estimates above probably err on the large side. The pixmaps could probably be compressed severalfold in RAM (and there certainly aren't 512 buddies in my list), the history index doesn't really need to be in RAM at all, and the other stuff could probably be optimized/compressed down to maybe 4 MB easily.
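For the impatient, here is the same back-of-the-envelope arithmetic as a small Python sketch; all the constants are just the guesstimates from the text above, not measurements:

```python
# Back-of-the-envelope estimate of Pidgin's "fair" data footprint,
# using the guesstimates from the text above.

ICON_SIDE = 64          # buddy icons cached at 64x64 pixels
BYTES_PER_PIXEL = 4     # 32-bit colour with alpha
BUDDIES = 512           # generous buddy-list size
MESSAGES = 128 * 1024   # messages in history
INDEX_ENTRY = 8         # bytes per history-index entry
OVERHEAD = 2            # index structure takes twice the raw space

pixmaps = ICON_SIDE * ICON_SIDE * BYTES_PER_PIXEL * BUDDIES
history = MESSAGES * INDEX_ENTRY * OVERHEAD
other = pixmaps + history   # "everything else" guessed equal to the rest

total = pixmaps + history + other
print(f"pixmaps: {pixmaps / 2**20:.0f} MiB")   # 8 MiB
print(f"history: {history / 2**20:.0f} MiB")   # 2 MiB
print(f"other:   {other / 2**20:.0f} MiB")     # 10 MiB
print(f"total:   {total / 2**20:.0f} MiB")     # 20 MiB
```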

Next: Shared Memory (or the lack thereof)
Here is where Pidgin, like most Linux apps, sadly really fails. Exploring /proc/<pid>/maps, we see a LOT of libs and resources that COULD be shared with other processes. All in all, no less than 50 MB is mapped files. The vast majority of these files are common stuff: libSASL, Xlib, GStreamer, and /usr/share/icons/hicolor/icon-theme.cache. Sadly, however, only a fifth of this memory is marked as shared in top.
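If you want to check this yourself, a rough sketch like the following sums shared versus private memory from /proc/<pid>/smaps (the field names are the standard smaps ones; error handling is omitted):

```python
#!/usr/bin/env python
# Rough sketch: sum how much of a process's mapped memory is actually
# shared, using the standard fields in /proc/<pid>/smaps.
# Usage: ./smaps_share.py <pid>
import sys

def smaps_totals(pid):
    totals = {"Shared_Clean": 0, "Shared_Dirty": 0,
              "Private_Clean": 0, "Private_Dirty": 0}
    with open(f"/proc/{pid}/smaps") as f:
        for line in f:
            key = line.split(":")[0]
            if key in totals:
                totals[key] += int(line.split()[1])  # values are in kB
    return totals

if __name__ == "__main__":
    t = smaps_totals(sys.argv[1])
    shared = t["Shared_Clean"] + t["Shared_Dirty"]
    private = t["Private_Clean"] + t["Private_Dirty"]
    print(f"shared:  {shared} kB")
    print(f"private: {private} kB")
```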

Grand Finale
If there's one thing I have a hard time accepting, it's sloppy coding "because hardware is cheap". I want to buy new hardware to get BETTER performance, not do a hardware upgrade just to match the performance I had before the last software "upgrade".

Thursday, January 15, 2009

On Windows and Hardware-support

I visited my sister the other day. After a nice dinner, when we were sitting down talking, they mentioned not being able to get their TV capture card working correctly. I offered to have a look at it, but warned them that I really, really haven't got much of a clue about Windows (they run WinXP on the machine) these days.

The symptom was that all capture programs kind of worked (they detected "a USB video device"), but none of them could use its built-in tuner. I tried almost everything: reinstalled the program, reinstalled the driver several times, changed USB port, downloaded new drivers and an updated version of the program, read the manual, tested some switches on the hardware, and so on.

Some four hours of troubleshooting later, I found the problem. When installing the driver, I had been ignorant enough to let Windows "search for the driver". I had pointed out extra locations on the CD and in the download directory, and the driver installation guide probably saw those drivers.

Here's the problem: when the driver install guide saw the drivers provided by the manufacturer, it did not find the "Microsoft approved" signature on them. It then silently ignored them and fell back on some internal half-assed driver for some similar device that happened to have a similar video chip but no tuner. No mention whatsoever.

This case gave me two points:
  • Many people have real problems installing even simple USB devices on their Windows machines as well. (People are constantly asking me to help them get their iPod connected, for instance.)
  • Linux, when trying to install drivers, either installs the right driver and it works like magic, or it falls back to a half-working driver and tells you that you're not running an optimal config, or it doesn't manage to install at all.
In the not-at-all case, I'd say the "hand-compile-sing-praise-to-Torvalds-and-then-insmod" method is roughly equivalent to the trial and error I've had in Windows with "Let me select my driver", followed by a list of some 16 similar devices, none of them named like the device I'm trying to install. In both Windows and Linux it's mostly a game of chance.

The one difference is that in Windows, if the hardware is even remotely old (older than 1½ times its warranty), it'd probably save you a lot of time to just give up and buy a new equivalent. In Linux, however, if the hardware is very new, you may have to sit patiently for months before a working driver appears. For most people here in the rich Western world, that actually sucks more than throwing away a webcam just because it's some three years old and I was foolish enough to install XP SP3. I don't know if that says more about Western culture than about Windows vs. Linux, though.

Really, I've had WinXP ask me for USB mouse drivers, with a silent note in the taskbar that it did not recognise my mouse, and I've had to walk all the way through My Computer -> Properties -> Device Manager -> the device -> the install wizard BY KEYBOARD before getting the mouse up and running. (Guess whether I missed the Linux command line that time?) I wonder what would have happened if the keyboard had not been an old, trusty PS/2 one. :S

So, for anyone claiming that "Windows has much better driver support than Linux", I would guess they're turning a blind eye to their Windows-related frustrations, just as I probably am regarding my Linux-related annoyances.

Thursday, December 04, 2008

Ipred

Sigh, mumble, mutter.

I've read a lot of the debate around Ipred, and can only conclude that some contributions to the debate are about as lost as the bill itself.

I think most people agree that we must keep copyright and make sure it is enforced. If you have done a piece of work, you should also get to decide how it is distributed and, if you so wish, demand whatever payment you want for it.

The problem with Ipred is that it doesn't protect the creators, but the rights holders, provided the rights holder has the money and resources to pursue this hunt for file sharers. So Ipred does nothing at all for the artist trying to go her own way to actually make money from her music, since Ipred rather forces the artist into a bigger record label, which from now on is apparently supposed to be the main investigator of copyright infringement around music. That the artist doesn't earn a dime under a record label is hardly news to anyone, but if anyone doubted it, Jonas Almquist gave a pretty good account of the matter: http://www.aftonbladet.se/debatt/article3824576.ab

Besides, one has to ask why the media industry in particular should have special permission for this. If we want to de-anonymize the internet, which of course would not be only negative, then any company should be allowed to request personal data on suspicion of computer intrusion to aid its investigation. But why stop there: the father of a raped teenage daughter really ought to get the name and address of the dirty old man behind "stina16", and for that matter I would myself like the name and phone number of some of the morons spamming my machine with rather lame attempts to break in.

Instead, some seem to think a "file-sharing fee" should be added to internet subscriptions, which would be roughly the same as adding a "speeding fee" to the road tax. I myself torrent quite massively: I often try out new Linux distros, and download some music and film that is freely distributable. Yet hardly any of those creators would see a share of any "broadband tax", so it would be grossly unfair.

If you really want to get at the problem of file sharing, you should instead look at the following:
1. Require the broadband companies to take their abuse departments seriously. Legislate that broadband operators have an obligation to carry complaints through to the end customer.
2. Let the media industry submit its complaints about IP addresses that download or upload material. (And do it at scale, even when only a few films or albums have been downloaded.)
3. Let the broadband provider carry the complaint to the end customer. The provider can also attach information about, for example, how to protect yourself by securing your wireless access point, antivirus software and the like.
4. When the broadband provider receives enough repeated complaints about individual subscribers, those subscribers become pure loss-makers, and the provider can then choose to cut them off (fully or partially, temporarily or permanently), citing violation of Swedish law.
5. The police, instead of spending lots of time chasing individual file sharers, need only spend considerably less time checking that the broadband providers fulfil their obligations.

This way you can easily arrive at legislation that:
1. Does not intrude on the individual.
2. Makes clear, also to younger generations, that copyright actually is law, and that what you do on the internet is visible.
3. Does not entail any direct cost increases for the individual.
4. Makes illegal users a bad deal for the broadband operator.

Of course the broadband providers will think this is a bad idea, since they might lose some customers and will feel that running this imposes large costs. In reality, though, this can be automated quite easily: the media industry mails the provider's abuse address, the mail is then carried through to the actual subscriber holding that IP number at the time, and the ISP needs to do no administration beyond approving new senders (to avoid spam) and tracking the number of accusations per subscriber to judge when it is time to pull the plug.
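Just to show how little machinery this forwarding would take, here is a hypothetical sketch of that step; the whitelist, the address scheme and the lookup function are all invented for illustration, not an existing system (a real ISP would hook the lookup into its RADIUS/DHCP logs):

```python
# Hypothetical sketch of the complaint-forwarding flow described above.
# lookup_subscriber() and all addresses are made up for illustration.
import smtplib
from email.message import EmailMessage

APPROVED_SENDERS = {"complaints@example-rightsholder.com"}  # anti-spam whitelist

def lookup_subscriber(ip, timestamp):
    """Placeholder: map (IP, time) to the subscriber's contact address."""
    return "subscriber@example.net"

def forward_complaint(complaint):
    if complaint["from"] not in APPROVED_SENDERS:
        return  # unapproved sender: drop it
    msg = EmailMessage()
    msg["From"] = "abuse@example-isp.se"
    msg["To"] = lookup_subscriber(complaint["ip"], complaint["timestamp"])
    msg["Subject"] = "Complaint regarding your internet connection"
    msg.set_content(complaint["text"] + "\n\nTips on securing your wireless "
                    "access point: https://example-isp.se/security")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)
```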

Finally, a little tip for all of you who see the media industry, and the music industry in particular, as big and evil, stealing the artists' money and rights: put your money where your mouth is. Stop buying their music, and don't download it either, since that only helps keep it popular. Instead, make sure to consume the legal alternatives that actually respect the artists' wishes, and for God's sake open your wallet and donate a little money to whatever makes you happy and content:

http://www.magnatune.com/ - A record label where you buy the music for whatever you want to pay, and the artists always get half.
http://www.jamendo.com/ - A music site where all the music is under a Creative Commons license; you may always download and copy the music freely (provided you keep the information about who made it). You can donate money directly to the artists, and the site takes only 5% of the donation.
http://www.last.fm/ - A free personal radio station that learns what you like and helps you find great new music.
http://www.getmiro.com/ - Collects lots of free so-called podcasts (series of video clips). Depending on your taste, you can easily get about 20 minutes of entertainment and news from your series daily. It can also fetch series directly from YouTube and other video sites. You'll find SVT's news broadcasts here too, for example.

Tuesday, November 07, 2006

Why the record industry doesn't work.

I recently read a thread in a music forum complaining about allofmp3.com, arguing that the artists are losing big bucks in royalties. Triggered by that statement, I just had to write an answer.

The record industry (primarily the copyright owners) has itself created the environment where allofmp3 can flourish. It is not the artists that make a new CD cost $20+, and they only get a small fraction of that price. Most artists make their living on tours and concerts anyway, while the record industry earns big bucks on the purchased CDs.

Record companies earning money off a CD purchase might be acceptable, since it actually costs a little to produce CDs, but what about music purchased over the net? There the material costs ought to be close to zero: bandwidth today is cheap, and server hardware isn't very expensive either (you'll get a decent server for just over $100). So where in this lies the record companies' indisputable right to their (very BIG) slice of the pie?

Some examples:

http://www.negativland.com/albini.html <- (figures at the bottom for the impatient)
http://entertainment.howstuffworks.com/music-royalties6.htm
http://www.oreillynet.com/digitalmedia/blog/2005/12/

Someone, please give me a service where I can buy an album that is not over two years old for under $10, choose the format and quality I want, and play it on any player I choose, and I'll use it.

Rights owners, be they record companies or the Hollywood giants, have always been quite comfortable where they are, earning millions. They have always been scared of new technologies, such as the cassette tape, the CD recorder, the VCR etc. But somehow the ones embracing the new technologies have always ended up even better off than before. And somehow, DRM has never prevailed.

Monday, December 19, 2005

So Long IE (or the end of an Era)

As Google can confirm, it seems IE is reverting to the platform-centric browser it has really always been. I consider this a very good sign, with a two-fold message.

One message, to all web users, could serve as a wake-up call: your browser is an application in your operating system. It can be chosen actively and knowledgeably, and alternatives should be evaluated. Just as you may spend up to several hours choosing your shoes, maybe you should take a few minutes to look around and see what your options are when it comes to browsers. My guess is IE would not be the result of an educated choice for most users.

However, there is also a message for web designers around the globe: the Internet does NOT equal the web, and the web does NOT equal Internet Explorer. The web is a networked community governed by rules and conventions. Remember that the body behind the web is the surprisingly unknown World Wide Web Consortium, or W3C. A website is a site that actually follows the conventions of the World Wide Web, as defined by the W3C. The fact that a garble of code similar to HTML works in your "Internet Explorer" does NOT make it a website. If you take any kind of pride in your work, take an hour and run the automated and free validation check offered by the W3C. Then skim through the page in at least one decently W3C-compliant cross-platform browser and check that things work the way you intended them to. Then, and only then, feel free to call it web.
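If you prefer to script the check, something like the following should work against the markup validator's documented X-W3C-Validator-* response headers (assuming those headers are still in place; treat this as a sketch, not gospel):

```python
# Quick check of a page against the W3C markup validator, reading the
# documented X-W3C-Validator-Status / X-W3C-Validator-Errors headers.
import urllib.parse
import urllib.request

def w3c_status(url):
    query = urllib.parse.urlencode({"uri": url})
    req = urllib.request.Request(
        f"http://validator.w3.org/check?{query}",
        headers={"User-Agent": "validator-sketch/0.1"})
    with urllib.request.urlopen(req) as resp:
        return (resp.headers.get("X-W3C-Validator-Status"),
                resp.headers.get("X-W3C-Validator-Errors"))

status, errors = w3c_status("http://example.com/")
print(f"status: {status}, errors: {errors}")
```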

Please take these simple messages to heart, and get a clue about what the web is really about.