Showing posts with label not just religion and politics. Show all posts

Sunday, June 12, 2016

Try again, Hulu, Microsoft, whoever you are....

Okay, so I was going to post some gripes a few weeks ago about how Hulu replaced the Favorites list with the Watchlist: now it mixes the shows I want to watch with the shows I no longer want to watch (either because I didn't like them after a few episodes or because the show had ended), all in a non-list format that's harder to take in at a glance.

It seemed I had to click on the shows I did want to watch every few weeks to see if there was an actual new episode, and then dig around to make sure the episode that started playing by default was the earliest unwatched episode, not just the most recent one.  Turns out I only have to read the green flags in the corners of the show icons a little more closely to figure that out--admittedly there's no good way for Hulu to know whether I stop watching a show after ten minutes because I don't like it or because I got interrupted--but defaulting to the latest episode after I've missed three is really bad functionality, and I still have to dig to find out how soon an episode will expire or how long ago it did.  I used to be able to see all that at a glance, Hulu.  I'd understand if you were pelting me with more show suggestions to try to get me to watch more things, the way grocery stores put dairy in the back to maximize the number of people who have to walk past the largest amount of product, but that's not the experience they're giving me.

It also took me a while to figure out how to get the shows I'm done with off my Watchlist, and I'm still wondering about all the defunct shows I never got around to watching that didn't make the transfer.  Maybe I'm getting too old for technology, but I don't think that explains why Primer is still on there while "The Aviators" isn't.

Lately I've seen a similar change in the functionality of the programs I use at work, and it seems to be based on some aesthetic other than the convenience of the user.  Why is it that when I edit a file and go to save it, Windows defaults the save location to the last place I saved a file of that type, instead of to where the file already exists?  This was a problem developers solved in the 1980s.

And, Adobe?  You're guilty, too.  Let me turn off that infernal tool menu that pops up with every PDF I open and takes up a third of my window, and if you're going to bury commonly used functions under five levels of clicking instead of two, let me customize my toolbar and put it there, all right?

Saturday, January 23, 2016

I should never have given my boss my personal cell phone number....

He always could have gotten it from HR, but now it's all "Are you coming in Saturday?" and "Can you be here Sunday at nine AM?"

Granted, he's less demanding of my time than past bosses, but they just demanded long hours and didn't talk about it; now, if I say "No," then I'm the bad guy.

Caveat operator:  don't let them get the idea that you're open to negotiating casual overtime (if you're exempt) or on-call, no-advance-notice scheduling.

I even had to tell my boss once that I refused to work Sundays because I can normally get the work I am personally responsible for done in fewer than six days ("You shouldn't have to work Saturdays," he said; "I know I shouldn't," I replied, "but the way things stand right now, I nevertheless do"), and if the people who "need" me to assist or cover for them a little can't get that taken care of in the 86% of the week I'm in the office, then they need to plan better.  He agreed, but he still asks.

Wednesday, February 04, 2015

I actually don't prefer to focus on recent events...

...but naturally I could not resist this.

A nine year old boy was suspended for bringing "the One Ring" to school and "threatening" to make a friend disappear.
Just let that sink in for a minute.
I could see administrators suspending a child for bringing a gun to school, whether or not he understood the real threat it presented, or even a Ouija board at a Christian school; but a toy replica of a fictional piece of jewelry?  An obviously play-acted "threat" that would have caused no real harm (except to the wearer, if the administrators had the faintest grasp of the One Ring's inherent malevolence)?  Sounds like they're the ones who can't tell fantasy from reality.
This child had been suspended twice before, once for referring to a black classmate as "black" and once for bringing to school a children's book that contained a sketch of a pregnant woman.  The book was for an astronomy unit, but I guess the book Nazi was asleep at the switch when the kid got it and when, presumably, his classmates looked elsewhere in the book and noticed...well, I have no idea whether the illustration seemed risqué or whether there was a depiction of a coherently formed fetus that seemed pro-life or something.
No wonder they're hiding behind some confidentiality policy instead of defending or explaining their actions. Clearly this boy is perfectly normal. It's the adults who have issues. 



Sunday, September 01, 2013

The crowding out of childhood

As Mark Shea says, show me a culture that despises virginity, and I'll show you a culture that despises children.

Case #1:  Montana 14-year-old Cherice Moralez is raped, the first of several times, by a teacher at her school, 35 years her senior.  Once the kind of person who loved life and living, she sees her mood deteriorate along with her grades, and within a month of her 17th birthday, she kills herself.  The judge, asserting that Moralez was as much in control of the situation as her attacker, hands down a sentence of 30 days.  The age of consent in Montana is 16.
I've heard of precocious juveniles accused of heinous crimes being tried as adults, but this is the first time I've heard of it for the victim of a crime.  The purpose of consent laws is in part to protect children, who are not quite as mature as they think they are, from adults who think those children are more mature than they are; in part to deter adults who can be deterred by the force of law; and in part to prosecute adults who can't, who would defend themselves with arguments like "She seemed older/more mature" or "she wanted to."  I wonder what this judge had in mind...you know, having actually gone to law school and all that.
Case #2:  Miley Cyrus performs an obscene reprise of a twerking video she made several months ago, showing the whole world that, just like all those girls before her who used to star in Disney sitcoms, she's not some wholesome tween anymore (if she ever was).
Granted, she hasn't been a teen star in a while, but Hannah Montana is still her claim to fame.  I get that being typecast can kill a career, but it's not like we haven't seen this to varying degrees before (remember Fergie and Britney Spears?).  In fact, working against the relatively wholesome model for the girls of America by doing something outlandish is almost a cottage industry for Disney girls, so it's a sort of typecasting in itself.  Maybe the reason they keep trying harder and pushing the envelope younger and younger is that audiences (the creepy old guys and the media peddlers, anyway) can always say "Yeah, we were titillatingly scandalized back when we forgot that your predecessor had achieved her majority by the time she pulled a stunt like this; what are you going to offer us?"
She didn't even look that good.  She's not an unattractive woman, but the way she was done up?  Lots of skin contrasted with androgynous mid-80s styling (which wasn't even a good look in the 80s) and uncomfortable facial expressions.  What kind of reaction is that supposed to elicit?
Case #3:  A creepy back-to-school ad campaign from some clothing store that will remain nameless because I've blocked it out--the one that shows not just tweens but even kids with single-digit ages worrying about what kind of impression they want to make.  I'm not sure it's bad to depict kids knowing they have some influence over how others perceive them, but it is strange to see kids that young being--no, acting--so savvy, and I am sure it's not good to teach kids that "other people's opinions of me are worth worrying over," even if it weren't just leverage to get kids to manipulate their parents into buying all the clothes the popular kids will be wearing.  Okay, others' opinions are important to the extent that you have to deal with other people, but nine-year-olds creating and projecting an image you can put a label on?  Demanding it on the grounds that it will help them be more successful and likable than their parents were?  Come on.  They should hardly be recognizing it when their older siblings do it.






Tuesday, May 31, 2011

"There’s nothing reasonable about faith based beliefs," the anonymous trendy atheist said. "Faith is the antithesis to reason...."

No, it is not.  Irrationality is the antithesis to reason.  Faith is not the lack of capacity for logic or the willful rejection of rational thought and behavior.  That is not only not the whole of faith, it has nothing to do with faith at all, and not even the most science-paranoid fundamentalist would insist that good Christians should always act contrary to natural thought.

Faith can be described as believing in something without having proof--and it need not be anything so thoughtless as insisting that invisible pink unicorns cause rain or wind.  It can be something as simple as declining to exercise positive skepticism toward a claim for which you lack hard empirical data, when the means by which you acquired the evidence you do have has already demonstrated itself to be reliable and consistent.

Unless it is logical to reject out of hand absolutely everything you personally lack compelling empirical evidence for, our friend will have to admit a closer familiarity with faith than his criticism would lead us to believe.  But it's not logical to do so; we can't afford to verify everything for ourselves, and despite assertions that anyone who wanted to could teach himself quantum chromodynamics or cellular biology or Urdu or medieval law (the line implicitly being drawn at Aquinas's Summa), for some people a lot of that stuff remains every bit as impenetrable as metaphysical topics do to people who have no interest in studying them.

Paul said faith is proof of things unseen--the faithful act on evidence they have that is not outwardly apparent.  This is, understandably, hard to swallow for empiricists and skeptics, but what one should consider is this:  whether it is faith in the supernatural, comfortable self-delusion, or psychosis, what kind of effect does it have on the lives of the faithful?

"Is your god supposedly omnipresent? Yes. Therefore, your god must be part of everything, else he would not be present everywhere."

Not at all.  For someone interested in logic, I'm not impressed with this one's grasp of definitions and meaning.  God being present everywhere and in all things is panentheism.  God being part of all things is pantheism.  The distinction between occupying space (all or none of it) and having mass (a little or none of it)?  Not that obscure.  It would be less inaccurate to say creation is a part of God, but that still leaves a lot to be desired.

"As to choosing between animal and spiritual, there is no evidence for the spiritual. By what basis do you determine what is spiritual? Thru [sic] blind faith, beliefs without evidence. It is that kind of thinking that has led people to fly planes into buildings.
On the other hand, there have been atheists who have worked for the betterment of humanity."

Whoa, slow down.  Spirituality and faith are not the same thing, and it's a long way from "There's more to life than what I can directly sense and measure" or "I'm willing to accept some things I haven't personally verified" to "Those other guys are the enemy and we need to teach them a lesson written in their own blood."  I wouldn't even call having faith or a spiritual life a "kind of thinking" at all--category error.  Maybe that's too fine a point to press against someone I'm criticizing for sloppy thinking.

A philosopher might say that your ability to reason abstractly makes you metaphysically superior to animals, defines a chasm between you and them that they cannot cross. A Christian would say this is because you have a rational soul rather than an animal soul (which you can take for whatever natural phenomenon it is that makes something not-dead as opposed to inanimate, for the sake of the argument).  A historian would say that it wasn't theists who set off humanity's worst genocides all in the last eighty years.

But by all means, remind us that "there have been atheists who have worked for the betterment of humanity."  I don't doubt it, but that's mighty faint praise--it applies to unchurched charity workers and dictatorial mass murderers alike.

When you say "blind faith," you seem to mean "arbitrary and random designation."  That's not the same thing as having no interior experience to guide or motivate us to do or believe something, and it certainly isn't the same thing as having evidence that does not meet your standards for veracity.  I'm not saying you shouldn't have standards--holding evidence up to standards is part of peer review--but they help discern what data are evidence for, as well as whether data are reliable or not.  Anecdotal evidence may have vanishing utility for a physical application, but that should not lead to dismissing anecdotal evidence out of hand for all cases.

Thursday, May 05, 2011

Slightly belated, a more whimsical topic than the heavy one permeating the blogosphere this week....

...although I will point out that Osama bin Laden died on Divine Mercy Sunday (depending on your time zone, anyway).


But anyway, Thursday being the day it has been, I got to talking with some coworkers about a certain creamy condiment, and about a certain similar creamy condiment that claims to be a different food product.

Surprisingly, my coworkers were strident in Miracle Whip's defense.  "It's got a certain...tang to it."

Really?  Put me in a taste test and I think I could tell the difference, maybe even see the difference, but I don't know that I could tell you which one was which.

Maybe I've just never had particularly bland mayonnaise, or despite breathing in corrosive fumes all day long I'm still sensitive enough to spices that the allegedly tamer of the two does not strike me as decisively milder.

Yeah, yeah, maybe it's possible my nose is so burned out I can't taste Miracle Whip either, but it's always been this way for me, before going into industry, before leaving the home of my childhood that was entirely populated by nonsmokers, so I'm shunting that to the bottom of the list of excuses.

I'm thinking maybe it's just a brand loyalty thing, the way some people prefer Pepsi or Coke or RC, but at least none of those brands has the pretense to say "we're not some mere cola!"  They're all colas that merely differ by secondary ingredients, just like how Cherry Coke is still a Coca-Cola and Pepsi Blue is still a Pepsi-Cola.

I've seen and experienced so much variation in mayonnaise that it's really going to take more than branding to tell me a spade ain't a spade.  Ever try aioli?  Farther out than Miracle Whip.

Not that I have anything against Miracle Whip.  I've yet to meet an egg emulsion I haven't liked.

But anyway, just for the record, here's a basic list of the ingredients that mayonnaise and Miracle Whip have in common:


  • water
  • sugar
  • eggs (whole and/or processed yolks)
  • soybean oil
  • vinegar


Dude, that's mayonnaise.  The recipe I use doesn't call for added water, and I leave out the sugar, and I've been using predominantly or exclusively olive oil since before it was hot, but that's your baseline:  egg, oil, vinegar.  The proportions I use are generally 2 eggs to 1 cup of oil to 1 tablespoon of vinegar, plus whatever else I feel like.  Maybe mustard or sesame oil, maybe balsamic or malt vinegar.

Okay, what kind of vinegar do they use?  Probably white, if it's not specified, but whatever.

Here's where the list of ingredients starts to diverge.  First, the ingredients "unique" to Miracle Whip, minus some irrelevant processing items:


  • mustard flour
  • paprika
  • dried garlic
  • spice
  • natural flavor


Keep in mind those last two.

Now, the differing (cough) ingredients in an official mayonnaise--I looked up Hellmann's because it's well known:


  • salt
  • lemon juice
  • natural flavors


"Natural flavors?"  "Spice?"   Okay, lemon juice--it's still a fairly strong acid for a food, but it'll be fruitier than most vinegars.  Garlic?  Maybe, but I wouldn't call it tangier than lemon juice.  Everything else?  It's all sausage to me.  Paprika, mustard (powdered or the condiment that also contains vinegar and turmeric), chile paste, garlic (dried or oil), it's all good.

But to me, it's all mayo.  All different kinds, but it's mayo.

Oh, before I go, a cooking tip:  instead of using butter on the outside of grilled cheese sandwiches or cooking spray for panini, spread a little mayo on the bread.  The oil will keep it from sticking and the egg will crust up nicely, and it can add a little zing to the flavor (or tang, if you choose Miracle Whip instead).

Seriously, it works.  It'll come out looking almost like French toast but you won't regret it.

Thursday, April 07, 2011

Just got home from work.  Stayed late trying to wrap some things up before I take some vacation and then stopped to talk to our second shift chemist for a little while.

Just for the record, I currently (and God willing, not much longer--your prayers are greatly appreciated and fervently requested) work for a third-party lab.  In the broadest strokes, companies that manufacture things send some to us and say "tell us what's in it" or "certify that this will meet whatever requirements our customer has," so they can go to their customer and say "Hey, here's proof from an objective third party that we've got what you need."

In the days before my tenure here, the technical people handled almost every aspect of the job:  not just testing and sending reports, but interacting with the customer to make sure they were sending us what we'd need to give them the answers they needed, quoting prices for complex jobs, even doing troubleshooting.

The chemist was telling me about a strain of manager-types who are prone to making business decisions based largely on their uninformed gut instincts.  He was once given a project that involved some relatively complex testing; he managed to get it done in two days and wanted to charge the customer about $1k for labor and materials.  One of these seat-of-the-pants managers (I can't call them all managers; one currently supervises a single room and a single employee when he's not dealing with his non-leadership responsibilities) with no background in chemistry came through, looked at the chemist's paperwork, and said, "That price seems too high."  She wanted to ask the customer for just a few c-notes.  Instead, the chemist suggested she request quotes for similar work from some of our competitors.  The one she called offered to do it for twice what our chemist figured and said it would take 2-4 weeks.  Two or three hundred bucks wasn't going to cover our expenses, but it "seemed" more in line with...with what, I don't know; obviously not reality.

So that's heinous, but it brings me to my main point:  you can't be a loss leader on everything.  Sam Walton knew he couldn't make Wal*Mart have the lowest prices on every item in the store, but he also knew he'd make more money in the long run if he kept enough inventory cheap enough to bring in shoppers who would decide to buy other things while they were there.

So, what motivates a shopper to go to store A instead of store B?  Let's keep things simple and say A and B are competitors in the same niche and the stores are next door neighbors, so except for shoppers with preexisting loyalty, there's no preference for one over the other.

Then B says "That hundred dollar item A sells?  We'll sell it to you for $90."  Okay, sounds good, right?
But then A says "That special-order item B sells that takes a week to deliver?  We'll overnight it to you for the same price."  Now things are getting interesting--both are attempting to provide more value for the dollar, one by reducing cost and the other by improving service.  To keep things from getting complicated again I'm going to treat all "improve value for the dollar" efforts as just lowering prices.

So B's got that one item at $90.  People tend to shop there to save ten bucks.  What if B had lowered its price to $80?  Would it bring even more shoppers?  Probably; most goods do have at least a little elasticity in their prices.  What if B cut the price in half?  It would probably bring in still more shoppers, but if the managers of B weren't asking economic questions before, they now had better start asking if they're bringing in enough customers to make up the difference.
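That "make up the difference" question has concrete arithmetic behind it.  Here's a minimal sketch (all figures hypothetical, including the $70 unit cost) of how much extra volume a discount has to buy just to keep gross profit flat:

```python
# Break-even unit volume after a price cut (all figures hypothetical).
# To keep total gross profit flat, volume must grow by the ratio of the
# old margin to the new margin, not merely by the size of the discount.

def breakeven_volume(old_price, unit_cost, new_price, old_volume):
    """Units needed at new_price to match gross profit at old_price."""
    old_margin = old_price - unit_cost
    new_margin = new_price - unit_cost
    if new_margin <= 0:
        raise ValueError("Selling at or below cost: no volume recovers the margin.")
    return old_volume * old_margin / new_margin

# Store B sells a $100 item that costs it $70 and moves 100 units.
print(breakeven_volume(100, 70, 90, 100))  # $90 price -> 150.0 units (+50%)
print(breakeven_volume(100, 70, 80, 100))  # $80 price -> 300.0 units (3x)
# A 10% discount demands 50% more sales just to break even; 20% off
# demands triple the sales; half price ($50) is below cost entirely.
```

The asymmetry is the point: a modest-looking discount demands an outsized jump in volume, which is exactly the question the seat-of-the-pants discounter never asks.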

The average customer knows he would be a fool to pass on such ridiculous prices, all things being equal.  The average customer may also wonder how long B was planning on running this half-price promotion or how much the prices of everything else were going to go up to compensate, or how long B's managers expected to stay in business if they continued to pursue business volume at the expense of profitability.  The average customer may wonder, if the heavily discounted item in question were perishable or not inherently valuable enough to come with a warranty, what was wrong with it that B's managers were trying so hard to unload their inventory.

You see it at grocery stores; a few months ago I even got half a gallon of milk at a "manager's special" sale price of $0.69 that was going to expire the next day.  Usually the price is somewhere well north of a dollar for that volume, but for that price I didn't care if it was going to go sour before I got halfway through it.  Ended up lasting nearly a week; go figure.  Another bottle was undrinkable a day before its expiration date.  Guess that's just one more thing that makes this universe an interesting place to live.

For things that aren't liable to going bad before being purchased, though, how does the customer respond to attempts to make a product or service more attractive?

At what price point, then, do patrons of store B start stepping back and saying "this looks too good to be true"?  After that, when business growth starts tapering off, where is the point where customers start saying "There's no way they can do the job right that inexpensively," and label B as merely a cheap store, inexpensive with quality to match, and start shopping elsewhere because they need a better product than B appears to sell?  Where is that point where lowering prices causes you to lose business because you are no longer competing in the market you had been in--in the market you think you're still in?

These questions are not purely rhetorical.  I'm sure some economist has done studies on this topic, and I'd be interested in a treatment by a mind better informed, more well-versed, and clearer in these matters.  I'm actually wondering whether there are usable rules of thumb, or some more concrete formulations, for roughing out a stable range of prices for goods or services offered:  finding the range between where the marginal growth in business volume starts dropping and where it actually becomes negative.  Every situation is going to be different.  But if someone asked me whether a 70% discount "seemed" right, or whether a 45% discount on top of a 40% discount from the quoted price (and that one I've seen happen) seemed like a smart way to draw business, it would, I think, be more diplomatic to say "Well, that seems like it wouldn't quite meet a first-order rendition of Markhov's 80-20 criterion; can you elaborate on your reasoning a little?" than to apply a boot to the head and then ask how many boots to the head they had received before they were able to demonstrate the toddler's level of business acumen I had just witnessed.
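Since I can't resist sketching one such rule of thumb myself:  a textbook toy model assumes demand with a constant price elasticity and scans discounts for the band where profit still beats the undiscounted baseline.  Every number here (the $70 unit cost, the elasticity of 4) is invented for illustration, not taken from any real study:

```python
# Toy model: profit across discounts under constant-elasticity demand.
# Volume scales as (new_price / old_price) ** (-elasticity); all numbers
# are hypothetical, and real demand curves are rarely this tidy.

def profit_at_discount(discount, price=100.0, cost=70.0, volume=100.0,
                       elasticity=4.0):
    """Gross profit after marking the price down by `discount` (0 to 1)."""
    new_price = price * (1 - discount)
    new_volume = volume * (new_price / price) ** (-elasticity)
    return (new_price - cost) * new_volume

baseline = profit_at_discount(0.0)  # no discount: (100 - 70) * 100 = 3000
for pct in range(5, 35, 5):
    p = profit_at_discount(pct / 100)
    verdict = "beats baseline" if p > baseline else "worse than baseline"
    print(f"{pct:2d}% off: profit {p:7.1f}  ({verdict})")
```

In this toy setup, discounts up to roughly 12% grow profit, deeper ones destroy it, and 30% off means selling at cost--which is the shape of an answer to "where does marginal growth go negative?", even if the real curve has to be measured rather than assumed.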

Tuesday, July 27, 2010

"What happened to this generation?"

I've been watching Fringe on Hulu.com.  I've been enjoying it--interesting premise, good enough stories, and characterization the way characterization is supposed to be done:  varying complexity and occasional moral ambiguity without relying too heavily on the "bad people arbitrarily designated protagonists" crutch.  Sometimes there are quantum leaps of deduction used to keep the plot moving in the right direction, but considering they're usually made by the resident lobotomized psychotropic-drug-using genius on a show where such insights can even be plausible, I'm willing to overlook the flashbacks to the early days of sf television.

I didn't mean to dwell on the show.  I just wanted to comment on a scene in an episode from a couple months ago.  The aforementioned genius, Walter, was taking a break from whatever work he was doing in his Harvard lab, and was sitting outside watching students go by and smoking pot with a colleague, Nina.  The following exchange takes place:




Nina:  "I forgot how serious this campus has become.  I remember my time here quite differently."
Walter:  "We did have fun, didn't we?  I don't know what happened to this generation."
...
Walter:  "Look at all these students.  When did they become so afraid?  We had the courage to think against the grain of what we were told; we let our curiosity be our guide."




What happened to this generation?  Walter and Nina did.  Walter's generation looked at the one that invented the bomb and decided to live for today (how that constitutes letting curiosity be their guide, and how what Archimedes and Galileo and da Vinci and Tesla and Einstein did was something else, I can't imagine), and Walter himself went down a path that broke this universe, and the one next to it.  Maybe this generation finally learned that some caution is appropriate in this life after all.

Here's the problem I see.  The "courageous" generation that preceded a "serious, afraid" one rejected what it was told instead of simply playing devil's advocate like an honorable, curious skeptic; but then it turned around and tried to teach its successors that this parochial truth of relativism was the One True Way.  Nina at least seems truer to her principles:  if the truest thing you can do is throw off Truth, then you shouldn't be scandalized when the people you try to teach don't scruple to question, doubt, and reject the things you turned out to be taking as absolute after all.

Never mind about what Walter considers to be courageous.  Maybe our apparent seriousness and fear is just what prudence looks like to him.  Prudence is a virtue.  Too bad the consequences of his lapses in prudence were being shared with everyone--with the fearful and serious students he was watching with Nina, and with everyone and everything else known to exist.

But it's just a TV show.  Maybe I shouldn't think too much about the words that the writers are putting in the characters' mouths.  But that's begging a question.

Friday, June 18, 2010

Pizza

Had some 'za for dinner. Got a cheese pizza from the freezer section at Wal*Mart, having forgotten about the crust and sauce left in my fridge from an earlier attempt, and threw on various toppings.

I got to thinking: I like tangy things on my pizza. It makes drinking cool, often sweet beverages in between bites that much more refreshing. I like green olives (black ones I can't enjoy except in trace amounts, as with a potent spice, and I'm just happy not to have them at all), hot peppers (yellow ones being pungent and sweet and not too hot if I'm not in the mood for recreational oral pain), even sometimes onion and tomatoes (which get nice and tangy when cooked down, or when already sun-dried).

What I didn't think about until just now is "If onion, why not garlic?"  I've had it as part of a white-sauced pizza, like the flatbread equivalent of fettuccine alfredo, but never as an addition to more conventional pizza.  But I didn't start this post just to get sidetracked by a new idea.

What I thought about when I was rounding out my pizza fixings was this: "If olive, and tomato, and pepper...why not pickle?"

It's tangy, salty. Should blend right in with olives and pepperoni and cheese, right?

Well, no twist from me tonight: it actually worked quite fine. I approve of dill pickle slices on pizza. I may be trying other kinds in the future.

Your mileage may vary. Let me know what you think if you try it.

Tuesday, December 08, 2009

I learn a lot about good and bad driving by observing the mistakes of others.  A lot of it comes down to apparent weaknesses in my new state of residence's driving laws and standards for instruction, so I have plenty of opportunities to learn how much humility and patience I still need to develop, but it also gives me things to watch out for in case I'm ever a lot closer to the action.

I also learn a lot by being a stupid driver myself. It's a lot easier to see what my weak areas are than to try to imagine them because my commutes are so mundane that my margins of safety are hardly touched.

I think I learn the most, though, when I see other drivers react badly to my mistakes. The funniest one was when I lost control on some black ice and slid off the road. Did some minor damage to the exhaust system. While I was inspecting it, my dad called the police. A cop came out, seemed satisfied that I'd done everything I could and should have done so he didn't give me a ticket, but then asked me to get in the back of his squad car so we could get the accident report taken care of out of the weather. He wanted to get through it quickly, he said, before some rubberneckers came by and hit something. Sure enough, while I'm looking over the form, a car drives the other way slowly with the driver just staring at me in the back seat, and another car does the same thing, only not quite as slowly...*bang* I laughed, the cop uttered a mild expletive, and asked me to wait while he got the other two cars situated. Then he came back to finish with me, call out another car to deal with the two cars that collided, and called the city to get a brine truck out. Man, I still laugh about that.

There were two other very similar incidents, involving a reaction I've realized I'm also guilty of, though I still don't quite understand why it seems natural for people to do it.  On different occasions, I have been backing out of a parking space when someone drove behind me, either because I didn't check a blind spot or because they came around a nearby corner and feared I wouldn't look in time to avoid hitting them.  Maybe I would have, maybe not; usually I catch that kind of thing, but I appreciate being honked at so I can stop immediately and reassess my surroundings.

On the two occasions I have in mind, though, the cars honked to get my attention and then stopped right behind me.  When I looked around, I just saw a car there, the driver watching to see what I would do.  What do you want, a contrite gesture in my rear-view mirror?  I'd be happy to oblige, but my windows are tinted.  Waiting to see if I've noticed you yet?  Either stop before you get in my way or try to scoot past, whatever it takes to avoid a collision.  If you're choosing between letting me slowly back into you and running over a pedestrian up ahead of you, hey, good choice; but dinging a fender or bumper is much less significant than crumpling a door and possibly the person sitting on the other side of it.

I can't say it's not a natural reaction. My instinct is also to try to stop to reduce the variables I have to assess in order to safely defuse a situation. But man, staying right in the path of a moving car is just bad news.

Monday, October 05, 2009

Remember when they started doing away with pop-up ads because everyone hated how their desktops got so cluttered?

I do.

Yet, somehow, they're back--and now they're worse.

They're embedded in the page you're reading. They pop up when you hover over or move across a link. Often, they don't go away when you move your mouse. It seems that, sometimes, they're not supposed to.


Please, web designers: knock it off.

Thursday, September 03, 2009

Since I don't have to talk exclusively about religion and politics here...

I found some blog a while ago, long enough ago that I don't remember whose it was, that was talking about Macs and PCs. The thing I recall was a comment that went something like "Macs are okay, once you get past the self-important posing and puffery, but when you really want to get down to computing, you'll buy or build a PC."

Really?

No, I'm sorry, but "reinstalling drivers" is not the same thing as "getting down to computing." I hope that $300 you're saving by getting an Inspiron at Best Buy is worth the downtime.

Friday, July 27, 2007

Our disposable economy

I was at the municipal airport some time ago and heard one of the employees--maybe the terminal manager, for all I know--talking with the local flight instructor. He just happened to be complimenting the instructor on the decent new chairs he got for his office. The last thing I heard him say on the subject was about how they were only $15 so if they broke, replacing them would be no great task.

Naturally a year or two later--not so long that I couldn't remember overhearing that conversation well enough to think it would be funny if my life were a movie--I broke one of them, and it made me think of all the other things I own, or even just see around, that aren't meant to be repaired but simply discarded or maybe recycled. I've been told in the past that it's tantamount to a conspiracy among manufacturers to produce things that are prone to breaking and designed to be unrepairable by the average (or even the average mechanically inclined) person, so that they may be sure of selling more of the same things in the years to come.

Some manufacturers probably do subscribe to that philosophy, but I don't think we have to worry about the Shoe Event Horizon anytime soon.

First of all, it's not entirely true. Some of the more durable goods do tend to be pricier and harder to find than the cheaper ones, and they may even be harder to repair when something does go wrong, but there are still things that work more reliably than their predecessors. Some of the Total Quality Management initiatives even rise above their own bureaucracies and enable real improvements in product quality by demanding better documentation of manufacturing processes and by standardizing methodologies, which makes systemic problems easier to recognize--and recognized problems are vastly easier to solve than invisible ones.

Automobiles may be the quintessential example. I was going to say personal computers, but I think that fully integrated, non-user-serviceable configurations will become more popular, especially since the available computing power in more conventional tower designs, which are better suited to low-level tinkering, is really starting to get beyond what most users will need. Automobiles have become much more sophisticated, in some ways more difficult to work on, but as long as there are gasoline engines and gearboxes and other powered moving parts in a motor vehicle, some home maintenance may always be possible. Since so much of the car is electronically managed, there are many things that just aren't within the reach of a professional or amateur mechanic, but diagnostic computers can make the job easier when there is something for a human to do. Further, cars are lasting much longer; it's now expected for a car to operate beyond a hundred thousand miles, to the point where some manufacturers even offer six-figure mileage warranties, and some cars have been recorded running several hundred thousand. Fifty years ago, when people could easily find TV repairmen and cobblers in their home towns, they could also find mechanics, but few or none who could work the magic it would take for a typical car of that time to last until the odometer rolled over.

Cars aren't simply better built; they're more complex, with much more going on under the hood and behind the dash than there was even when I was a child. It's barely adequate for a modern mechanic to rely on the wrench in his hand and the knowledge in his head; it's just more sophisticated work than it used to be, even with computer assistance. I think what's true of mechanics is also true of, or at least thematically consistent with, my main point.

Most manufacturing today is automated to some degree. You'll need a few engineers and technicians to baby-sit a plant, and some trained but possibly unskilled people to operate the equipment, but compared with fifty years ago, you need more gearheads and fewer laborers. There's a net cost saving in manufacturing because the payroll shrank enough to offset the cost of educating the engineers, but what about repair and maintenance?

Well, that kind of work is still labor intensive, and whether or not a toaster or a radio is made to last, even if you can get it open, it takes a lot more than a screwdriver and persistence to get it working again. Computer repair might be more of a cottage industry than traditional repair services, but even then, what usually happens is that a component is replaced and then thrown out or recycled for materials. Circuit boards and such are so touchy that trying to manually replace diodes or capacitors--if the malfunctioning ones can even be identified by your average soldering-gun-wielding citizen--is most likely to multiply the problems.

The result is that labor, on average, is more expensive than it used to be--in fact, by and large too expensive to spend on small appliances, even if they were designed to be accessible to curious gearheads. Even if they still used the older and more durable designs, it'd really be less expensive to replace something than to hire someone to take it apart, diagnose it, and install replacement parts (as opposed to just buying the part, if you're willing and able to do it all yourself at home).

I do occasionally see a durable good designed for recycling--choice of easily reused or compostable materials, ease of dismantling and sorting into material types--but that still seems to be the exception. Or maybe the recycling outfits hired by cities haven't yet caught up with green cradle-to-grave product engineering.