|Industry needs an agreed-upon set of BoM characteristics or fields.
The International Electronics Manufacturing Initiative white paper The Perfect BoM
Today’s WhatIs.com Word of the Day is bill of materials (BoM). It’s basically a recipe for a product. A small company with a simple product — like a bookcase — might use an Excel spreadsheet to create its BoM. A larger company with a more complex product — an automobile, for instance — needs a dedicated BoM software application.
The International Electronics Manufacturing Initiative is trying to promote the idea that BoMs should be standardized. Note the word “International” in the group’s name; it’s a recent addition.
To be “perfect,” the BoM should include everything that goes into the product, from raw
materials such as wire, tape and solder paste, to the box that will be used to ship the product.
It should make parent-child relationships clear, differentiating between components and materials that are part of a sub-assembly versus the overall assembly.
For example, information about programmed parts is typically structured differently from BoM to BoM, and is often open to interpretation. The Perfect BoM should include blank parts as well as the software required to program the blanks, indicating the relationship between components and ensuring that all necessary parts and data are provided.
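To make the parent-child idea concrete, here is a minimal sketch in Python of a BoM tree that lists a blank part and its firmware side by side, then rolls the tree up into total part counts. The part numbers, quantities and the `flatten` helper are all invented for illustration; real BoM software tracks far more fields (reference designators, approved vendors, revisions and so on).

```python
# Hypothetical sketch of a BoM with parent-child (sub-assembly) structure.
# All part numbers and quantities are made up for illustration.
from dataclasses import dataclass, field

@dataclass
class BomItem:
    part_number: str
    description: str
    quantity: int = 1                          # quantity per one parent
    children: list["BomItem"] = field(default_factory=list)

def flatten(item: BomItem, multiplier: int = 1) -> dict[str, int]:
    """Roll the tree up into total leaf-part counts, multiplying quantities down the levels."""
    totals: dict[str, int] = {}
    qty = item.quantity * multiplier
    if not item.children:                      # leaf: a component, material or software image
        totals[item.part_number] = qty
    for child in item.children:                # sub-assembly: recurse with the parent's quantity
        for pn, n in flatten(child, qty).items():
            totals[pn] = totals.get(pn, 0) + n
    return totals

board = BomItem("PCBA-1", "Controller board", 2, [
    BomItem("U1-BLANK", "Blank microcontroller", 1),
    BomItem("FW-1.0", "Firmware image for U1", 1),   # software listed alongside the blank part
])
product = BomItem("PROD-1", "Finished product", 1, [board, BomItem("BOX-1", "Shipping box", 1)])
print(flatten(product))   # {'U1-BLANK': 2, 'FW-1.0': 2, 'BOX-1': 1}
```

Because the board is a sub-assembly with quantity 2, the roll-up correctly doubles everything beneath it — which is exactly the parent-child clarity the white paper is asking for.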
|It’s important to note that the gap analysis is not a one-time activity. Each organization should execute a gap analysis of its cybersecurity approximately once per year and draw upon the results to adjust cybersecurity activities to meet new regulatory or compliance requirements or simply the growth of the organization and its supporting information technology infrastructure.
From the book Cyberwar, Cyberterror, Cybercrime by Julie Mehan
Today’s Word of the Day is gap analysis. I think Dr. Mehan is the only expert I’ve read who says “do a gap analysis once a year.” I love it.
General Motors and Segway unveiled the PUMA (Personal Urban Mobility & Accessibility) at a press event before the 2009 New York International Auto Show. Very cool!
[kml_flashembed movie="http://www.youtube.com/v/UIrAlPFb8RY" width="425" height="350" wmode="transparent" /]
[kml_flashembed movie="http://www.youtube.com/v/qY4msj5Q05Q" width="425" height="350" wmode="transparent" /]
[kml_flashembed movie="http://www.youtube.com/v/bowXU9gvAN4" width="425" height="350" wmode="transparent" /]
|“What I find in software projects is something called the invisible line. At start of the project, we’re all working for collective success. But then a line is crossed somewhere in the project and I’m no longer working for collective success. I’m working for the avoidance of individual blame.”
Ken Thompson in an interview with Robert Scoble
Today’s WhatIs.com Word of the Day is bioteam. Software engineer Ken Thompson came up with the concept after looking at how biological teams in nature communicate to achieve goals.
There’s no blamestorming in a beehive — no politics in a pelican flock.
When you see Canada geese migrating north, you’re looking at a little P2P network. The geese take turns leading the V formation because distributed leadership gets them further than following just one alpha goose. Each honking node on the network has the power to be a client and a server.
The flock itself is a living organism, just like the individual geese that make it up. The members stay in constant communication, doing whatever needs to be done to keep the flock healthy. There are no job descriptions in a flock. Each goose says to the flock, “You need someone to honk? I can honk.” or “What, now you need me to lead? Sure, I’ll lead.” or “I think somebody needs to stay behind with old grandfather goose — I’ll do it.”
It sounds very efficient. Why can’t we humans be that efficient?
As Ken points out in his quote above, we usually start out software projects with good intentions. Do we end up in survival mode just because total success is so very rare? Is every project like the beginning of tenth grade, where we start out the school year with fresh notebooks and daydreams of straight As, only to end up at mid-terms satisfied with a C?
I wonder. If the flock hardly ever made it back to Canada, would they still continue to operate in a P2P mode?
I’m not sure — but Ken became convinced that there really are lessons to be learned from this distributed leadership model in Mother Nature’s repertoire — and I think he’s on to something. To learn more about bioteams and self-directed virtual teams, I recommend you visit Ken’s blog, The Bumble Bee.
Yesterday I wrote to him to ask if it was fair to say that Wikipedia was created by a bioteam. Here’s his answer:
“I would say a definite YES given the definition of a bioteam as:
1. The group is not co-located and may only occasionally meet physically – in fact sometimes all the members of such a group never meet physically.
2. No single channel (e.g. email or web) suits the communications of the entire group – this may be a by-product of the first point but can also be a function of personal group member preference.
3. The group has fluid and/or complex structures such as groups within groups, groups within communities, overlapping group memberships or different types/levels of group membership.
4. There is no obvious single point of command – there is no single leader with the authority to command the entire team and leadership must be implemented collectively. If somebody says “working with these guys is like herding cats,” it’s often a clue.
5. The group has to be formed via an incubation process over an extended period. Its growth looks very similar to that of an ant colony or beehive which are both exceptionally vulnerable until a critical mass is reached but almost indestructible after this point. This is in total contrast to the traditional (command and control) team which usually starts at its strongest but weakens quickly over time.”
I raise guide dogs, so when I’m challenged by how to work more effectively with my distributed, growing and ever-changing team here at TechTarget, I naturally think about pack management and the role of the alpha dog — but I have to admit, Ken has extrapolated some useful management guidelines for self-directed teams, even if he does compare working with a bioteam to “herding cats.” :-)
|802.11n was developed as a range and speed booster, employing multiple antennas and two or more radios to work over greater distances (sending a stronger signal, having better receiver sensitivity) and at greater speeds (improved encoding, multiple spatial paths, double-wide channels). That’s fine for laptops, desktops, and routers, but it’s hard to cram that much radio technology into a battery-powered mobile device without making the time between charges unusably brief.
Glenn Fleishman, Does the iPhone Need 802.11n?
That’s where single-stream 802.11n comes in. With single-stream 802.11n, only a single radio and single antenna are used…
…802.11n’s single stream encoding is 65 Mbps, where 30 to 50 Mbps of throughput is possible. So you lose wide channels, antenna diversity, and multiple streams, but could gain 50 percent or more in net throughput.
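A quick back-of-the-envelope check of that “gain 50 percent or more” claim. The 30 to 50 Mbps single-stream figures come from the quote above; the roughly 20 Mbps of real-world 802.11g throughput is my assumption, not Fleishman’s:

```python
# Rough throughput comparison. The ~20 Mbps net 802.11g figure is an
# assumption; the single-stream 802.11n range is from the article quoted above.
g_throughput = 20.0                 # Mbps, typical net 802.11g (assumed)
n_low, n_high = 30.0, 50.0          # Mbps, net single-stream 802.11n per the article

low_gain = n_low / g_throughput - 1
high_gain = n_high / g_throughput - 1
print(f"Gain over 802.11g: {low_gain:.0%} to {high_gain:.0%}")   # 50% to 150%
```

Which is how a single radio and a single antenna can still come out ahead: better encoding alone buys the throughput, even after giving up wide channels and multiple spatial streams.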
|Streaming is the better solution when your clips are more than a few minutes long, when you want to enable interactive applications like video search or linking deep into a file, or you want to collect statistics on what’s actually being watched.
Larry Bouthillier, Streaming vs. Downloading Video: Understanding The Differences
Streaming is the way to go when you want to control the impact of video on your network, or when you need to support large numbers of viewers. And of course, it’s the only way to do live webcasts and multicasting.
|When you type a command in Windows PowerShell, you are invoking a specific, small-scale object that has a very specific purpose. Yes, you can invoke command line applications, too. You can also invoke GUI applications. But your old DOS batch files won’t run anymore. Why? Windows PowerShell is not about text processing, it’s about object handling.
Payton Byrd, What is Windows PowerShell?
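That text-versus-objects distinction can be sketched in Python (standing in for PowerShell here, since the post quotes no code): a batch-file-style pipeline forces each stage to re-parse a string, while an object pipeline hands each stage structured records with named fields. The process names and memory figures below are invented:

```python
# Illustrating text processing vs. object handling — the distinction the
# PowerShell quote draws — sketched in Python. Sample data is made up.

# Text pipeline: each stage must split and re-parse a flat string.
text_output = "chrome 1024\nnotepad 8\nexplorer 256"
big_text = [line.split()[0] for line in text_output.splitlines()
            if int(line.split()[1]) > 100]

# Object pipeline: each stage receives structured records with named fields,
# so there is nothing to re-parse and no column-position guesswork.
processes = [{"name": "chrome", "mem_mb": 1024},
             {"name": "notepad", "mem_mb": 8},
             {"name": "explorer", "mem_mb": 256}]
big_objects = [p["name"] for p in processes if p["mem_mb"] > 100]

print(big_text)     # ['chrome', 'explorer']
print(big_objects)  # ['chrome', 'explorer']
```

Both pipelines reach the same answer, but the object version survives a format change in the upstream tool — which is the point Byrd is making about cmdlets passing objects instead of text.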
|The Internet engineering community says its biggest mistake in developing IPv6 – a long-anticipated upgrade to the Internet’s main communications protocol – is that it lacks backwards compatibility with the existing Internet Protocol, known as IPv4.
Carolyn Duffy Marsan, Biggest mistake for IPv6: It’s not backwards compatible, developers admit
I just finished reading Carolyn Marsan’s piece, Google: IPv6 is easy, not expensive. Google, you see, has moved to IPv6. IPv6 operates in much the same way as IPv4, but with one very important distinction — IPv6 assigns IP addresses of 128 bits instead of IPv4’s 32 bits, which vastly increases the total number of possible Internet addresses.
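The difference in scale is easy to verify:

```python
# Address-space arithmetic: 32-bit IPv4 vs. 128-bit IPv6.
ipv4_total = 2 ** 32      # about 4.3 billion addresses
ipv6_total = 2 ** 128     # about 3.4 x 10^38 addresses

print(f"IPv4: {ipv4_total:,} addresses")
print(f"IPv6: {ipv6_total:.3e} addresses")
print(f"IPv6 has 2^96 ({2 ** 96:.1e}) times as many addresses")
```

Four-and-a-bit billion IPv4 addresses sounded limitless in the early 1980s; 2^128 is large enough that exhaustion is no longer the design constraint.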
Carol reports that Google engineers worked on the IPv6 effort as a 20% project – meaning it was in addition to their regular work. So far, the U.S. has been lagging behind other countries with IPv6 — mostly because we can afford to, having received a disproportionately large share of the IPv4 address space — but now that Google has come on board, that may hurry things along.
What really caught my interest was a sidebar that linked to an article where Carol did a bang-up job explaining the REAL issue that is holding up IPv6. Simply put, the developers misjudged how adoption would really occur. They may have shot themselves in the foot by not making IPv6 backwards-compatible with IPv4, but it’s not entirely their fault: backwards compatibility would have been cost-prohibitive, complicated and illogical in some ways.
(Explaining it to myself) It would be like your television station broadcasting in both analog and digital for a while and then gradually fading out analog broadcasts as people replaced their old analog sets with new digital ones. It seemed like a logical, practical plan.
The IETF developers designed IPv6 to run in a dual stack. That means that IPv4 and IPv6 would run side by side for a while and then IPv4 would gradually be faded out.
They didn’t foresee a scenario where IPv4 devices would stick around for years, some vendors wouldn’t bother upgrading their products to be IPv6-compliant and some administrators would just shut off the IPv6 part of the dual stack in an effort to keep things simple.
Since IPv4 isn’t fading away as the engineers had expected, they are going back to the drawing board to build mechanisms that let IPv6 devices communicate with IPv4 devices.
Carol says the transition mechanisms include:
* Dual-Stack Lite, a technique developed by Comcast that allows for incremental deployment of IPv6. With Dual-Stack Lite, a carrier would give new customers special home gateways that take IPv4 packets from their legacy PCs and printers and ship them over an IPv6 tunnel to a carrier-grade network address translator (NAT).
* NAT64, a mechanism for translating IPv6 packets into IPv4 packets and vice versa. A related tool, dubbed DNS64, allows an IPv6-only device to call up an IPv4-only name server. These two tools would allow an IPv6 device to communicate with IPv4-only devices and content.
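The address-mapping half of NAT64 was later standardized in RFC 6052, which embeds an IPv4 address in the low 32 bits of the well-known prefix 64:ff9b::/96. A minimal sketch using Python’s ipaddress module (the helper names are mine, for illustration):

```python
import ipaddress

# NAT64 well-known prefix (RFC 6052): 64:ff9b::/96.
NAT64_PREFIX = ipaddress.IPv6Address("64:ff9b::")

def to_nat64(v4: str) -> ipaddress.IPv6Address:
    """Embed an IPv4 address in the low 32 bits of the NAT64 prefix."""
    return ipaddress.IPv6Address(int(NAT64_PREFIX) | int(ipaddress.IPv4Address(v4)))

def from_nat64(v6: ipaddress.IPv6Address) -> ipaddress.IPv4Address:
    """Recover the embedded IPv4 address from a NAT64-mapped IPv6 address."""
    return ipaddress.IPv4Address(int(v6) & 0xFFFFFFFF)

addr = to_nat64("192.0.2.33")
print(addr)               # 64:ff9b::c000:221
print(from_nat64(addr))   # 192.0.2.33
```

The translator sits in the middle doing this mapping (plus the header rewriting) in both directions, while DNS64 hands IPv6-only clients a synthesized AAAA record pointing into that prefix.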
Which is correct?
_________ installing the patches?
a. Did you finish
b. Have you finished
|The marketplace has not been especially kind to Xen for two reasons: it was not first to market, which is an important factor for any industry, and Xen resellers do not have the power of the VMware PR machine.
Schley Andrew Kutz, Xen: An endangered species in the virtualization ecosystem?