Dedicated Z-wave sites?

That's gotta be on par with a bassism. Kinda hard to get any experience without the license, though, isn't it? Seeing as it's illegal to do that kind of work without one. Plus the guy with the license who is in that business will warranty his work if he wants to continue to get any in the areas he works in. So who do you hire: illegal aliens, alcoholics, crack heads, anybody off the street, and hope they know what they are doing? Anybody but a licensed electrician? People who claim "I am not an electrician but I play one on usenet, and I can tell you how to do it in a phone conversation in 20 minutes"? Good luck with that. YMMV

Reply to
Frog

I absolutely will take the advice of someone over the phone, evaluate it, and determine if it's a job I can do myself. As the homeowner, that's my right. No way I'm going to pay someone to do something I can do myself or can do with the help of an experienced friend.

Reply to
E. Lee Dickinson

I dunno. I'm watching a vidcast of a briefing called "Improving the Reliability of Commodity Operating Systems" and it talks about how much of Windows resides outside of MS's control in device drivers. If the lecturer is to be believed, 85% of Windows crashes are device driver related. He's discussing an OS helper called NOOKS

formatting link
that seeks to protect the kernel of Linux from errant drivers using a variety of techniques like isolation, object monitoring and shadow drivers (which try to restart failed drivers and replay inputs to recover what was lost by the driver crash). He notes that the problem drivers that escape into the real world often suffer from transient faults that are difficult to find. If merely restarting the driver overcomes the problem, that often means the driver crashes are occurring when interrupts come at precisely the wrong time.
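The shadow-driver recovery idea is easy to illustrate outside the kernel. Below is a toy sketch in Python; the class names and the failure model are invented for illustration (the real NOOKS work is kernel-level C), but it shows the restart-and-replay trick the lecturer describes:

```python
# Toy sketch of the NOOKS "shadow driver" idea: log every request
# sent to a flaky driver; if the driver crashes, restart it and
# replay the log so callers never see the failure.

class FlakyDriver:
    """Stand-in for a device driver with a transient fault."""
    def __init__(self):
        self.state = []

    def handle(self, request):
        if request == "bad-interrupt":          # transient fault
            raise RuntimeError("driver crashed")
        self.state.append(request)
        return f"ok:{request}"

class ShadowDriver:
    """Wraps the real driver, isolating callers from its crashes."""
    def __init__(self, driver_factory):
        self.factory = driver_factory
        self.driver = driver_factory()
        self.log = []                            # replay spool

    def handle(self, request):
        try:
            result = self.driver.handle(request)
            self.log.append(request)             # record good inputs
            return result
        except RuntimeError:
            # Restart the driver and replay the spool to rebuild
            # the state lost in the crash.
            self.driver = self.factory()
            for old in self.log:
                self.driver.handle(old)
            return "recovered"

shadow = ShadowDriver(FlakyDriver)
shadow.handle("read-block-1")
shadow.handle("read-block-2")
print(shadow.handle("bad-interrupt"))   # caller sees "recovered"
print(len(shadow.driver.state))         # replayed state: 2
```

The key point the replay spool illustrates: if the fault really is transient (bad interrupt timing), the replayed inputs succeed the second time and the caller never notices.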

But you still have to deal with device drivers, don't you? That's where the failures are located. IIRC, in the Linux study, the most failures were caused by sound card drivers at 40+, then network drivers, then IDE drivers, with only eight reported. That's what I would expect. Your driver wouldn't last long if it corrupted data, but if a certain slider on a volume control was twitchy, a driver might still survive without being fixed.

Maybe you're doing a lot of what NOOKS does already in Charmed Quark. It's a lot of coding work to provide the sort of encapsulation and error recovery that NOOKS does. It also costs CPU cycles - sometimes twice the normal load.

-- Bobby G.

Reply to
Robert Green

because of your unfortunate criminal record. Gee

You need to understand that "Frog" is just another alias of one of the ASA idiots. He's lying (of course). Nothing in Florida statutes prevents me from getting an electrical license. However, I wouldn't take the pay cut. I installed security and HA systems for 24 years and I've been in the trade just shy of 30 years. The ASA lunatics hate me for talking about the ills of the industry and for offering DIY end users ways to circumvent alarm companies. For some reason that upsets them. :^)

I held an electrical contractor's license for many years without a single complaint until the guy who calls himself "Group Moderator" (a competitor named Michael Sabodish who hangs out on ASA mostly) filed a false complaint. It was dismissed as baseless. Now he spends his time filing complaints with the BBB and reporting the "count".

Reply to
Robert L Bass

I dunno. I think that you are maybe stressing out over something that's not nearly as big an issue as you are thinking. Yes, there are crappy device drivers out there. And if you configure a machine with a fairly random set of hardware, you can have problems. OTOH, there are quality products out there that work, and they become known well enough. A machine that is set up with good quality hardware and drivers, and which isn't used as a daily-use machine (i.e. its configuration is not changed, web surfing isn't done on it, and things that aren't needed are turned off in the OS; think a standard kiosk-style touch screen client or a server in the closet), can be stable for years without problems.

-------------------- Dean Roddey Chairman/CTO, Charmed Quark Systems, Ltd

formatting link

Reply to
Dean Roddey

That's a lotta futzing and "kid gloving." It likely means no one will be plugging IP cams and ethernet switches into that same network which could limit usefulness in a big way, at least in terms of HA.

-- Bobby G.

Reply to
Robert Green

I'm not sure what you mean there. A computer isn't going to self-assemble. Either you are going to buy one, where someone has already done this work to find the good components and make sure that they are high quality, or you are going to do it yourself. Most people will probably just buy, so most of the 'futzing' is typing in the credit card number.

And it doesn't mean that cameras cannot be plugged into the same network. I'm not sure where you drew that conclusion from. Now in a high end system, they may well choose to have a separate general use network and a dedicated A/V and automation network. It's not an unreasonable thing to do if you want the ultimate in stability and the cost isn't an issue. But for most folks it's not going to be necessary.

And the same issues would apply to any networked automation system that doesn't use a proprietary network, PC-based or not, so I'm not sure I see how it would be a ding against PC-based systems specifically. I don't think many systems will be using proprietary networks moving forward.

--------------------- Dean Roddey Chairman/CTO, Charmed Quark Systems, Ltd

formatting link

Reply to
Dean Roddey

Bobby,

I don't understand why you would say this. One could kill a network in a variety of less esoteric ways, for instance by shorting two wires in the CAT5. What does this have to do with the stability of an HA server? If (e.g.) an IP camera breaks a network, the problem is with the camera.

It is conventional IP Best Practice to allocate servers on an at-least-one-per-function basis. Why should HA servers deviate from established best practices? An HA server is a discrete function. It is sound practice to allocate at least one CPU to it, not "kid gloving".

One can over-tax any machine ever built, whether by throughput, CPU cycles or complexity of simultaneously running software. I can say from practical experience with several different mini-ITX machines (which you have expressed an affinity for) that they can run out of CPU cycles with only a few HA tasks -- trying to also simultaneously run general purpose computing or networking chores can bring them to their knees long before software interactions become a major problem.

... Marc Marc_F_Hult

formatting link

Reply to
Marc_F_Hult

^^ IT

Shoulda been IT, not IP

... Marc Marc_F_Hult

formatting link

Reply to
Marc_F_Hult

How can an end-user evaluate the "goodness" of a component without extensive experience? I've had tremendous driver problems with gear from big name companies. When hardware breaks new ground, as is often the case with high end video cards, it sometimes takes quite a while for even the industry giants to ship stable drivers. I've seen badly set-up machines from Dell, HP, Gateway, Compaq and others.

What do CQ users do when one of the updates or bug fixes that MS is pushing out all the time breaks their setup? When that happens, don't your clients call you? What I am trying to say is that if you build/set up a system so that there's no web surfing, no frequent SW changes, etc., then you're really limiting the expansion capabilities of the setup. In other words, what you need to do to make Windows run reliably is to keep it in an isolation ward after birth and only change things when there's no other choice.

What would worry me most about your product, if I were in your shoes, is that if MS decides it's the wave of the future, they'll just take what they like and put it in the next edition of Media Center. When they decided to put networking in Windows, it just about killed Novell dead. When they decide to take on HA/HT integration, the same sort of scenario will play out.

-- Bobby G.

Reply to
Robert Green

I'm failing to understand what you are driving at. The fact that it's possible for problems to occur means that no one should use computers? If you are an end user and you don't feel competent to evaluate the components, then you buy. If you happen to be one of those rare cases where Dell or HP ships a screwed up configuration, then well that happens. Every company occasionally blows it. What does that have to do really with automation?

I don't know of any examples of this so far, other than their occasionally cranking up security on some things in order to close security holes. This has, once or twice, caused a particular thing to stop working (access to a media repository directory). But it's been very rare that any problems have occurred, partly because of the points I previously made about our rolling our own where possible.

This is an automation system, not a daily use web browser. People buy our product to create automation systems. Most of the 'computers' are, to the end user, nothing but a touch screen on a wall. They are not for daily use and are not intended to be. They have no keyboard because they are kiosk style touch screens. So this is really a non-issue.

You can run our client tools on a regular computer of course, and people do that on their home office machines, for instance, to have access to the automation system from their office. But, it's just client tools. If anything goes wrong with that machine, it just affects that machine. The guts of the automation system are in a locked down machine in the closet.

You have to understand the automation market. It is 'complexity limited'. MS could add automation features to MCE if they want. But that's not going to suddenly make customers able to install all the specialized hardware to create an automation system. They may well come out with some very simplified stuff, but that's not at all the market that we are in. There's just no way that MS is likely to get into the true custom installed automation market. It's so far off their radar that it's not something I worry about.

Creating a real automation solution requires a level of technical expertise, completely outside of the actual device controlling it, that will for the bulk of people require a custom installer to do the setup. Those custom installers, while they are of course happy to integrate MCE into the overall system as a media client, aren't very likely (for the reasons discussed in this thread) to install a completely MCE-based automation system. Any serious automation system will have a server component to keep the core automation control mechanisms away from such daily use/media-playback oriented machines.

--------------------- Dean Roddey Chairman/CTO, Charmed Quark Systems, Ltd

formatting link

Reply to
Dean Roddey


It's got to do with why HA programs like CQ are slow to gain acceptance. I just bought a Panasonic Netcam from Smarthome. I want to plug it into my system and email pictures to my cellphone. Wouldn't the natural place to install such a beast be the HA server? If a net cam has a *really* bad driver, it's going to blow up CQ. For CQ to be attractive to me, it would have to implement some of the features that I referred to in the NOOKS citation. Many people cite the decline of Homeseer reliability as coinciding with the heavy reliance on "plug ins" - really another form of device drivers. Something I saw today said that 6/7ths of the existing Linux codebase now consists of device driver code.

If we are to follow best practices, it now sounds as if we're talking a client to go along with that server. The costs of the dedicated HW alone are now starting to exit the home market price range. But Dean says he's not targeting that market, so maybe it's no problem for anyone needing CQ's features. If X-10 has taught me anything, it's that price matters. A lot.

Business practice, sure. Home network practice, not so sure.

Server farms have a very low SAF. :-) So, IMHO, does more than one PC per room. Compromises are far more likely to occur in home installations where people aren't likely to be IT pros.

I dunno. I have plenty of machines that serve dual purposes and more. Were I to strictly devote one PC per application, I would be running a server farm. While it may certainly be best practice for a business, size and money constraints often dictate that more than one application is going to live on a server, especially on a home network. It's been my contention that while it should be possible, it's usually only achievable by a PC guru who can configure a server the way Perlman can play a violin. Good software is able to interact with other good software without bringing down the house. The more a program has to rely on isolation to function properly, the less appeal it will have to people.

"Can bring" is correct, but should such chores choke a 1GHz CPU, or is the software bloated, inefficient and buggy? Dean says, and rightly so, that you can find the perfect mix of peripherals, PC, apps and OS. The questions I have are: who's finding it, how do they recognize it unless they are PC experts, and what is the bottom line cost?

One certainly can overtax any machine ever built but the high end standard today is 2 cores and 2GB of memory at 2GHz clock speeds. That should be able to handle some pretty serious applications - concurrently. From what I've been reading, 4, 8 and 16 multicore chips are in the pipeline although the MS OS's are woefully ill-equipped to handle true parallel multiprocessing. But that's beside the point.

My concern is that an HA server that has to be sequestered to operate properly is, by the very nature of isolation, going to be that much harder to connect to plug-in Ethernet appliances, IP cameras, daily on-line weather, traffic and email data and all the other devices that are in the pipeline now. Lots and lots of "appliances" are coming with RJ-45 jacks and embedded servers to allow PC configuration. If I'm running an HA server and new HA products come on line, it seems the HA server is the place they should go. If the server is only going to stay stable through careful selection of HW and SW additions, how does the end user evaluate the quality of their drivers? What are the security implications? This all rapidly gets so esoteric that all but the faithful roll their eyes, sit back and wait for something simpler and cheaper from Microsoft.

Let me ask you this. Best IT practices dictate grandfather, father, son rotating backups, some of which should be stored off-site, all of which should be tested to ensure accurate data restoration. How many businesses do you think do it? How many home network administrators? There's the ideal and theoretical world and then there's the jungle of the real world.
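The rotation scheme Bobby names can be written down in a few lines. Here is a minimal sketch in Python; the tier-selection rules (daily "son", weekly "father" on Sundays, monthly "grandfather" on the first) follow the common convention, since actual GFS retention windows vary by shop:

```python
# Minimal grandfather-father-son (GFS) rotation: classify the
# backup taken on a given day into a retention tier. Daily "sons"
# are kept about a week, weekly "fathers" about a month, and
# monthly "grandfathers" long-term.

import datetime

def gfs_tier(day: datetime.date) -> str:
    """Classify the backup taken on `day` into a GFS tier."""
    if day.day == 1:
        return "grandfather"   # first of the month: keep long-term
    if day.weekday() == 6:     # Sunday
        return "father"        # weekly: keep roughly a month
    return "son"               # daily: keep roughly a week

# Example week in January 2007:
print(gfs_tier(datetime.date(2007, 1, 1)))   # grandfather
print(gfs_tier(datetime.date(2007, 1, 7)))   # father (a Sunday)
print(gfs_tier(datetime.date(2007, 1, 3)))   # son
```

The scheme's appeal is exactly Bobby's point about the real world: it bounds the number of media you keep while still leaving restore points at daily, weekly and monthly granularity, so it survives even lazy administration.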

-- Bobby G.

Reply to
Robert Green

FYI, one of the advances touted for Windows Vista is a new driver model which pushes most of the driver code outside kernel space, making it more difficult for a badly-written driver to crash the OS. The upside, of course, is that it should make crashes less likely. The downside is that many drivers need to be rewritten, and some never will be - there's little incentive for manufacturers to support stuff that no longer contributes to their cashflow (the two-year-old Toshiba tablet PC I'm typing this on is a particularly depressing example :)

- Dennis Brothers

Reply to
Dennis Brothers

According to the NOOKS guy, that's part of the solution: the part he labels isolation. I was surprised to learn how much kernel access drivers have in both Windows and Linux. He talked about drivers currently being able to corrupt data structures in the kernel so severely that there could be no recovery. Even with the monitoring program, certain fatal errors were unavoidable and usually indicated a very severe coding problem.

There were also some other pretty interesting modules, like recording input in a spool file so that it can be "played back" if a driver has to be restarted. A frank review of the typical driver creation process confirmed what you've said as well. There's little incentive to write these things in the first place and there's almost none to support devices that are not being sold anymore. Ironically, this often leads to a lot of drivers being written by earnest but inexperienced coders just to bring a device into operation on a later OS than its "birth" OS.

-- Bobby G.

Reply to
Robert Green

Anyone that knows how the BBB operates would know that what you've just said is a blatant lie. You can now add "moron" to the list.

Reply to
Frank Olson

to deliberately misconstrue, misrepresent and (not too

OK... I've read far enough... How is what you've said about Dave any different than what you continue to do... I'd really like to know.

at a time when

You see... That's where the "he said", "I said" thing comes into play. Dave seems to think that there was NO appeal for help. He's actually quite adamant about that point. Now, I know you don't "read well". Could it be that he's right and you're wrong? Think about this a minute.

Now there you're definitely wrong. If you had no "room for hatred" why continue to fan the flames in ASA with your little "catch phrases"?

Now that's a blatant lie. Where has Dave been dishonest? Prove it Bass. More "baseless accusations" won't help here. You are NOT in ASA.

Reply to
Frank Olson

The 'micro-kernel' concept has been around for a long time. The problem is that having all the drivers (particularly video drivers and media drivers, which have to move a lot of data) outside of the kernel requires a level of performance that wasn't previously there in PCs. I believe that the original NT had a fairly strict micro-kernel design, right? That wasn't an issue for video when it was a server operating system running a text terminal. But when it's a media machine, as XP is commonly used for, it's an issue, and I believe that they moved away from that architecture for completely understandable and practical reasons.

Given sufficient hardware oomph, and probably some amount of support in the CPU/chipset, it's probably a lot more reasonable to do it, and it makes sense to do so as soon as it's possible. And it sounds like Vista is where MS decided they could afford to do so.

But, this is not required to make a very reliable automation server. That's doable right now, as proven by the many users who have done so.

Reply to
Dean Roddey

I'm still confused. You are kind of saying, if I stick a crappy piece of hardware in my car engine, it could fail. Well, yeah, it would. So don't stick a crappy piece of hardware in your car engine. Problem solved. If you insist on buying junk, I don't think you should be surprised at what happens. If you are the first person on the planet to buy such a camera, then I guess you will have to try it yourself and if it turns out to be junk, send it back. Else, do a little research and find out what works, since you will seldom be the first person to do anything.

As to CQC being slow to gain acceptance, this is just business reality. Most new products are slow to gain acceptance when they are entering an existing market that has settled into a pretty solid pattern, and they are trying to create a new pattern. The market, for us, is the professional market, and the gatekeepers of that market are custom installers. We have to convince them that both the product and the company are solid. That is now starting to happen, given that we've been improving the product rapidly for years now, and have put out a very solid 2.0 product. But of course they are (understandably) conservative and tend to stick to what they know, so the process is always slower than we'd hope. But it's happening. We've had over a 5x increase in systems shipped this year, and I think it'll be more than that next year.

Homeseer allowed people to implement plugins using third party languages and to do whatever they wanted. This is not the case with CQC. Drivers are written in CML or PDL, which we control and which are highly strict languages that very much limit the ways in which a driver can destabilize a system. There are really only two ways: they can go into an infinite loop, or they can eat up memory without stopping. Both of these can be found with a fairly trivial amount of testing and user beta testing.
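The two failure modes Dean names (an infinite loop, runaway memory) are exactly the kind a simple watchdog can catch. Here is a rough sketch of the general containment technique in Python; this is an illustration only, not CQC's actual CML/PDL runtime, and the driver names are made up:

```python
# Sketch of containing a misbehaving driver: bound each poll cycle
# with a hard time budget so an infinite loop is detected, and
# treat memory exhaustion as a reportable failure. Uses POSIX
# SIGALRM; a real runtime would also cap memory up front (e.g.
# resource.setrlimit) and run drivers out-of-process.

import signal

def run_driver_cycle(poll_fn, timeout_s=1):
    """Run one driver poll with a hard time budget (POSIX only)."""
    def on_timeout(signum, frame):
        raise TimeoutError("driver exceeded its time budget")

    old_handler = signal.signal(signal.SIGALRM, on_timeout)
    signal.alarm(timeout_s)                  # arm the watchdog
    try:
        return ("ok", poll_fn())
    except TimeoutError:
        return ("failed", "infinite loop")   # watchdog fired
    except MemoryError:
        return ("failed", "out of memory")
    finally:
        signal.alarm(0)                      # disarm
        signal.signal(signal.SIGALRM, old_handler)

def good_driver():
    return {"temperature": 72}

def looping_driver():
    while True:      # simulated runaway driver
        pass

print(run_driver_cycle(good_driver))      # ('ok', {'temperature': 72})
print(run_driver_cycle(looping_driver))   # ('failed', 'infinite loop')
```

Because both failure modes are externally observable (wall clock, memory footprint), a host can detect them without understanding the driver at all, which is why a restricted driver language plus this kind of supervision is enough to keep one bad driver from taking the system down.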

Now that allowed users to extend HS in a lot of ways, but the price is loss of control. We've often had to do a lot of arguing that forcing the use of CML/PDL was important for just this reason, even if it meant that many new features must be provided by us, not by third parties. I think that this conservative position has proven out. We just did our 2.0 release and it went very smoothly for a major version upgrade with huge product improvements/changes. I've always taken the conservative approach that, at least in our target market, no amount of functionality is going to sell if it's not solid.

Plenty of our users use their servers for more than one thing without any problems. They just don't throw junk onto their systems. Plenty of them are using the same machine as an automation server and media server, for instance. It just requires that you do some due diligence and have a burn in period for any bits that you are going to use.

And the same would apply even if you are using a completely solid state controller because the controller is in turn controlling devices, which could possibly have errors in their protocols (and they sometimes do.) So of course you must test out your configuration and give it a real world break in period and deal with any issues.

This is not something that most users will do themselves, as I said. So most folks will hire someone to do it for them. That person, one would hope, is not lying about his/her experience and will be qualified to ensure that it all works together correctly. In most cases, where possible, the installer will use components that experience has proven are solid. If the user insists otherwise, the installer will probably charge them more to cover the extra work required to validate those unknowns.

--------------------- Dean Roddey Chairman/CTO, Charmed Quark Systems, Ltd

formatting link

Reply to
Dean Roddey

No, but you're certainly cyberstalking again.

Reply to
Bill Kearney

No, what I am saying (and the research that I cited appears to agree) is that crappy drivers come from "good" companies *and* "bad" ones, and it sounds as if one has to have a considerable amount of driver evaluation expertise to be able to tell what one can load on a CQ system. I'm trying to determine how you implement what you claim to be immunity to the types of driver issues that plague other software applications. I'm wondering how that's done in an environment where a driver can disrupt a data structure in the kernel that s**ts (too good a Freudian slip to elide!) er . . . *shuts* down the system.

What I've said before I'll say again. Bad drivers can come from good companies. It depends on deadlines, access to needed interoperation HW and SW, competence of the programmers, complexity of the driver, etc. However, it's clear from your previous responses that you intend CQ to be installed, configured and maintained by professionals. That means *they* get to worry about a client wanting to plug a new Panasonic Netcam into their Ethernet.

I've got to find a better way to express what I am trying to say. How does an end user of typical smarts know whether a new, inexpensive model of a Panasonic Netcam contains a good driver or a crappy one? I know the typical way: he installs it and it either crashes or it doesn't. If it didn't crash, that doesn't really mean it's a good driver, BTW; it just means it didn't encounter any exceptions. This time.

Are you calling Panasonic a "junk" company or are you just impugning my buying habits in general? (-: I happen to own a lot of very high end equipment. It's usually when anything inferior just won't do. Browning is a favorite choice of mine because it's so reliable. So is Nikon because a picture is only as good as its lens. Sony is, too, because very little was able to match the quality and performance of something like the D8 DAT recorder.

You've been talking about how Windows can be made reliable when sequestered deeply enough. I've been implying that it's getting harder and harder to do that. I find it more and more necessary to connect to the web for upgrades to SW and FW, and I have to deal with Windows installations breaking ANY time I upgrade any hardware. In other words, I find your recommendations of isolation to achieve reliability to be increasingly less practical in the real world.

What happens when I plug it into my CQ server? Will it break the whole thing down? Will I be able to easily add a mass-marketed device like a netcam from a major market player to my CQ HA server or do I need to wait until you get around to coding for it? Please explain! I'll send you mine to evaluate if you're not sure how to answer the question.

That's not true. I was one of the first people here to work with the XTB. Unfortunately, I am an early adopter of some types of technology. I'm going to buy a Rozetta, just as I was one of the first people here to use Control Linc Maxis from Smarthome to control all my house and unit codes from a single device. I just bought three brand new items: a multiple channel timer with voice labeling, a timer that's got a remote pager, and a new X-10 Pro LCD-based mini-timer for X-10 that runs for a long time on two AA's as backup.

The first two are brand new on the market and I got them for my wife mostly, but for me too as a way to monitor the end of Ebay auctions. Although they are stand alone, I'm going to integrate them into my HA "messaging" speakers that were inspired by a design I borrowed from John Warner, IIRC. So, anyway, your "seldom be the first" may apply to a lot of people, but not to me. You might be confused simply because I refuse to abandon the known demons of X-10 for the unknowns of some of the more recent protocols. That's me Scot's blood, laddy!

I think that once Vista hits, with its very different way of doing things, your already incredible workload will double and you won't be able to keep up. It's a pattern so common it's sad, really. I've seen it, closeup, at least ten times, maybe more. Those are the spectacular ones I can recall, where someone mortgaged his home to propel the software business. This goes back to the days of 386MAX, built and marketed just blocks from where I worked 20 years ago. When MS put EMM's inside the OS, the bottom fell out for third party memory managers. Can you really afford to split your current level of resourcing when Vista arrives?

Now *that's* a bulletproofing technique that I can understand but as you point out, there are both benefits and costs to your way of doing things. The most important is that it puts a lot more work on your shoulders and increases the risk of "keymanning" yourself - being the only one who really understands how everything works and the underlying design philosophy. What happens to CQ if, God forbid, you became incapacitated for six months?

As long as you have installers and professional configuration people to perform those tests and to ghost the machine before any new HW or SW is installed, that's fine. But as I said to Marc, there are ideal business practices and there's what people do in the real world. In the real world, they unwrap the new toy, plug the disk in and follow the instructions. Of the many people whom I've taught some PC skills to, only one religiously backs up her machine before any changes are made. And that's only because she's encountered the "killer application" several times before: that's the new piece of HW or SW that causes the dreaded BSOD.

That's why I took issue here. While what you describe are excellent business practices, it's just not how most *lay* people work. Despite decades of trying to get end users to pay attention, I'm lucky if someone scribbles the actual words of an error message to me in an email when they have a problem. You've made it clear, though, that CQ is really aimed at an HA professional with a lot of data and networking skills. They'll probably back up a server before modifying it because their time is money! And they'll know what devices should and shouldn't be connected.

-- Bobby G.

Reply to
Robert Green

Cabling-Design.com Forums website is not affiliated with any of the manufacturers or service providers discussed here. All logos and trade names are the property of their respective owners.