California Monkey Wrench
-
Dale
- Publisher / Author
- Posts: 259
- Joined: Wed Aug 25, 2004 4:59 pm
California Monkey Wrench
<em>I had planned on bringing you the details of the new Mitsubishi line tonight, but this story (see below) broke today and is important enough to preempt the product discussions. I am working on an expanded version of this press release, which will be posted on our site tomorrow. The reason this story is important is not that the consumer electronics industry cannot in the end meet the demands, but that the trade-offs for doing so can backfire in several ways. First, the people needing these final transition boxes may be unhappy with the performance of lower-power schemes, and secondly, the clock is ticking. It is hard to redesign and bring a product to market in time for the February 17, 2009 deadline that ends analog broadcasting. So, tomorrow, all things willing, I will bring you both my comments and research on this story, and then a discussion of the products from Mitsubishi.</em> _Dale Cripps
[url=http://www.hdtvmagazine.com/articles/2006/04/c_alifornia_mon.php]Read the Full Article[/url]
Last edited by Dale on Wed May 17, 2006 2:00 pm, edited 1 time in total.
-
hislonv
- New Member
- Posts: 4
- Joined: Tue Oct 19, 2004 3:23 pm
California
I guess they are bored with all of the rain we are getting here. All this will do is open the door to eBay and mail-order channels for getting these devices. Consumers won't care whether they meet energy requirements or not, and the government won't be able to stop it.
The amount of energy that they would save in a year is insignificant compared to the amount that is wasted by them concocting such laws or the analog stations all broadcasting their redundant signals.
I haven't seen the whole article, but how much energy can these things use? I know they don't exist yet, so it isn't easy to answer, but someone must have an idea how much added energy use they will contribute on top of the TV set.
Energy Star. Aren't they the ones that made it possible for my computer to remain partly on all the time, burning up energy when I think it is off?
Next...
-
Bob Mankin
- Member
- Posts: 24
- Joined: Fri Mar 24, 2006 11:21 am
Bold headline. Except these regulations have been on the books since 2004 and the latest actions on the part of the California Energy Commission that I see were to delay the effective dates by 6 months. Did the CEA just wake up from a long nap?
The article clearly wants to instill fear while being short on facts. What specific requirement is the CEA having issue with? 3 years is a looooooong time to respond to a power supply redesign issue, IME and IMO.
BTW, it's projected in a Federal study that digital STBs will account for 4% of US power consumption by 2010. That's not an insignificant number, hence the reason for power consumption standards on both the Federal and State levels.
From what I know about the issue, it has nothing to do with the tuner performance of the boxes, but rather the power used during standby mode which makes up 75% of the time the DTV adapter is plugged into the wall. Claiming it will somehow result in poorer performance is not accurate.
-
Roger Halstead
- Major Contributor

- Posts: 210
- Joined: Sun Feb 26, 2006 4:13 pm
First, it's not unusual for some aspects of a regulation to go unnoticed for many months.
Next: 4% for all digital set-top boxes (DSTBs), so I assume from the way it's worded that means *all* of them, including my satellite TV receiver, cable decoder, and anything associated with the addition of HDTV. Knowing stats, they are probably counting the DVR, the DVD recorder/player, and VHS recorders as STBs as well. To me, we need to separate out the digital-to-analog (D/A) converters, which are what we are talking about here, and I think those are going to be a small percentage of the STBs.
But let's look at power. It should be safe to assume that a simple D/A converter is going to take considerably less power than, say, my satellite receiver, which draws a maximum of 35 watts and typically runs at half that. If the D/A converter is full-featured (IOW, it downloads menus and other information as my satellite receiver does), then it needs to remain on most of the time. If I leave the satellite receiver unplugged for more than a few minutes, it has to reboot when it's plugged back in, and that can take 3 to 5 minutes.
As to that 4% power figure, we need to stop and think a bit. The typical new TV draws on the order of 150 to 200 watts, with the old, large CRTs drawing much more. I have some large CRT computer monitors that draw that much. So if we figure a continuous STB draw of say 20 watts when the peak is 35 watts, and the typical TV draws 10 times that much, then the TVs would be drawing 40% of the power off the grid on top of the 4% for the STBs? That seems a bit unrealistic to me.
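Roger's ratio argument is easy to check with a few lines of arithmetic. A sketch, using only the wattage figures quoted in the post above (assumed round numbers, not measurements):

```python
def implied_tv_share(stb_watts, tv_watts, stb_share):
    """Scale the STBs' claimed share of grid power by the TV/STB wattage ratio."""
    return stb_share * (tv_watts / stb_watts)

stb_avg_watts = 20   # assumed continuous STB draw (peak quoted as 35 W)
tv_watts = 200       # "typical new TV" figure from the post
share = implied_tv_share(stb_avg_watts, tv_watts, 0.04)
print(f"Implied TV share of grid power: {share:.0%}")  # Implied TV share of grid power: 40%
```

The same duty-cycle caveat applies to both devices, of course, so the comparison only shows that a flat 4% for STBs sits oddly next to what TVs themselves would then have to consume.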
BTW, when my computers are off, they are off. Nothing is running, and it takes a battery to keep the BIOS settings. OTOH, they are typically running 24x7.
-
Bob Mankin
- Member
- Posts: 24
- Joined: Fri Mar 24, 2006 11:21 am
Roger, I'll let you read the piece for yourself. In the first page or two they define what "STB" was for this particular study.
http://www.iea.org/textbase/papers/2004/am_stb.pdf
Without seeing more on what specifically the CEA is complaining about, it's hard to determine if the complaint has any merit or is just a veiled attempt to allow sloppy STB designs into the market. I'm all for designing more power efficient, savvy code equipped hardware. I think there is too much "throw it over the fence" mentality when it comes to what the consumer electronics manufacturers put out for product these days.
It should also be noted that countries like the UK and Australia are addressing these very same issues today. You can find some of their legislative efforts with a Google search.
-
Bob Mankin
- Member
- Posts: 24
- Joined: Fri Mar 24, 2006 11:21 am
A sidebar to these proposed regs: Motorola made some comments in response to the Legislature's solicitation, and their only objection seemed to be with the wording that required testing at both 115V and 230V. Maybe they saw an unintentional requirement for dual power supplies? Later revisions of the proposal (Feb. '06) seemed to address their concerns.
I find that sorta interesting, because IME Motorola is not exactly known for efficient power budgeting in their products. My Comcast box (Moto 6412) is known to have marginal cooling issues and resulting performance problems. If you design these products from the get-go with better power management and budgeting, some of these problems go away. It's that penny-wise, pound-foolish thing biting them on the butt.
-
robmxb
- New Member
- Posts: 3
- Joined: Wed Apr 12, 2006 6:29 pm
Virtually all DVB-T COFDM receivers sold in the UK can meet the California energy usage requirements today. The CEA tried to sabotage the California energy requirements in Congress in the recent hearings on the DTV transition and the need for a converter box.
Bob Mankin wrote:Roger, I'll let you read the piece for yourself. In the first page or two they define what "STB" was for this particular study.
http://www.iea.org/textbase/papers/2004/am_stb.pdf
Without seeing more on what specifically the CEA is complaining about, it's hard to determine if the complaint has any merit or is just a veiled attempt to allow sloppy STB designs into the market. I'm all for designing more power efficient, savvy code equipped hardware. I think there is too much "throw it over the fence" mentality when it comes to what the consumer electronics manufacturers put out for product these days.
It should also be noted that countries like the UK and Australia are addressing these very same issues today. You can find some of their legislative efforts with a Google search.
The CEA attempt is not a veiled attempt; it is more like a blatant attack on efficiency at the behest of some of their members, who would rather sell us junk receivers for the US's junk modulation, 8-VSB.
The main cause of delay in the US digital transition is our lousy 8-VSB modulation. Where was the CEA when 8-VSB was being chosen, and later when it was challenged as inadequate?
The UK has now sold over 11 million energy-efficient COFDM STBs in the last three years, and sales are accelerating. In January, the slowest sales month of the year for STBs, they were selling at a 70,000-a-week clip. In the US that would translate to 420,000 per week, or almost 22 million a year. And the UK has no mandate. UK citizens freely and enthusiastically buy COFDM receivers because they work plug-and-play. They are not foisted on them by a mandate that saddles 75% of the public with 8-VSB receivers they did not want and do not need.
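The UK-to-US extrapolation in the post is straightforward to reproduce. The 6x multiplier below is the population scaling implied by the figures quoted, not an official statistic:

```python
def us_equivalent(uk_weekly_sales, population_multiplier=6, weeks=52):
    """Scale UK weekly STB sales to a US-sized market and annualize."""
    weekly = uk_weekly_sales * population_multiplier
    return weekly, weekly * weeks

weekly, yearly = us_equivalent(70_000)
print(f"{weekly:,} per week, ~{yearly:,} per year")  # 420,000 per week, ~21,840,000 per year
```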
The CEA has got it all wrong from the beginning for the US public and for their members.
-
Dale
- Publisher / Author
- Posts: 259
- Joined: Wed Aug 25, 2004 4:59 pm
I am always puzzled by those who say that COFDM is the reason other nations are doing so well in SDTV while we are doing so poorly in HDTV, when our HDTV market is the most successful in the history of consumer electronics. Sure, it got off to a bumpy start. While there are glitches to this very day, the pain of a difficult start is well behind us. Ask the retailers: they are showing record years in video sales due entirely to H/DTV.
Your comment that the COFDM DVB-T box meets the California spec of 1 watt standby and 8 watts active is correct. It should be added that the DVB-T box does not decode 19.2 Mb/s MPEG-2 signals, nor does it have the GEMSTAR program guide, which we manage in standby.
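For scale, the two spec limits combine into a small annual energy figure. A sketch, using the 75% standby duty cycle quoted earlier in the thread (an assumption here, not part of the California spec itself):

```python
def annual_kwh(standby_watts=1.0, active_watts=8.0, standby_fraction=0.75,
               hours_per_year=8760):
    """Annual energy for a box at the California limits, split by duty cycle."""
    standby_hours = hours_per_year * standby_fraction
    active_hours = hours_per_year - standby_hours
    return (standby_hours * standby_watts + active_hours * active_watts) / 1000.0

print(f"{annual_kwh():.1f} kWh/year")  # 24.1 kWh/year
```

At those limits a compliant box is a minor line item per household; the 4% grid figure cited above comes from multiplying across tens of millions of boxes, many of them far above these limits.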
I do think that, had we the luxury of 7 and 8 MHz channels, the COFDM choice might have been attractive. But we operate, and were mandated to live, in a 6 MHz channel. I was there when COFDM was first brought forward at the IBC in Amsterdam, and I knew all of the U.S. team assigned the task of evaluating it. Some of those people have said in later years that, given the wider bandwidth such as in Europe, it would have been a good choice, but none have said that 8-VSB was a decidedly bad choice, though all were disappointed in the early iterations of the hardware. It has its trade-offs, and in the U.S. environment, where reach is a stated importance, it serves...how well
-
Bob Mankin
- Member
- Posts: 24
- Joined: Fri Mar 24, 2006 11:21 am
The COFDM vs. 8-VSB argument is decided and dead. Bob, I give you the persistence award, but that dog won't hunt. We won't be switching to COFDM in the US. Period. End of story.
I totally disagree with the idea of a subsidy for power supply design! If a vendor wants to compete in this space, then pony up the R&D and play the game like all the rest. Forget the gov't handout idea. Are these companies expecting handouts on the front end also planning to hand over their profits on the back end when they get to market? I'm guessing no. If you don't believe in capitalism and the inherent risk-taking, then sit on the sidelines and let someone else do it! But don't sit there and whine that you can't make enough money off the deal, because I can assure you that someone else will.
We're not talking about designing a space shuttle here. 3 years to get it done is plenty of time, IMO and IME. The fact that only selected vendors are complaining about the timetable should tell you something. This is not a "one time to market" thing, and if they are using this excuse as justification for some subsidy, they should be slapped!! 5 years is the average usable life expectancy for electronics, so future generations of product will be needed. Using taxpayer dollars to fund R&D now that the Corporation purely profits from later is flawed in so many ways it makes my blood boil. The $200M "admin" fee is precisely the sort of welfare situation that was sure to arise once a subsidy was even brought up. Another case of the gov't doing it all wrong. If they had let the private sector figure it out, I can assure you that 50 million STB units would get someone's attention enough that a power-efficient design that still made the company a profit would be forthcoming. The gov't should set the standard with reasonable input from the private sector and then GET THE HELL OUT OF THE WAY.
Just another demonstration of how the lobbyists manage to screw it all up and in the end the taxpayer simply pays for Corporate greed.
-
Dale
- Publisher / Author
- Posts: 259
- Joined: Wed Aug 25, 2004 4:59 pm
Here is the problem with applying free market thinking throughout this particular situation:
WHAT IF THERE IS NO MARKET?
THAT IS WHY A SUBSIDY IS IN READINESS.
The subsidy is there as a reserve in the very likely event that there is no market demand for the products that need to be installed on existing analog sets still dependent upon over-the-air signals, so that the shut-off of analog frequencies can occur without severe political "noise" or repercussion.
As far as being competitors, these companies compete all day long every day of the week with all kinds of products that we want. To label them as non-competitive seems short sighted.
The problem we anticipate is in the last phase of the transition, in the form of non-responsiveness from the poorer markets. There is a segment of over-the-air television viewers who neither care about nor want digital television services at any price. This segment will not, and often cannot, do anything about the transition on their own with their very limited funds. So they will NOT demand (the first cause of a free market) any part of this final outfitting unaided.
How do you gear up a bunch of competitors for such a non-lucrative market? You don't.