the wireless messaging news

Wireless News Aggregation

Friday — March 7, 2014 — Issue No. 596

Dear Friends of Wireless Messaging,

This is a special, mostly one-topic issue of the newsletter. The main topic is:

Quantum Computing


I have spent most of my life trying to understand enlightenment; now I am trying to understand an equally confusing term: entanglement. No, really — quantum entanglement, and quantum superposition.

Now please don't chicken out on me and say, "Oh, I'm not technical. I think I'll skip this issue."

This is a really important topic and it is generating a lot of chatter in the technical community. TIME magazine even ran a cover story about quantum computers in its Feb. 17, 2014 issue. [article follows]

You don't have to understand all of this material — I certainly do not — but if you want to be well-informed about something that might "leap-frog" computing ahead by a giant step, then please continue reading.

There are two sides to this topic. One side maintains that:

  • Several of D-Wave's quantum computers have already been purchased (at an estimated $10 million each) by important people, companies, and government entities, including:
    • Amazon founder Jeff Bezos;
    • In-Q-Tel (the investment arm of the CIA);
    • Lockheed Martin;
    • Google;
    • An unnamed U.S. intelligence agency;
    • A NASA computing lab.
  • The D-Wave Two could in theory perform 2^512 operations simultaneously. That's more calculations than there are atoms in the universe, by many orders of magnitude;
  • The D-Wave Two has been reported to be 3,600 times faster than a supercomputer.
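The arithmetic behind the 2^512 claim is easy to check with Python's arbitrary-precision integers (the figure of roughly 10^80 atoms in the observable universe is a commonly cited estimate, not from the article):

```python
# 512 qubits in superposition correspond to 2**512 basis states.
state_count = 2 ** 512

# Commonly cited estimate of atoms in the observable universe: ~10**80.
atoms_estimate = 10 ** 80

print(len(str(state_count)))           # 2**512 has 155 decimal digits
print(state_count > atoms_estimate)    # True: larger by ~74 orders of magnitude
```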

The other side claims that:

  • Quantum computers are just "smoke and mirrors;"
  • D-Wave's computers are not really quantum computers;
  • The results of scientific investigations so far have not supported most of D-Wave's claims.

So if someone asks you what you think about this, what will you say? Read on so you can form your own opinion. Both sides are reported below.


9 Ways Quantum Computing Will Change Everything

Feb. 06, 2014

Computers built on the principles of quantum physics—as opposed to 'classical' physics—promise a revolution on the order of the invention of the microprocessor or the splitting of the atom. D-Wave, a small Canadian company backed by Jeff Bezos, NASA, and the CIA among others, is the first firm to sell a so-called quantum computer—at roughly $10 million a pop. The vast increase in power could revolutionize fields as disparate as medicine, space exploration, and artificial intelligence.

In this week's TIME cover story, Lev Grossman writes that D-Wave's machines are "so radical and strange, people are still trying to figure out what it's for and how to use it." We live in the age of Big Data, burying ourselves in information—search queries, genomes, credit-card purchases, phone records, retail transactions, social media, geological surveys, climate data, surveillance videos, movie recommendations—and D-Wave just happens to be selling "a very shiny new shovel."

The company has plenty of critics, some of whom claim its machines aren't quantum computers at all. And yet, the technology could herald radical changes for the following areas, to name a few:

1. Safer airplanes —Lockheed Martin plans to use its D-Wave to test jet software that is currently too complex for classical computers.

2. Discover distant planets —Quantum computers will be able to analyze the vast amount of data collected by telescopes and seek out Earth-like planets.

3. Win elections —Campaigners will comb through reams of marketing information to best exploit individual voter preferences.

4. Boost GDP —Hyper-personalized advertising, based on quantum computation, will stimulate consumer spending.

5. Detect cancer earlier —Computational models will help determine how diseases develop.

6. Help automobiles drive themselves —Google is using a quantum computer to design software that can distinguish cars from landmarks.

7. Reduce weather-related deaths —Precision forecasting will give people more time to take cover.

8. Cut back on travel time —Sophisticated analysis of traffic patterns in the air and on the ground will forestall bottlenecks and snarls.

9. Develop more effective drugs —By mapping amino acids, for example, or analyzing DNA-sequencing data, doctors will discover and design superior drug-based treatments.

Wireless Messaging News
  • Emergency Radio Communications
  • Wireless Messaging
  • Critical Messaging
  • Telemetry
  • Paging
  • Wi-Fi


About Us

A new issue of the Wireless Messaging Newsletter is posted on the web each week. A notification goes out by e-mail to subscribers on most Fridays around noon central US time. The notification message has a link to the actual newsletter on the web. That way it doesn't fill up your incoming e-mail account.

There is no charge for subscription and there are no membership restrictions. Readers are a very select group of wireless industry professionals, and include the senior managers of many of the world's major Paging and Wireless Messaging companies. There is an even mix of operations managers, marketing people, and engineers — so I try to include items of interest to all three groups. It's all about staying up-to-date with business trends and technology.

I regularly get readers' comments, so this newsletter has become a community forum for the Paging and Wireless Messaging communities. You are welcome to contribute your ideas and opinions. Unless otherwise requested, all correspondence addressed to me is subject to publication in the newsletter and on my web site. I am very careful to protect the anonymity of those who request it.

I spend the whole week searching the Internet for news that I think may be of interest to you — so you won't have to. This newsletter is an aggregator — a service that aggregates news from other news sources. You can help our community by sharing any interesting news that you find.


Editorial Policy

Editorial Opinion pieces present only the opinions of the author. They do not necessarily reflect the views of any of our advertisers or supporters. This newsletter is independent of any trade association.


Back To Paging


Still The Most Reliable Protocol For Wireless Messaging!




If you would like to subscribe to the newsletter just fill in the blanks in the form above, and then click on the “Subscribe” bar.

There is no charge for subscription and there are no membership restrictions. It's all about staying up-to-date with business trends and technology.




Newsletter Advertising


If you are reading this, your potential customers are probably reading it as well. Please click here to find out how.


Can You Help The Newsletter?


You can help support the Wireless Messaging News by clicking on the PayPal Donate button above. It is not necessary to be a member of PayPal to use this service.

Voluntary Reader Support

Newspapers generally cost 75¢ to $1.50 a copy and they hardly ever mention paging or wireless messaging. If you receive some benefit from this publication, maybe you would like to help support it financially? A donation of $50.00 would certainly help cover a one-year paid subscription. If you are willing and able, please click on the PayPal Donate button above.


Advertiser Index

American Messaging
Critical Alert Systems
Critical Response Systems
Eagle Telecom
Easy Solutions
Hahntech USA
Hark Technologies
Ira Wiesenfeld & Associates
Leavitt Communications
Preferred Wireless
Prism Paging
Product Support Services — (PSSI)
Paging & Wireless Network Planners LLC — (Ron Mercer)
WiPath Communications

black line

Quantum Computing: A Primer

Source: TIME magazine



“The Quantum Quest for a Revolutionary Computer”

Quantum computing uses strange subatomic behavior to exponentially speed up processing. It could be a revolution, or it could be wishful thinking

By Lev Grossman
TIME magazine
Monday, Feb. 17, 2014

For years astronomers have believed that the coldest place in the universe is a massive gas cloud 5,000 light-years from Earth called the Boomerang Nebula, where the temperature hovers at around -458°F, just a whisker above absolute zero. But as it turns out, the scientists have been off by about 5,000 light-years. The coldest place in the universe is actually in a small city directly east of Vancouver called Burnaby.

Burnaby is the headquarters of a computer firm called D-Wave. Its flagship product, the D-Wave Two, of which there are five in existence, is a black box 10 ft. high. Inside is a cylindrical cooling apparatus containing a niobium computer chip that's been chilled to around 20 millikelvins, which, in case you're not used to measuring temperature in millikelvins, is about -459.6°F, almost 2° colder than the Boomerang Nebula. By comparison, interstellar space is about 80 times hotter.
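The temperature comparison can be checked with the standard kelvin-to-Fahrenheit conversion (the ~1 K figure for the Boomerang Nebula is inferred from the -458°F quoted above):

```python
def kelvin_to_fahrenheit(k):
    """Standard conversion: degrees F = K * 9/5 - 459.67."""
    return k * 9 / 5 - 459.67

# D-Wave's chip at 20 millikelvins:
print(round(kelvin_to_fahrenheit(0.020), 1))   # -459.6

# Boomerang Nebula at roughly 1 K:
print(round(kelvin_to_fahrenheit(1.0), 1))     # -457.9
```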

The D-Wave Two is an unusual computer, and D-Wave is an unusual company. It's small, just 114 people, and its location puts it well outside the swim of Silicon Valley. But its investors include the storied Menlo Park, Calif., venture-capital firm Draper Fisher Jurvetson, which funded Skype and Tesla Motors. It's also backed by famously prescient Amazon founder Jeff Bezos and an outfit called In-Q-Tel, better known as the high-tech investment arm of the CIA. Likewise, D-Wave has very few customers, but they're blue-chip: they include the defense contractor Lockheed Martin; a computing lab that's hosted by NASA and largely funded by Google; and a U.S. intelligence agency that D-Wave executives decline to name.

The reason D-Wave has so few customers is that it makes a new type of computer called a quantum computer that's so radical and strange, people are still trying to figure out what it's for and how to use it. It could represent an enormous new source of computing power—it has the potential to solve problems that would take conventional computers centuries, with revolutionary consequences for fields ranging from cryptography to nanotechnology, pharmaceuticals to artificial intelligence.

That's the theory, anyway. Some critics, many of them bearing Ph.D.s and significant academic reputations, think D-Wave's machines aren't quantum computers at all. But D-Wave's customers buy them anyway, for around $10 million a pop, because if they're the real deal they could be the biggest leap forward since the invention of the microprocessor.

In a sense, quantum computing represents the marriage of two of the great scientific undertakings of the 20th century, quantum physics and digital computing. Quantum physics arose from the shortcomings of classical physics: although it had stood for centuries as definitive, by the turn of the 20th century it was painfully apparent that there are physical phenomena that classical physics fails dismally to explain. So brilliant physicists—including Max Planck and Albert Einstein—began working out a new set of rules to cover the exceptions, specifically to describe the action of subatomic particles like photons and electrons.

Those rules turned out to be very odd. They included principles like superposition, according to which a quantum system can be in more than one state at the same time and even more than one place at the same time. Uncertainty is another one: the more precisely we know the position of a particle, the less precisely we know how fast it's traveling—we can't know both at the same time. Einstein ultimately found quantum mechanics so monstrously counterintuitive that he rejected it as either wrong or profoundly incomplete. As he famously put it, "I cannot believe that God plays dice with the world."

The modern computing era began in the 1930s, with the work of Alan Turing, but it wasn't until the 1980s that the famously eccentric Nobel laureate Richard Feynman began kicking around questions like: What would happen if we built a computer that operated under quantum rules instead of classical ones? Could it be done? And if so, how? More important, would there be any point?

It quickly became apparent that the answer to that last one was yes. Regular computers (or classical computers, as quantum snobs call them) work with information in the form of bits. Each bit can be either a 1 or a 0 at any one time. The same is true of any arbitrarily large collection of classical bits; this is pretty much the foundation of information theory and digital computing, as we know them. Therefore, if you ask a classical computer a question, it has to proceed in an orderly, linear fashion to find an answer.

Now imagine a computer that operates under quantum rules. Thanks to the principle of superposition, its bits could be 1, or 0, or 1 and 0 at the same time.

In its superposed state, a quantum bit exists as two equally probable possibilities. According to one theory, at that moment it's operating in two slightly different universes at the same time, one in which it's 1, one in which it's 0; the physicist David Deutsch once described quantum computing as "the first technology that allows useful tasks to be performed in collaboration between parallel universes." Not only is this excitingly weird, it's also incredibly useful. If a single quantum bit (or as they're inevitably called, qubits, pronounced cubits) can be in two states at the same time, it can perform two calculations at the same time. Two quantum bits could perform four simultaneous calculations; three quantum bits could perform eight; and so on. The power grows exponentially.
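The doubling described above can be made concrete: an n-qubit register in an equal superposition carries one amplitude per basis state, so the state description grows as 2^n. A minimal pure-Python sketch (the helper name is invented for illustration):

```python
from itertools import product

def uniform_superposition(n):
    """Equal superposition over n qubits: 2**n basis states,
    each with amplitude 1/sqrt(2**n)."""
    amplitude = 1 / (2 ** n) ** 0.5
    return {"".join(bits): amplitude for bits in product("01", repeat=n)}

for n in (1, 2, 3, 10):
    print(n, len(uniform_superposition(n)))    # 2, 4, 8, 1024 basis states

# Probabilities (amplitude squared) always sum to 1.
print(round(sum(a * a for a in uniform_superposition(3).values()), 10))  # 1.0
```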

The supercooled niobium chip at the heart of the D-Wave Two has 512 qubits and therefore could in theory perform 2^512 operations simultaneously. That's more calculations than there are atoms in the universe, by many orders of magnitude. "This is not just a quantitative change," says Colin Williams, D-Wave's director of business development and strategic partnerships, who has a Ph.D. in artificial intelligence and once worked as Stephen Hawking's research assistant at Cambridge. "The kind of physical effects that our machine has access to are simply not available to supercomputers, no matter how big you make them. We're tapping into the fabric of reality in a fundamentally new way, to make a kind of computer that the world has never seen."

Naturally, a lot of people want one. This is the age of Big Data, and we're burying ourselves in information—search queries, genomes, credit-card purchases, phone records, retail transactions, social media, geological surveys, climate data, surveillance videos, movie recommendations—and D-Wave just happens to be selling a very shiny new shovel. "Who knows what hedge-fund managers would do with one of these and the black-swan event that that might entail?" says Steve Jurvetson, one of the managing directors of Draper Fisher Jurvetson. "For many of the computational traders, it's an arms race."

One of the documents leaked by Edward Snowden, published last month, revealed that the NSA has an $80 million quantum-computing project suggestively code-named Penetrating Hard Targets. Here's why: much of the encryption used online is based on the fact that it can take conventional computers years to find the factors of a number that is the product of two large primes. A quantum computer could do it so fast that it would render a lot of encryption obsolete overnight. You can see why the NSA would take an interest.
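To see why factoring underpins encryption, here is a toy classical factorizer using trial division; its running time blows up as the prime factors get longer, which is exactly the asymmetry RSA-style schemes rely on (a quantum computer running Shor's algorithm would, in principle, sidestep it):

```python
def factor_semiprime(n):
    """Trial division: return the two prime factors of a semiprime n.
    Work grows with the size of the smaller factor, so lengthening the
    primes makes this approach exponentially slower."""
    p = 2
    while p * p <= n:
        if n % p == 0:
            return p, n // p
        p += 1
    return n, 1  # n itself is prime

print(factor_semiprime(21))          # (3, 7)
print(factor_semiprime(101 * 113))   # (101, 113); real RSA moduli
                                     # have hundreds of digits
```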

But while the theory behind quantum computing is reasonably clear, the actual practice is turning out to be damnably difficult. For one thing, there are sharp limits to what we know how to do with a quantum computer. Cryptography and the simulation of quantum systems are currently the most promising applications, but in many ways quantum computers are still a solution looking for the right problem. For another, they're really hard to build. To be maximally effective, qubits have to exhibit quantum behavior, not just superposition but also entanglement (when the quantum states of two or more particles become linked to one another) and quantum tunneling (just Google it). But they can do that only if they're effectively isolated from their environment—no vibrations, no electromagnetism, no heat. No information can escape: any interaction with the outer world could cause errors to creep into the calculations. This is made even harder by the fact that while they're in their isolated state, you still have to be able to control them. There are many schools of thought on how to build a qubit—D-Wave makes its qubits in the form of niobium loops, which become superconductive at ultra-low temperatures—but all quantum-computing endeavors struggle with this problem.

Since the mid-1990s, scientists have been assembling and entangling systems of a few quantum bits each, but progress has been slow. In 2010 a lab at the University of Innsbruck in Austria announced the completion of the world's first system of 14 entangled qubits. Christopher Monroe at the University of Maryland and the Joint Quantum Institute has created a 20-qubit system, which may be the world's record. Unless, of course, you're counting D-Wave.

D-Wave's co-founder and chief technology officer is a 42-year-old Canadian named Geordie Rose with big bushy eyebrows, a solid build and a genial but slightly pugnacious air—he was a competitive wrestler in college. In 1998 Rose was finishing up a Ph.D. in physics at the University of British Columbia, but he couldn't see a future for himself in academia. After taking a class on entrepreneurship, Rose identified quantum computing as a promising business opportunity. Not that he had any more of a clue than anybody else about how to build a quantum computer, but he did have a hell of a lot of self-confidence. "When you're young you feel invincible, like you can do anything," Rose says. "Like, if only those bozos would do it the way that you think, then the world would be fine. There was a little bit of that." Rose started D-Wave in 1999 with a $4,000 check from his entrepreneurship professor.

For its first five years, the company existed as a think tank focused on research. Draper Fisher Jurvetson got onboard in 2003, viewing the business as a very sexy but very long shot. "I would put it in the same bucket as SpaceX and Tesla Motors," Jurvetson says, "where even the CEO Elon Musk will tell you that failure was the most likely outcome." By then Rose was ready to go from thinking about quantum computers to trying to build them—"we switched from a patent, IP, science aggregator to an engineering company," he says. Rose wasn't interested in expensive, fragile laboratory experiments; he wanted to build machines big enough to handle significant computing tasks and cheap and robust enough to be manufactured commercially. With that in mind, he and his colleagues made an important and still controversial decision.

Up until then, most quantum computers followed something called the gate-model approach, which is roughly analogous to the way conventional computers work, if you substitute qubits for transistors. But one of the things Rose had figured out in those early years was that building a gate-model quantum computer of any useful size just wasn't practical in the foreseeable future; to date, the largest number a gate-model quantum computer has succeeded in factorizing is 21. (That isn't very hard: the factors are 3 and 7.) So D-Wave committed instead to a different architecture, the adiabatic quantum computer, which is even weirder and harder to explain.

An adiabatic quantum computer works by means of a process called quantum annealing. Its heart is a network of qubits linked together by couplings. You "program" the couplings with an algorithm that specifies certain interactions between the qubits—if this one is a 1, then that one has to be a 0, and so on. You put the qubits into a state of quantum superposition, in which they're free to explore all those 2-to-the-whatever computational possibilities simultaneously, then you allow them to settle back into a classical state and become regular 1's and 0's again. The qubits naturally seek out the lowest possible energy state consistent with the requirements you specified in your algorithm back at the very beginning. If you set it up properly, you can read your answer in the qubits' final configuration.

If that's too abstract, the usual way quantum annealing is explained is by an analogy with finding the lowest point in a mountainous landscape. A classical computer would do it like a solitary walker who slowly wandered over the whole landscape, checking the elevations at each point, one by one. A quantum computer could send multiple walkers at once swarming out across the mountains, who would then all report back at the same time. In its ability to pluck a single answer from a roiling sea of possibilities in one swift gesture, a quantum computer is not unlike a human brain.
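As a rough classical analogue of the process described above, here is a minimal simulated-annealing sketch over an Ising-style energy function (the couplings, cooling schedule, and function names are invented for illustration; D-Wave's hardware anneals quantum-mechanically rather than by random spin flips):

```python
import math
import random

def energy(spins, couplings):
    """Ising-style cost: sum of J * s_a * s_b over the programmed couplings."""
    return sum(j * spins[a] * spins[b] for (a, b), j in couplings.items())

def anneal(n, couplings, steps=20000, t_start=2.0, seed=0):
    """Cool from t_start toward zero, flipping one spin at a time."""
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n)]
    e = energy(spins, couplings)
    for step in range(steps):
        t = t_start * (1 - step / steps) + 1e-9   # linear cooling schedule
        i = rng.randrange(n)
        spins[i] = -spins[i]                      # propose a single flip
        e_new = energy(spins, couplings)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                             # accept the move
        else:
            spins[i] = -spins[i]                  # reject: undo the flip
    return spins, e

# Toy problem: three couplings that frustrate the spins against each other.
couplings = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): -1.0}
spins, e = anneal(3, couplings)
print(e)   # settles at the minimum energy, -3.0
```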

Once Rose and D-Wave had committed to the adiabatic model, they proceeded with dispatch. In 2007 D-Wave publicly demonstrated a 16-qubit adiabatic quantum computer. By 2011 it had built (and sold to Lockheed Martin) the D-Wave One, with 128 qubits. In 2013 it unveiled the 512-qubit D-Wave Two. They've been doubling the number of qubits every year, and they plan to stick to that pace while at the same time increasing the connectivity between the qubits. "It's just a matter of years before this capability becomes so powerful that anyone who does any kind of computing is going to have to take a very close look at it," says Vern Brownell, D-Wave's CEO, who earlier in his career was chief technology officer at Goldman Sachs. "We're on that cusp right now."

But we're not there yet. Adiabatic quantum computing may be technically simpler than the gate-model kind, but it comes with trade-offs. An adiabatic quantum computer can really solve only one class of problems, called discrete combinatorial optimization problems, which involve finding the best—the shortest, or the fastest, or the cheapest, or the most efficient—way of doing a given task. This narrows the scope of what you can do considerably.

For example, you can't as yet perform the kind of cryptographic wizardry the NSA was interested in, because an adiabatic quantum computer won't run the right algorithm. It's a special-purpose tool. "You take your general-purpose chip," Rose says, "and you do a bunch of inefficient stuff that generates megawatts of heat and takes forever, and you can get the answer out of it. But this thing, with a picowatt and a microsecond, does the same thing. So it's just doing something very specific, very fast, very efficiently."

This is great if you have a really hard discrete combinatorial optimization problem to solve. Not everybody does. But once you start looking for optimization problems, or at least problems that can be twisted around to look like optimization problems, you find them all over the place: in software design, tumor treatments, logistical planning, the stock market, airline schedules, the search for Earth-like planets in other solar systems, and in particular in machine learning.
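One concrete example of "twisting a problem around" into optimization form is number partitioning: split a set of numbers into two groups whose sums are as equal as possible. Small instances can be brute-forced classically, as sketched below; an annealer would search the same cost landscape all at once. (The example data are invented.)

```python
from itertools import product

def best_partition(nums):
    """Exhaustive search: assign each number to group A or B and
    minimize the squared difference of the two group sums."""
    total = sum(nums)
    best_cost, best_mask = None, None
    for mask in product((0, 1), repeat=len(nums)):
        group_a = sum(x for x, m in zip(nums, mask) if m == 1)
        cost = (total - 2 * group_a) ** 2    # (sum_B - sum_A) squared
        if best_cost is None or cost < best_cost:
            best_cost, best_mask = cost, mask
    return best_cost, best_mask

cost, mask = best_partition([3, 1, 4, 2, 2])
print(cost)   # 0: a perfect split exists, e.g. {4, 2} vs {3, 1, 2}
```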

Google and NASA, along with the Universities Space Research Association, jointly run something called the Quantum Artificial Intelligence Laboratory, or QuAIL, based at NASA Ames, which is the proud owner of a D-Wave Two. "If you're trying to do planning and scheduling for how you navigate the Curiosity rover on Mars or how you schedule the activities of astronauts on the station, these are clearly problems where a quantum computer—a computer that can optimally solve optimization problems—would be useful," says Rupak Biswas, deputy director of the Exploration Technology Directorate at NASA Ames. Google has been using its D-Wave to, among other things, write software that helps Google Glass tell the difference between when you're blinking and when you're winking.

Lockheed Martin turned out to have some optimization problems too. It produces a colossal amount of computer code, all of which has to be verified and validated for all possible scenarios, lest your F-35 spontaneously decide to reboot itself in midair. "It's very difficult to exhaustively test all of the possible conditions that can occur in the life of a system," says Ray Johnson, Lockheed Martin's chief technology officer. "Because of the ability to handle multiple conditions at one time through superposition, you're able to much more rapidly—orders of magnitude more rapidly—exhaustively test the conditions in that software." The company re-upped for a D-Wave Two last year.

Another challenge Rose and company face is that there is a small but nonzero number of academic physicists and computer scientists who think that they are partly or completely full of sh-t. Ever since D-Wave's first demo in 2007, snide humor, polite skepticism, impolite skepticism and outright debunkings have been lobbed at the company from any number of ivory towers. "There are many who in Round 1 of this started trash-talking D-Wave before they'd ever met the company," Jurvetson says. "Just the mere notion that someone is going to be building and shipping a quantum computer—they said, 'They are lying, and it's smoke and mirrors.'"

Seven years and many demos and papers later, the company isn't any less controversial. Any blog post or news story about D-Wave instantly grows a shaggy beard of vehement comments, both pro- and anti-. The critics argue that D-Wave is insufficiently transparent, that it overhypes and underperforms, that its adiabatic approach is unpromising, that its machines are no faster than classical computers and that the qubits in those machines aren't even exhibiting quantum behavior at all—they're not qubits, they're just plain old bits, and Google and the media have been sold a bill of goods. "In quantum computing, we have to be careful what we mean by 'utilizing quantum effects,'" says Monroe, the University of Maryland scientist, who's among the doubters. "This generally means that we are able to store superpositions of information in such a way that the system retains its 'fuzziness,' or quantum coherence, so that it can perform tasks that are impossible otherwise. And by that token there is no evidence that the D-Wave machine is utilizing quantum effects."

One of the closest observers of the controversy has been Scott Aaronson, an associate professor at MIT and the author of a highly influential quantum-computing blog. He remains, at best, cautious. "I'm convinced ... that interesting quantum effects are probably present in D-Wave's devices," he wrote in an e-mail. "But I'm not convinced that those effects, right now, are playing any causal role in solving any problems faster than we could solve them with a classical computer. Nor do I think there's any good argument that D-Wave's current approach, scaled up, will lead to such a speedup in the future. It might, but there's currently no good reason to think so."

Not only is it hard for laymen to understand the arguments in play, it's hard to understand why there even is an argument. Either D-Wave has unlocked fathomless oceans of computing power or it hasn't— right? But it's not that simple. D-Wave's hardware isn't powerful enough or well enough understood to show serious quantum speedup yet, and you can't just open the hood and watch the qubits do whatever they're doing. There isn't even an agreed-upon method for benchmarking a quantum computer. Last May a professor at Amherst College published the results of a bake-off she ran between a D-Wave and a conventional computer, and she concluded that the D-Wave had performed 3,600 times faster. This figure was instantly and widely quoted as evidence of D-Wave's triumph and equally instantly and widely denounced as meaningless.

Last month a team including Matthias Troyer, an internationally respected professor of computational physics at ETH Zurich, attempted to clarify things with a report based on an extensive series of tests pitting Google's D-Wave Two against classical computers solving randomly chosen problems. Verdict? To quote from the study: "We find no evidence of quantum speedup when the entire data set is considered and obtain inconclusive results when comparing subsets of instances on an instance-by-instance basis." This has, not surprisingly, generally been interpreted as a conspicuous failure for D-Wave.

But where quantum computing is concerned, there always seems to be room for disagreement. Hartmut Neven, the director of engineering who runs Google's quantum-computing project, argues that the tests weren't a failure at all— that in one class of problem, the D-Wave Two outperformed the classical computers in a way that suggests quantum effects were in play. "There you see essentially what we were after," he says. "There you see an exponentially widening gap between simulated annealing and quantum annealing ... That's great news, but so far nobody has paid attention to it." Meanwhile, two other papers published in January make the case that a) D-Wave's chip does demonstrate entanglement and b) the test used the wrong kind of problem and was therefore meaningless anyway. For now pretty much everybody at least agrees that it's impressive that a chip as radically new as D-Wave's could even achieve parity with conventional hardware.

The attitude in D-Wave's C-suite toward all this back-and-forth is, unsurprisingly, dismissive. "The people that really understand what we're doing aren't skeptical," says Brownell. Rose is equally calm about it; all that wrestling must have left him with a thick skin. "Unfortunately," he says, "like all discourse on the Internet, it tends to be driven by a small number of people that are both vocal and not necessarily the most informed." He's content to let the products prove themselves, or not. "It's fine," he says. "It's good. Science progresses by rocking the ship. Things like this are a necessary component of forward progress."

Are D-Wave's machines quantum computers? Fortunately this is one of those scenarios where an answer will in fact become apparent at some point in the next five or so years, as D-Wave punches out a couple more generations of computers and better benchmarking techniques evolve and we either do see a significant quantum speedup or we don't.

The company has a lot of ground to cover between now and then, not just in hardware but on the software side too. Generations of programmers have had decades to create a rich software ecosystem around classical microprocessors in order to wring the maximum possible amount of usefulness out of them. But an adiabatic quantum computer is a totally new proposition. "You just don't program them the way you program other things," says William Macready, D-Wave's VP of software engineering. "It's not about writing recipes or procedures. It's more about kind of describing, what does it mean to be an answer? And doing that in the right way and letting the hardware figure it out."

For now the answer is itself suspended, aptly enough, in a state of superposition, somewhere between yes and no. If the machines can do anything like what D-Wave is predicting, they won't leave many fields untouched. "I think we'll look back on the first time a quantum computer outperformed classical computing as a historic milestone," Brownell says. "It's a little grand, but we're kind of like Intel and Microsoft in 1977, at the dawn of a new computing era."

But D-Wave won't have the field to itself forever. IBM has its own quantum-computing group; Microsoft has two. There are dozens of academic laboratories busily pushing the envelope, all in pursuit of the computational equivalent of splitting the atom. While he's got only 20 qubits now, Monroe points out that the trends are good: that's up from two bits 20 years ago and four bits 10 years ago. "Soon we will cross the boundary where there is no way to model what's happening using regular computers," he says, "and that will be exciting."

Source: TIME magazine

black line

Ivy Corp Eagle Telecom

black line



black line

Critical Response Systems

black line

More than Paging.
First Responder Solutions.

Our patented technology notifies clinical personnel immediately, while tracking who receives and responds to each alarm. Users confirm or defer each event with a single button press, and analytic dashboards display response statistics in real time, as well as historically broken down by time, unit, room, and individual.

Our systems not only notify your personnel quickly and reliably, but also provide actionable feedback to fine-tune your procedures, reduce unnecessary alarms, and improve patient outcomes.

black line

black line

We have new rate plans! Check them out — our unlimited voice, text & data rate plans start at just $39/month! Or get unlimited voice and text for just $29/month. And when you share Solavei with family and friends, they’ll thank you for the savings, and you save an additional $5/month for every person you switch to Solavei. Switch to Solavei today — ask me how!

Solavei unlimited plans start at $29/mo

We’re excited to introduce our new unlimited rate plans, with nationwide voice, text & data starting at $39/month. Or get unlimited voice and text for just $29/month. You can also save even more when you share Solavei with others.


For more information contact me at or go to:

Allison Dye (Kornberger)
Telephone: 918-814-8142
Tulsa, Oklahoma

This is a commercial message from Solavei, LLC

black line

Quantum Computer

A quantum computer is a computation device that makes direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers are different from digital computers based on transistors. Wikipedia
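For readers who think in code, here is a rough classical sketch (the editor's, not from any of the articles above) of the single-qubit bookkeeping: a pair of amplitudes, a Hadamard gate that creates superposition, and Born-rule measurement. A real quantum computer cannot be simulated this way at scale; the point is only the arithmetic.

```python
import random

# A qubit is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.

def hadamard(state):
    """Send a state through a Hadamard gate, e.g. |0> to an equal
    superposition of |0> and |1>."""
    alpha, beta = state
    s = 2 ** -0.5
    return (s * (alpha + beta), s * (alpha - beta))

def measure(state):
    """Simulate measuring a fresh copy of the state (Born rule)."""
    alpha, _ = state
    return 0 if random.random() < abs(alpha) ** 2 else 1

qubit = hadamard((1.0, 0.0))       # superposition: "both 0 and 1"
counts = [0, 0]
for _ in range(10_000):            # prepare and measure many copies
    counts[measure(qubit)] += 1
print(counts)                      # roughly an even 50/50 split
```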

Related topics
A qubit is a quantum bit, the counterpart in quantum computing to the binary digit or bit of classical computing.

Quantum entanglement is the key to quantum computing, cryptography, and numerous other real-world applications of quantum mechanics.

In quantum computing, a quantum algorithm is an algorithm which runs on a realistic model of quantum computation, the most commonly used model being the quantum circuit model of computation. Wikipedia


black line


Specialists in sales and service of equipment from these leading manufacturers, as well as other two-way radio and paging products:

Unication
Bendix King

Motorola Solutions

Motorola Mobility
Philip C. Leavitt
Leavitt Communications
7508 N. Red Ledge Drive
Paradise Valley, AZ 85253
Web Site:
Mobile phone: 847-494-0000
Skype ID: pcleavitt

black line

Quantum computing and the real: Ontological implications

Posted by: Bert Olivier
Posted on: February 27, 2014

In an engrossing (pun intended) article in the most recent TIME magazine (February 17), Lev Grossman wrote about the "Infinity Machine – Quantum Leap" (pp. 28-35). A revolutionary new kind of computer is introduced to those who are willing to expand their minds in an effort to understand it. Admittedly, this is a mainstream magazine, and allowances are made for accessibility, but it is still mind-challenging stuff — when has the theoretical domain of quantum mechanics and its applications NOT been challenging?

Some people may have heard of Schrödinger's Cat — a thought experiment that illustrates what things are like in the "quantum universe", with its puzzling "superposition" (TIME, p. 30). Imagine a cat being put in a sealed box with a container of poison and a source of radiation. There is a 50/50 chance of radioactive particles escaping from the source, and if that happens, the poison container shatters and the released poison kills the cat. According to the laws of quantum mechanics, the cat is "in superposition", that is, alive and dead at the same time (which is unthinkable in everyday reality, although zombie movies come to mind), until the moment the box is opened and the cat is "observed", which settles the matter one way or the other, restoring a "classical" physical state.

This is a picturesque way of saying what Werner Heisenberg, one of the other major figures in quantum mechanics, meant when he remarked, around the middle of the 20th century, that the very fact of "observing" the world "changes it". In other words, the world, as it is independently of human beings observing it, is unimaginably different from the way it appears when we do perceive it by means of our sensory faculty, or senses, and could be — like Schrödinger's Cat — different, widely divergent things at the same time. (Apart from quantum mechanics, this is the terrain of what is today known as complexity theory.) This is so counter-intuitive that I would not blame readers for stopping right here. But carry on if you want to know what computing has to do with it.

In the article concerned, Grossman introduces one to a Canadian computer firm called D-Wave, which produces computers of which the D-Wave Two is the "flagship". There are only five of these at present, partly because they are impossibly expensive — about US$10 million each — and even more importantly, they only operate under extremely cold conditions; so cold that it is a whisker from absolute zero, to wit -273.1 degrees Centigrade. This is the temperature required by the niobium chip inside the cooling cylinder. Just for the record, this is about one degree colder than what was believed to be the coldest known place in the universe, the Boomerang Nebula, about 5,000 light years from Earth.

So what is so special about these computers, developed by Canadian physicist Geordie Rose and his company? To put it in terms simple enough for myself to understand, they differ from ordinary, "classical" computers in important ways. A "classical" computer works with "bits" (single units of information) in a binary, linear fashion — where every bit works according to the logic of either 1 or 0. The computers we know operate like this, and no matter whether you have a "supercomputer" or not, it cannot avoid working its way through masses of information along the lines of this either/or logic.

By contrast, a quantum computer operates with "qubits" according to the quantum logic of "both/and". The data of ordinary, "classical" computers are assumed to exist in relatively stable, singular or unitary states, and they process these data linearly, or one by one, however fast. But quantum computers are built to work with information existing in "multiple states", and must therefore, correspondingly, be able to perform multiple operations simultaneously, that is, not one by one in linear format.

It is not surprising that people have drawn connections between quantum theory and parallel universes — after all, if a quantum bit (qubit), in "superposed" condition, represents two or more possibilities that are equally probable, those possibilities exist, "ontologically" speaking (that is, regarding their mode of being), in different universes. For computer people, as well as for NASA, for the CIA and for the NSA (among others), the technical advantage of multiple quantum calculations is that they can be performed at the same time.

Incidentally, in mathematical terms this was the theme of Darren Aronofsky's early film, a very original neo-noir called Pi, in which the noir detective was a mathematician looking for the formula "for everything", and was predictably hounded by Wall Street types, as well as by religious fanatics — there's a resemblance between these two groups — who thought they could use this "absolute" formula for failsafe investments and for finding God, respectively. Not surprisingly, therefore, Grossman lists stock trading as one of the areas where quantum computers can solve problems that "normal" computers cannot.

The other fields where Q-computers are in demand include surveillance practices (such as those associated with the NSA), medicine, software design and problem-solving of all kinds, such as the most time-economising route among several destinations. If they work, they can probably solve problems — at least abstractly — that are far more complex than those found in these fields. The D-Wave Two is equipped with 512 qubits, and could therefore theoretically perform 2 to the 512th power operations at the same time. As Grossman reminds one, this represents more calculations than there are atoms in the physical universe.
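Python's arbitrary-precision integers make the 2-to-the-512th claim easy to check directly (editor's arithmetic; the roughly 10^80 atoms figure is the commonly cited estimate for the observable universe):

```python
# 2**512 is the number of simultaneous configurations of 512 qubits.
operations = 2 ** 512
atoms = 10 ** 80          # common estimate, observable universe

print(len(str(operations)))            # 155 digits, i.e. about 1.3e154
print(operations > atoms)              # True
print(len(str(operations // atoms)))   # 75: ~1e74 times as many
```

So "more calculations than there are atoms in the physical universe" understates it by some 74 orders of magnitude.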

What interests me about this, is the way that the quantum-theoretical underpinnings of these computing developments validate Immanuel Kant's notion of the Ding-an-sich or "thing-in-itself", which is the "noumenal" reality "behind" the "phenomenal" reality of the things we ordinarily know in space and time. Kant postulated this realm of noumena because he argued that what we know is determined in its appearance and intelligibility by our own rational faculties — sensorily and intellectually — and what these phenomena are like, outside of our knowledge of them, must forever remain a mystery: they have multiple possibilities of being. The resemblance between this and what quantum logic points to must be clear in ontological terms. (Heisenberg's remark, referred to earlier, about humans changing what is there, in the world, just by observing it, also resonates with Kant's claims.)

In the case of the ontology or theory of being suggested by Jacques Lacan's theory of the subject — which has a lot in common with Kant's ontology and epistemology or theory of knowledge — the implications are even more intriguing. Lacan presents the subject as being precariously articulated between three registers or "orders" — the real, the imaginary and the symbolic. The imaginary is the register of the ego, because of the "image" with which you identify; the symbolic is that of language as discourse (essential for the social bond), and the most puzzling of the three, the "real", points to the register of what surpasses language, what cannot be said, even when we struggle to, as when we have been traumatised, and we keep on trying to say what happened to us, but cannot do it adequately.

Unlike Kant's Ding-an-sich, which stands "behind" the phenomenon in space and time, the "real" is not "behind" everything that is named in language. Rather, it shows itself within language itself, as an "internal limit" (as Joan Copjec points out in Imagine There is No Woman), as "not-being-able-to-name-it" or say "it", or something, conclusively. We always say more, produce more words, but we can never say everything we want to in our quest for understanding the world. What Lacan's "real" therefore suggests even more powerfully than Kant's "thing-in-itself", is a multidimensional, "virtual" (in the Scholastic sense of "potential") ontological realm that is "right there", yet inaccessible, somewhere beyond the language in terms of which we express the way we understand our world. This resonates almost audibly with the quantum universe of multiple, co-existing possibilities. Besides, unless one postulates the "real" in this way as a kind of ontological "matrix", it is difficult to account for truly novel historical events.

Source: Thought Leader

black line

black line

American Messaging

black line


black line

American Messaging

black line

black line

Easy Solutions

black line

easy solutions

Easy Solutions provides cost-effective computer and wireless solutions at affordable prices. We can help in most any situation with your communications systems. We have many years of experience and a vast network of resources to support the industry, your system, and an ever-changing competitive landscape.

  • We treat our customers like family. We don't just fix problems...
    • We recommend and implement better cost effective solutions.
  • We are not just another vendor — We are a part of your team.
    • All the advantages of high priced full time employment without the cost.
  • We are not in the Technical Services business...
    • We are in the Customer Satisfaction business.

Experts in Paging Infrastructure
Glenayre, Motorola, Unipage, etc.
Excellent Service Contracts
Full Service—Beyond Factory Support
Contracts for Glenayre and other Systems starting at $100
Making systems More Reliable and MORE PROFITABLE for over 28 years.

Please see our web site for exciting solutions designed specifically for the Wireless Industry. We also maintain a diagnostic lab and provide important repair and replacement parts services for Motorola and Glenayre equipment. Call or e-mail us for more information.

Easy Solutions
3220 San Simeon Way
Plano, Texas 75023

Vaughan Bowden
Telephone: 972-898-1119

black line

Easy Solutions

black line

black line

Product Support Services, Inc.

black line


Repair and Refurbishment Services

pssi logo


Product Support Services, Inc.

511 South Royal Lane
Coppell, Texas 75019
(972) 462-3970 Ext. 261

PSSI is the industry leader in reverse logistics. Our services include depot repair, product returns management, RMA and RTV management, product audit, test, refurbishment, re-kitting, and value recovery.

black line

Classified Advertising

Want to Buy

Vocom 350 Watt UHF amplifiers
Giles Smith
GCS Electronics & Communications

For Sale

QT-250 B high-band transmitter with an analogue exciter and instruction book. Don't really need the rack. Looking for something to run a couple hundred watts on the 2 meter ham band.
John Parmalee

Want to Buy

Hark Verifier or a Hark Verifier II and Icom IC PCR 100 receiver.
Steve Suker
CVC Paging
If you have any equipment that you would like to buy or sell, please send me an e-mail and I will include it in the classified section above. If a sale is made, I ask the seller to send me a 10% commission, much the same as the voluntary payments that are requested on the Internet for shareware. There is no cost to the buyer. This is on the honor system — no contracts — just the Internet equivalent of a handshake.

black line

Leavitt Communications

black line


It's still here — the tried and true Motorola Alphamate 250. Now owned, supported, and available from Leavitt Communications. Call us for new or reconditioned units, parts, manuals, and repairs.

We also offer refurbished Alphamate 250s, Alphamate IIs, the original Alphamate, and new and refurbished pagers, pager repairs, pager parts, and accessories. We are FULL SERVICE in Paging!

E-mail Phil Leavitt ( ) for pricing and delivery information or for a list of other available paging and two-way related equipment.

black line

Phil Leavitt

leavitt logo

7508 N. Red Ledge Drive
Paradise Valley, AZ 85253

black line

D-Wave's Quantum Computing Claim Disputed Again

By Jeremy Hsu
Posted 10 Feb 2014 | 20:01 GMT

Photo: D-Wave Systems

The strongest scientific evidence for D-Wave's claim to have built commercial quantum computers just got weaker. A new paper finds that classical computing can explain the performance patterns of D-Wave's machines just as well as quantum computing can—a result that undermines crucial support for D-Wave's claim from a previous study.

Quantum computing offers the possibility of doing many calculations in parallel by using quantum bits that can exist as both a 1 and 0 at the same time, as opposed to classical computing bits that exist as only a 1 or 0. Such mind-bending quantum physics could allow quantum computers to outperform classical computers in tackling challenging problems that would take today's hardware practically forever to solve. But D-Wave's quantum computing claims remain controversial in the scientific community even as the Canadian company has attracted high-profile clients such as Lockheed Martin and Google. (See IEEE Spectrum's overview of the evidence for and against D-Wave's machines from the December 2013 issue.)

A growing number of independent researchers have examined the Canadian company's claim to have built quantum computers at scales well beyond anything seen in academic labs. Rather than build machines based on the traditional logic-gate model of computing, D-Wave built 512-qubit machines that supposedly perform quantum annealing—a method for tackling optimization problems. The best optimization solutions represent the lowest "valley" of a problem landscape resembling peaks and valleys.
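Quantum annealing itself cannot be reproduced on a laptop, but its classical cousin, simulated annealing, shows the same "roll downhill, occasionally climb out of a bad valley" idea described above. The sketch below is the editor's illustration of that classical technique; the landscape and parameters are invented for the example.

```python
import math
import random

def simulated_annealing(energy, neighbor, x0, steps=20_000, t0=2.0):
    """Classical simulated annealing: always accept downhill moves,
    accept uphill moves with a probability that shrinks as the
    temperature cools, so the walk can escape local valleys early on."""
    x, e = x0, energy(x0)
    best, best_e = x, e
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9   # linear cooling schedule
        cand = neighbor(x)
        ce = energy(cand)
        if ce < e or random.random() < math.exp((e - ce) / t):
            x, e = cand, ce
            if e < best_e:
                best, best_e = x, e
    return best, best_e

# Toy landscape with many local valleys; the global minimum is at x = 0.
energy = lambda x: x * x + 10 * (1 - math.cos(3 * x))
neighbor = lambda x: x + random.uniform(-0.5, 0.5)
print(simulated_annealing(energy, neighbor, x0=8.0))  # often lands near x = 0
```

D-Wave's pitch is that quantum tunneling lets its hardware pass through such energy barriers rather than climb over them; whether that yields a real speedup is exactly what the benchmarking dispute is about.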

One key to making quantum computing practical involves harnessing the quantum physics phenomenon of entanglement—separate qubits sharing the same quantum state—so that quantum computers can scale up effectively to tackle more complex challenges. A previous paper used simulations to suggest D-Wave's machine performance patterns did show evidence of large-scale quantum entanglement across more than one hundred qubits. But the new paper published on the arXiv preprint server came up with a classical computing model that could explain the same patterns formerly attributed to quantum annealing, according to the Physics arXiv Blog.
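The entanglement being argued over can at least be illustrated with classical bookkeeping (editor's sketch, unrelated to D-Wave's hardware): in the two-qubit Bell state, each qubit alone measures as a fair coin flip, yet the two results always agree.

```python
import random

# A two-qubit register is four amplitudes over the basis 00, 01, 10, 11.
# The Bell state (|00> + |11>)/sqrt(2) is entangled: neither qubit has a
# definite value, but their measured values are perfectly correlated.
s = 2 ** -0.5
bell = {'00': s, '01': 0.0, '10': 0.0, '11': s}

def measure_pair(state):
    """Sample a joint outcome with probability |amplitude|^2."""
    r = random.random()
    total = 0.0
    for basis, amp in state.items():
        total += abs(amp) ** 2
        if r < total:
            return basis
    return basis  # guard against float round-off

samples = [measure_pair(bell) for _ in range(1_000)]
print(all(a == b for a, b in samples))   # True: the qubits always agree
```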

The new paper drew a critical response from Geordie Rose, founder and chief technology officer of D-Wave, in which he suggested the new paper's classical model did not fit other experiments involving the performance of D-Wave machines. But Scott Aaronson, a theoretical computer scientist at MIT and D-Wave critic, said the classical model's applicability in this particular case still undermines the biggest piece of evidence for D-Wave machines' having large-scale entanglement. He spoke recently with Matthias Troyer, a computational physicist at ETH Zurich and a lead author of the previous paper (Boixo et al.), about how the new research (Shin et al.) had changed Troyer's mind.

"Most tellingly, Matthias Troyer—one of the lead authors of the Boixo et al. paper, and someone I trust as possibly the most evenhanded authority in this entire business—tells me that the Shin et al. paper caused him to change his views: he didn't know if the correlation patterns could be reproduced in any reasonable model without entanglement, and now he knows that it's possible."

When IEEE Spectrum spoke with Aaronson last year, he was willing to admit that D-Wave's machines may do quantum entanglement despite his overall skepticism toward D-Wave's endeavors. Even now, he noted that evidence still exists for D-Wave's machines performing small-scale entanglement. But he added that the evidence from the past year shows that D-Wave still has not demonstrated a speedup over classical computing. That is, D-Wave hasn't proved its machines can outperform classical computers on increasingly challenging problems.

The lack of evidence for such a speedup has not stopped Google from enthusiastically testing out its recently purchased D-Wave machine, but it does mean D-Wave still has a ways to go before it can show that quantum computing has truly arrived. In a previous IEEE Spectrum interview, Troyer pointed out how independent researchers have only gained access to D-Wave machines in the past few years—a change that has allowed them to begin testing the machines' scientific merits. The results of such scientific investigations so far have not supported most of D-Wave's claims.

[D-Wave's response to these allegations follows below.]

Source: IEEE Spectrum

black line

black line

Consulting Alliance

black line

Brad Dye, Ron Mercer, Allan Angus, Vic Jackson, and Ira Wiesenfeld are friends and colleagues who work both together and independently, on wireline and wireless communications projects. Click here left arrow for a summary of their qualifications and experience. Each one has unique abilities. We would be happy to help you with a project, and maybe save you some time and money.

black line

Consulting Alliance

black line




black line


black line


Telemetry solution

Easy Application & Better Performance


NPCS Telemetry Modem


(ReFLEX 2.7.5)






black line


black line


black line

Preferred Wireless

black line

preferred logo

Terminals & Controllers:
5  ASC1500 Parts: ATC, Memory Cards & Power Supplies
3  CNET Platinum Controllers
2  GL3100 RF Director
1  GL3000 ES — 2 Chassis
40  SkyData 8466 B Receivers
1  GL3000L Complete w/Spares
3  Zetron 2200 Terminals
1  Unipage — Many Unipage Cards & Chassis
9  Zetron M66 Transmitter Controllers
4  Glenayre Universal Exciters, 1 UHF, 3 VHF
5  Hot Standby Panels — 2 Old Style, 3 New Style
25  New and Used Cabinets & Open Racks
38  Andrews PG1N0F-0093-810 Antennas, 928-944 MHz, Omni, 10dBi, 8 Degree Down-Tilt
4  Andrews PG1D0F-0093-610 Antennas, 928-944 MHz, Omni, 10dBi, 6 Degree Down-Tilt
Link Transmitters:
1  QT-5701, 35W, UHF Link Transmitter
4  Glenayre QT4201 & 6201, 25 & 100W Midband Link TX
1  Glenayre QT6994, 150W, 900 MHz Link TX
3  Motorola 10W, 900 MHz Link TX (C35JZB6106)
2  Eagle 900 MHz Link Transmitters, 60 & 80W
8  Glenayre GL C2100 Link Repeaters
2  Motorola Q2630A, 30W, UHF Link TX
VHF Paging Transmitters:
1  Glenayre QT7505
1  Glenayre QT8505
UHF Paging Transmitters:
20  Glenayre UHF GLT5340, 125W, DSP Exciter
900 MHz Paging Transmitters:
2  Glenayre GLT8200, 25W
15  Glenayre GLT-8500, 250W
3  Glenayre GLT 8600, 500W
40  Motorola Nucleus 900 MHz 300W CNET Transmitters


Too Much To List • Call or E-Mail

Rick McMichael
Preferred Wireless, Inc.
10658 St. Charles Rock Rd.
St. Louis, MO 63074
888-429-4171 or 314-429-3000 left arrow

black line

Preferred Wireless

black line

black line

CA Partner’s Program

Providing better communications solutions to hospitals across the country — together!

For CAS, strong partnerships remain key to providing our software-based communications solutions to our customers. These solutions include:

  • nurse call systems
  • critical messaging solutions
  • mobile health applications

We provide the communication, training and resources required to become a CA partner. In turn, our partners provide customers with the highest levels of local service & support. CA Partners may come from any number of business sectors, including:

  • Service Providers
  • System Integrators
  • Value Added Resellers and Distributors
  • Expert Contractors
If you would like to hear more about our CA Partners program, we’d love to hear from you.

black line

Selected portions of the BloostonLaw Telecom Update, and/or the BloostonLaw Private Users Update — newsletters from the Law Offices of Blooston, Mordkofsky, Dickens, Duffy & Prendergast, LLP are reproduced in this section with the firm's permission.

black line

BloostonLaw Telecom Update Vol. 17, No. 9 March 5, 2014

Final Reminder – Broadband Experiment Expressions of Interest Due March 7, 2014

The FCC's March 7, 2014 date for the filing of non-binding Expressions of Interest (EOIs) in conducting experiments in price cap and rate of return areas is nearly here. BloostonLaw has prepared an EOI template, which we can further tailor with company-specific information. While EOIs may continue to be filed after the deadline, clients interested in submitting EOIs should nevertheless contact the firm without delay.


black line

Effective Date for IP Transition Order and Comment Deadline for FNPRMs Set

The final rule portion of the FCC's January 31, 2014 IP Transition Order and the FNPRMs on rural broadband experiments and numbering research appeared in the Federal Register on February 28, 2014. This sets both the effective date of the final rules (except for § 54.313(e)(1) through (3), which contain new or modified information collection requirements that will not be effective until approved by the Office of Management and Budget) and the comment deadline for the rural broadband experiments and the numbering research as March 31, 2014. Reply comments for the rural broadband experiments and the numbering research are due April 14, 2014.

Topics for comment regarding the rural broadband experiments include:

  • Budget: Because annual disbursements from the Connect America Fund to date have been less than the total $4.5 billion available and funds have accumulated in the reserve account, the FCC proposes that a limited amount of these unallocated funds be made available for experiments in any part of the country, whether served by an incumbent price cap carrier or rate-of-return carrier.
  • Application Process: The FCC proposes generally to apply the same application process and procedures adopted in the Order for the Connect America Phase II experiment to the experiments in rate-of-return areas, recognizing that it may be appropriate to adopt an implementation schedule different than that used in price cap territories.
  • Selective Criteria: The FCC proposes that (i) cost effectiveness; (ii) the extent to which the applicant proposes to build robust, scalable networks; (iii) the extent to which applicants propose innovative strategies to leverage non-Federal governmental sources of funding, such as State, local, or Tribal government funding; and (iv) whether applicants propose to offer high-capacity connectivity to Tribal lands be selective criteria. The FCC further asks for comment on how to score the final criteria.
  • Additional Criteria: The FCC also seeks comment on specific measures to implement its objective to focus on areas where end users lack Internet access that delivers 3 Mbps downstream and 768 kbps upstream. For example, the FCC asks what specific numerical measure should be used to determine whether the extent of competitive overlap is de minimis, and what measures should be taken to ensure that federal funds are focused on unserved areas.

Topics for comment regarding numbering research include appropriate budgeting and funding. For example, the FCC asks whether it should use numbering contributions associated with telephone numbering management that are used to fund the operation of numbering databases and services; what types of awards would be appropriate; and whether it should seek input from the North American Numbering Council (NANC).

Clients interested in filing comments on either FNPRM should contact the firm without delay.

FCC Announces Rural Broadband Workshop

The FCC has announced a workshop dedicated to "an examination of the broadband needs of rural populations and the unique challenges of both broadband deployment and adoption in rural areas." The workshop will be held on Wednesday, March 19, 2014, in the FCC Meeting Room (TW-C305), 445 12th Street, S.W., Washington, D.C., 20554, and will also be streamed live at .

According to the news release, the discussion will also highlight the economic, educational, and healthcare benefits that can be realized through broadband deployment and adoption; examine different business models that have been used to deploy broadband in rural areas, including a discussion of the factors that drive investment decisions and technology choices of different types of providers in rural communities; and examine the role that states have played, and can continue to play, in meeting the rural broadband challenge.

Additional details concerning the workshop agenda and panelists will be forthcoming.

AT&T IP Service Transition Trials Proposed in Florida and Alabama

Although AT&T has promoted the concept of IP trials, it has sought approval to conduct two rather limited trials for full IP transition in wire centers in rural Carbon Hill, Alabama and suburban Kings Point, Florida. (General dockets 12-353 and 13-5) Comments on the trials are due March 31 and reply comments are due on April 10. AT&T states that its objectives are to 1) identify and resolve operational, technical, logistical and other issues that could arise when existing TDM-based networks are discontinued; 2) help AT&T develop and implement processes for migrating customers off TDM networks and services; 3) ensure that customers, manufacturers and other stakeholders have "sufficient education and notice regarding the impending transition so that they also have the opportunity to prepare for the time when TDM networks and services no longer are available"; and 4) "come out of the trials with an actionable plan that we can utilize to continue this transition in our approximately 4,700 wire centers and across the country in order to meet our stated goal of completing the IP Transition by the end of 2020."

In the trial wire centers, AT&T proposes to offer subscribers a choice of either U-verse Voice-over-IP (for those subscribers already within AT&T's existing footprint for this service), Wireless Home Phone, Wireless Home Phone and Internet with 4G LTE Broadband service, or 4G LTE. AT&T appears to contemplate that some customers will continue to have no option other than POTS.

It appears that Wireless Home Phone and Wireless Home Phone and Internet with 4G LTE are similar to and have the same problems as Verizon's Voice Link service, which was proposed for parts of New York and New Jersey affected by Superstorm Sandy. According to AT&T, Wireless Home Phone and Wireless Home Phone and Internet with 4G LTE, which are CMRS services, "comply with the Commission's existing 911 requirements for CMRS, and do not provide E-911 with street address." The services also do not "currently support alarm monitoring, medical alert and credit card validation applications." Although AT&T states that it is currently "developing enhancements that will provide all of these applications," it has filed as confidential the date when it plans to introduce these enhancements. AT&T states that it will not "seek to grandfather its TDM-based voice services until these enhancements are available." The services also are not compatible with fax machines and dial-up Internet service, which, apparently, will not be rectified.

In the Carbon Hill wire center proposed for the trial, AT&T proposes to offer 96% of living units (which include residential and business locations, occupied or unoccupied and those under construction) a choice of either U-verse Voice-over-IP, a wireline IP service, Wireless Home Phone, or 4G LTE. AT&T states that it "has not yet found a viable replacement service for the remaining four percent of locations, and it is still considering its options for those living units." Although AT&T submitted its filing as confidential and redacted the percentage of subscribers that would be offered the various services, TRDaily has reported that approximately 41% of subscribers would have a choice of either U-verse Voice-over-IP or Wireless Home Phone and 55% would only have a wireless 4G LTE option.

In the Kings Point, Florida, wire center selected for the trial, identified as a wire center in the West Palm Beach metropolitan area, AT&T will offer wireless broadband services to 100% of living units. AT&T redacts the percentage of living units that will have wireline broadband service, which, apparently, will be less than 100%. It appears that AT&T selected this wire center for a trial because more than 70% of the population is over 50 years of age. TRDaily also reports that a significant number of households are in a community where the homeowner's association has an exclusive deal for video and broadband services with Comcast Corp.

AT&T also states that some of its existing DSL customers reside at locations that cannot be reached by its U-verse or IP DSLAM network, and some of these customers in Carbon Hill and Kings Point will only have access to a wireless broadband service.

With respect to wholesale services, AT&T states that non-affiliated carriers currently are purchasing wholesale services in both test wire centers. AT&T states that it has identified the available replacement products to legacy TDM services used by wholesale customers and it "will provide customers who choose to do so the opportunity to transition to those alternatives in this initial phase of the trial." AT&T states that it will continue to meet its wholesale obligations under Section 251(c) of the Act, "including making UNEs available through the current state of the trial." Wholesale customers also "will have the opportunity to obtain bare copper loops and utilize their own electronics to provide high capacity services to their end user customers." AT&T states that it is developing IP replacement services, "which it intends to make available for resale to wholesale customers on commercial terms."

With respect to interconnection, AT&T states that the status quo will be maintained. However, AT&T states that the exchange of traffic for customers subscribing to IP replacement service will entail differences in call routing, which could mean a change in routing through AT&T's access tandem instead of the direct end office trunking that had been established at the AT&T ILEC central office for calls terminating to AT&T's TDM customers. AT&T claims that there should be no material cost impact on interconnecting carriers attributable to this phase of the trials, and that as the transition proceeds, "carriers also will likely experience cost savings as they eliminate existing direct end office trunking arrangements that no longer would be necessary to reach TDM customers."

With respect to intercarrier compensation, AT&T states that it will maintain the status quo, including the transition to bill-and-keep. AT&T further states that VoIP and Wireless Home Phone services "are and will remain subject to the existing intercarrier compensation regimes for VoIP-PSTN or CMRS traffic, as appropriate." Therefore, to the extent AT&T's trial forces some customers to wireless service or they voluntarily switch to wireless service, "compensation for terminating calls to that customer would be the compensation regime applicable to CMRS, rather than the wireline compensation regime."

If the trials are approved, AT&T will move on to Phase 1 of the proposed trial, in which AT&T would seek authority to limit new service orders to wireless and IP-based services. Existing POTS customers would be "grandfathered" and allowed to continue receiving their traditional POTS offering. In Phase 2, AT&T would seek authority to transition existing POTS customers off of that service.

Auction of H-Block Licenses Closes; DISH Network Big Winner

The FCC's auction of H Block spectrum officially ended on Thursday afternoon, with total bids of $1.564 billion – exactly the reserve price – and just one bidder. DISH Network was confirmed as the winner of all 176 U.S. Economic Area (EA) licenses that were available for bidding in Auction 96. These funds will go toward the construction of the nationwide public safety broadband network known as FirstNet. According to bidding results available on the FCC's Integrated Spectrum Auction System, the three most expensive licenses covered New York City ($217 million), Los Angeles ($167 million), and Chicago ($96 million).

DISH can use the spectrum to launch its own wireless network, either on its own, or by partnering with an incumbent service provider(s). Alternatively, DISH could lease its spectrum to wireless carriers. Wireless devices will need to be capable of operating on the H-Block channels first, however, so it may be some time before the H-Block spectrum is actually in use.

The H Block is a 10 MHz block of paired airwaves that runs from 1915-1920 MHz (for the uplink) and from 1995-2000 MHz (for the downlink). DISH already controls 40 MHz of spectrum adjacent to a portion of the H Block, known as the AWS-4 band, which runs from 2000-2020 MHz (for the uplink) and 2180-2200 MHz (for the downlink). Last year, however, DISH asked the FCC to allow it to use the 2000-2020 MHz band for downlink operations instead of uplink as a condition for agreeing to bid the reserve price. Sprint was also on record with the FCC as possibly wanting to bid for the H Block in order to pair this spectrum with the PCS G Block to create a 2 x 10 MHz nationwide LTE network. DISH and Sprint have spent the past two years battling over the power limits and interference protections that the FCC should adopt for the downlink portion of the H Block, a dispute that was complicated by DISH chairman Charlie Ergen's last-ditch attempt to acquire Sprint and its affiliate Clearwire. Sprint was ultimately purchased by Japan's SoftBank for $21.6 billion in a transaction that closed last fall.
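The band plan described above is easy to sanity-check with a few lines of arithmetic. Here is a minimal Python sketch; the band edges come from the text, and the variable and function names are our own:

```python
# Band edges in MHz, as described above.
H_BLOCK_UPLINK = (1915, 1920)    # H Block paired uplink
H_BLOCK_DOWNLINK = (1995, 2000)  # H Block paired downlink
AWS4_LOWER = (2000, 2020)        # DISH's AWS-4 band, lower segment
AWS4_UPPER = (2180, 2200)        # DISH's AWS-4 band, upper segment

def width(band):
    """Bandwidth of a (low, high) segment in MHz."""
    low, high = band
    return high - low

# The H Block totals 10 MHz: 5 MHz uplink + 5 MHz downlink.
total_h_block = width(H_BLOCK_UPLINK) + width(H_BLOCK_DOWNLINK)
print(total_h_block)  # 10

# AWS-4 totals 40 MHz, and its lower segment sits directly
# adjacent to the H Block downlink (they meet at 2000 MHz).
total_aws4 = width(AWS4_LOWER) + width(AWS4_UPPER)
print(total_aws4)  # 40
print(H_BLOCK_DOWNLINK[1] == AWS4_LOWER[0])  # True
```

That adjacency at 2000 MHz is the source of the DISH/Sprint dispute over downlink power limits and interference protections.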

In a prepared statement, FCC chairman Tom Wheeler said: "With this successful auction, the Commission makes good on its commitment to unleash more spectrum for consumers and businesses, delivering a significant down payment towards funding the nationwide interoperable public safety network. The H Block auction is a win for the American people, and we thank Chairwoman Clyburn for her leadership scheduling it. We also commend everyone who worked so hard to resolve technical issues that made this previously unusable spectrum valuable."

FCC Cancels Newsroom Study Entirely

An FCC spokesperson announced on Friday that the FCC will not move forward with the Critical Information Needs study. The FCC's spokesperson indicated that the FCC plans to reassess the best way to fulfill its obligation to Congress to identify barriers to entry into the communications marketplace faced by entrepreneurs and other small businesses (the stated original purpose of the study).

As we reported in the February 26 Edition of the BloostonLaw Telecom Update, FCC Commissioner Ajit Pai authored an opinion piece in the Wall Street Journal criticizing the FCC's study which, in part, would "send researchers to grill reporters, editors and station owners about how they decide which stories to run."

Of the decision, Commissioner Pai said, "I am pleased that the FCC has canceled its Critical Information Needs study. In our country, the government does not tell the people what information they need. Instead, news outlets and the American public decide that for themselves. I look forward to working with my colleagues to identify and remove actual barriers to entry into the communications industry. This newsroom study was a distraction from that important goal."

Law & Regulation

black line

Lifeline Notice of Apparent Liability: $8,300 Overpayment becomes Proposed $3.7 million Civil Fine

On February 28, 2014, the FCC issued a Notice of Apparent Liability for Forfeiture (NALF) in the amount of $3,719,900 against Budget PrePay, Inc. d/b/a Budget Mobile (Budget) for alleged violations of the Lifeline Rules.
According to the NALF, the FCC determined that Budget had apparently willfully and repeatedly violated Rule Sections 54.407, 54.409, and 54.410 "by requesting and/or receiving support from the Lifeline program … for ineligible subscriber lines for the months of February through April 2013." As determined by a USAC compliance audit (i.e., an "in-depth data validation" or "IDV"), the over-payments over the three-month period totaled $8,300, spread among 691 individual duplicate lines for which Budget improperly sought Lifeline reimbursement, as reflected on the twelve FCC Forms 497 that were submitted.

Budget is a Louisiana corporation designated as an ETC in the states of Arkansas, Kentucky, Louisiana, Maryland, Michigan, Nevada, Rhode Island, South Carolina and Wisconsin, and the USAC audit encompassed Budget's Lifeline activities in all of these states.

For the violations at issue, Section 503(b)(2)(B) of the Communications Act of 1934, as amended, authorizes the FCC to assess a forfeiture against a telecommunications carrier of up to $150,000 for each violation or each day of a continuing violation, up to a maximum of $1.5 million for a single act or failure to act. In calculating the proposed monetary forfeiture, the FCC stated as a general proposition that it "believes that the imposition of a significant forfeiture amount is a necessary response to Lifeline over-collection violations;" and that "imposing a significant forfeiture on such rule violators should deter those service providers that fail to devote sufficient resources to ferreting out company practices resulting in over-collection violations," and further stated that "a significant forfeiture should achieve broader industry compliance with Lifeline rules that are critically important to the effective functioning of the [Universal Service Fund]."

The FCC noted that it has implemented a three-part forfeiture framework for Lifeline over-collection violations that imposes: (1) a $20,000 base forfeiture for each instance in which an ETC files a Form 497 that includes ineligible subscribers in the line count; (2) a base forfeiture of $5,000 for each ineligible subscriber for whom the ETC requests and/or receives support from the fund in violation of the above-mentioned rule sections; and (3) an upward adjustment of the base forfeiture equal to three times the reimbursements requested and/or received by the ETC for ineligible subscribers. Application of these standards yielded the proposed $3,719,900 forfeiture.
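The proposed forfeiture can be reproduced directly from the figures in the NALF (12 Forms 497, 691 ineligible lines, and $8,300 in over-payments). A minimal sketch of the FCC's three-part framework; the variable names are ours:

```python
# Inputs from the Budget Mobile NALF discussed above.
forms_497_filed = 12     # Forms 497 that included ineligible lines
ineligible_lines = 691   # duplicate subscriber lines
overpayment = 8_300      # dollars improperly reimbursed

# The FCC's three-part forfeiture framework:
part1 = 20_000 * forms_497_filed   # per improper Form 497 filing
part2 = 5_000 * ineligible_lines   # per ineligible subscriber
part3 = 3 * overpayment            # treble the over-collection

total = part1 + part2 + part3
print(total)  # 3719900 — the proposed $3,719,900 forfeiture
```

Note the leverage in the framework: an $8,300 over-collection produced a proposed fine roughly 450 times larger, driven almost entirely by the $5,000-per-line component.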

FCC Proposes to Fine AT&T $25,000 for Installing Whip Antenna on Rooftop

The FCC has proposed to fine AT&T $25,000 for installing a 20-foot whip antenna on top of a building-mounted antenna structure without first obtaining clearance from the FAA and an antenna structure registration from the FCC. While it is commonplace for telecom companies and private users of radio to install their antennas on rooftop structures, this case demonstrates that merely mounting an antenna can inadvertently cause an FCC violation if the structure was already just below the height triggering FAA clearance and the antenna put it over.
The FCC became aware of the whip antenna in response to a complaint from the Los Angeles Police Department concerning an unlit tower. In this case, the overall vertical height of the building (including the unmarked and unlit antenna structure and whip antenna) was 208 feet. AT&T admitted that the whip antenna had been installed in November 2012 without receiving prior approval from the FAA.

AT&T's actions resulted in two violations: (a) failure to maintain proper marking and lighting since installation of the whip antenna would have required the tower and perhaps the building to have been lighted since the overall height exceeded 200 feet above ground and (b) failure to register the antenna structure since notice to the FAA was required.

While the FCC notes that the base amount of the proposed fine should be $13,000, it is proposing a $25,000 fine because the FCC wants to ensure that forfeiture liability is a deterrent and not simply a cost of doing business for large multi-billion dollar companies such as AT&T.

Any antenna structure must undergo FAA clearance and FCC Antenna Structure Registration (ASR) if its height exceeds 200 feet above ground level, or if it is close enough to an airport or heliport to violate the landing strip's "glide slope". Clients mounting their antennas on a rooftop must determine whether the antenna will increase the structure height above the limits, along with other regulatory compliance measures such as making sure their antenna will not cause the structure to violate RF radiation limits, or trigger warning sign/restricted access requirements.
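The 200-foot trigger described above lends itself to a simple screen. This is only a rough sketch of the height test (the function name is ours, and the airport glide-slope test, which depends on distance to nearby runways, is deliberately not modeled):

```python
def asr_likely_required(structure_height_ft: float,
                        antenna_height_ft: float) -> bool:
    """Rough screen for the 200-foot FAA/ASR height trigger.

    Returns True when the rooftop structure plus the mounted antenna
    exceeds 200 feet above ground level.  The proximity-to-airport
    glide-slope test also applies in practice but is not modeled here.
    """
    return structure_height_ft + antenna_height_ft > 200

# The AT&T case: a structure just under the limit plus a 20-foot whip
# pushed the overall height to 208 feet.
print(asr_likely_required(188, 20))  # True  (208 ft total)
print(asr_likely_required(188, 0))   # False (structure alone is under 200 ft)
```

A screen like this only flags the need for further review; actual compliance requires an FAA aeronautical study and, where required, FCC antenna structure registration.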

FCC Proposes Over $1.9 Million In Penalties For Misuse Of Emergency Alert Warnings

Earlier this week, the FCC proposed fines against Viacom, ESPN, and NBC Universal in excess of $1.9 million for repeatedly transmitting an advertisement that misused the warning sounds of the nationwide Emergency Alert System (EAS) in March 2013.

EAS is a national public warning system that requires broadcasters, cable television operators, wireless cable operators, wireline video service providers, satellite digital audio radio service providers, and direct broadcast satellite providers to make it possible for the President of the United States to address the American public in the event of an emergency. The system is also used by federal, state, and local officials to deliver important emergency information, such as Amber Alerts and weather alerts (tornado warnings, flood watches and warnings, etc.), targeted to specific geographic areas. Because of abuses involving the old Emergency Broadcast System, in which alert tones had been simulated for other purposes, the FCC adopted strict rules in 1994 prohibiting the improper use or simulation of EAS alert tones. The goal is to ensure that EAS alert tones are taken seriously by the public and remain able to signal an emergency message that may require prompt or immediate attention. As a result, EAS codes or the Attention Signal may be broadcast only in the event of an emergency or to test the EAS system. The routine use of EAS alert tones for other purposes would relegate them to mere background noise, reducing their ability to alert the public to an emergency.

Like most cases involving EAS violations, the FCC's investigation was made in response to viewer and consumer complaints concerning the broadcast of the Olympus Has Fallen movie trailer on various networks. In response to Letters of Inquiry from the Enforcement Bureau, the Companies each admitted that the commercial appeared multiple times on multiple national and regional networks under their control, and that the commercial used actual EAS codes and the Attention Signal to advertise the film. The networks generally claimed in one fashion or another that they did not realize the broadcast of the movie trailer would violate the FCC's rules, and in each instance the network had pre-screened the movie trailer to make sure it did not raise any "red flags" or otherwise violate its standards for broadcast. Likewise, the FCC did not accept the argument that the networks merely transmitted content that had been received from another source.

In this regard, it appears that the FCC's focus for enforcement was the various networks rather than the individual broadcasters, cable companies or MVPDs. [...] is the originator of the content – meaning where the content is placed into the broadcast stream – rather than the local cable operator or distributor for content that is broadcast over national and/or regional networks. For content originated on local cable or MVPD channels, the FCC would likely look to the local cable or MVPD provider. As a result, those of our clients who receive local content for broadcast should review it prior to broadcast in order to ensure that it does not contain any material, including EAS codes or the Attention Signal, which would violate the FCC's Rules.

[ Brad's note: MVPD is a multichannel video programming distributor — a service provider delivering video programming services.]

As a result of the investigation, the FCC issued an omnibus Notice of Apparent Liability for a total of $1,930,000 to the Companies. Seven Viacom-owned networks transmitted the advertisement a total of 108 times over five days, resulting in a proposed forfeiture of $1,120,000. Three ESPN-owned networks transmitted the advertisement a total of 13 times over four days, resulting in a proposed forfeiture of $280,000. Finally, seven NBC Universal-owned cable networks transmitted the advertisement a total of 38 times over a span of six days, resulting in a proposed forfeiture of $530,000.


black line

U.S. Sues Sprint Over Wiretapping Overcharges

A number of news sources are reporting that the government of the United States has filed a lawsuit against Sprint Corp. alleging that the company over-billed the FBI and other law enforcement agencies to the tune of $21 million for costs associated with assisting in court-ordered wiretaps.

The complaint states that Sprint "knowingly included in its intercept charges the costs of financing modifications to equipment, facilities, and services" associated with maintaining compliance with the Communications Assistance for Law Enforcement Act (CALEA), the 1994 law that requires telecommunications companies to be capable of performing wiretaps for the government. The government argues that while CALEA provides that companies may charge for "reasonable expenses" incurred in the actual performance of a wiretap, they may not charge for upgrading equipment to maintain compliance over time.

The complaint continues, "Because Sprint's invoices for intercept charges did not identify the particular expenses for which it sought reimbursement, federal law enforcement agencies were unable to detect that Sprint was requesting reimbursement of these unallowable costs" – an issue more commonly known as cramming.

Sprint denied the allegations. "Under the law, the government is required to reimburse Sprint for its reasonable costs incurred when assisting law enforcement agencies with electronic surveillance," said a spokesman for the company. "The invoices Sprint has submitted to the government fully comply with the law. We have fully cooperated with this investigation and intend to defend this matter vigorously."

Calendar At-A-Glance

black line



black line

Mar. 7 – Initial expressions of interest in rural broadband experiments are due.
Mar. 7 – Reply comments on NECA 2014 Average Schedule Formulas are due.
Mar. 10 – Oppositions to Petitions to Deny T-Mobile/Verizon Spectrum Sale are due.
Mar. 10 – Electronic filing deadline for Form 497 for carriers seeking support for the preceding month and wishing to receive reimbursement by month's end.
Mar. 11 – Replies to Oppositions to Petitions for Reconsideration on Rural Call Completion Order are due.
Mar. 17 – Reply comments are due on Use of Mobile Wireless Devices on Airborne Aircraft.
Mar. 17 – Replies to Oppositions to Petitions to Deny T-Mobile/Verizon Spectrum Sale are due.
Mar. 31 – FCC Form 525 (Delayed Phase-down CETC Line Counts) is due.
Mar. 31 – FCC Form 508 (ICLS Projected Annual Common Line Requirement) is due.
Mar. 31 – Comments on FCC Process Reform Report are due.
Mar. 31 – Comments are due on Rural Broadband Experiments and Numbering Research.
Mar. 31 – Comments are due on AT&T Wire Center Trials Proposal.


black line

Apr. 1 – FCC Form 499-A (Telecommunications Reporting Worksheet) is due.
Apr. 1 – Annual Accessibility Certification is due.
Apr. 1 – PRA comments on Form 477 (Local Telephone Competition and Broadband Reporting) are due.
Apr. 10 – Reply comments are due on AT&T Wire Center Trials Proposal.
Apr. 14 – Reply comments are due on Rural Broadband Experiments and Numbering Research.

BloostonLaw Private Users Update Vol. 15, No. 2 February 2014

FCC Acts to Help Emergency Responders Locate Wireless 911 Callers

The FCC has proposed rules that will help emergency first responders locate wireless callers to 911. These proposed rules will update the Commission's Enhanced 911 (E911) rules in order to respond to the increasing use of wireless phones to call 911 – especially from indoor locations where traditional wireline phones may not be available or otherwise used.

Under the FCC's current rules, wireless providers are required to automatically transmit location information to 911 call centers (PSAPs) within a certain degree of accuracy. The current rules were adopted in 1999, last updated in 2010, and allow wireless carriers to meet the accuracy standard based only on the performance of outdoor wireless calls. Because many Americans have replaced their landline phones with wireless devices, the FCC's new rules would require location accuracy sufficient to identify the building for most indoor calls.

The FCC noted that in California, 73 percent of 911 calls originate from wireless phones, and that 80 percent of all smartphone use occurs indoors. Additionally, in multi-story structures such as office and apartment buildings, first responders frequently cannot locate the caller easily because they are unable to determine the floor, or even the building, from which the call originated.

In the near term, the FCC is proposing that wireless providers meet interim location accuracy metrics that would at least identify the building for most indoor calls and the floor from which the call originated. For the long-term, the FCC seeks to develop a more granular indoor location accuracy standard that would require the identification of a specific room, office, or apartment from which a 911 call is made.

The Commission has not yet set a comment cycle for its proposals.

FCC Issues Software Tool to Help Protect AM Stations from Interference by Other Towers

The Commission's rules providing a single protective scheme for the construction and modification of antenna towers near AM tower arrays became effective on February 20, 2014. Under these rules, which were originally adopted in August 2013, the FCC has designated "Moment Method" computer modeling as the principal means for determining whether a nearby tower will affect an AM radiation pattern. Our clients constructing antenna structures, or modifying existing structures, must make sure they are not inadvertently modifying an AM station's radiation pattern by doing so.

In order to facilitate compliance with these rules, the FCC has developed an AM Tower Tool that will allow parties proposing the construction or modification of towers to input their proposed location into the tool. The AM Tower Tool will then determine if there are nearby operating AM stations that could be affected by the proposed construction or modification of the tower. In addition to advising the proponent, the AM Tower Tool will also notify the proponents of proposed AM stations within the coordination distances that are authorized, but not yet operating. The FCC's AM Tower Tool can be found at .

Deadline for Signal Booster Compliance Extended to April 30, 2014

The FCC has extended, until April 30, 2014, the deadline for all consumer signal boosters that are marketed, sold or distributed in the United States to comply with the Commission's new technical standards for Consumer Signal Boosters. The 60-day delay, from March 1, 2014 until April 30, 2014, was necessary so that the FCC would have sufficient time to certify Consumer Signal Boosters under its new rules and thereby provide consumers with adequate choices among compliant Consumer Signal Boosters.

Regardless of this extension, consumers will be permitted to operate legacy signal booster equipment if (a) they have the consent of the wireless service provider whose signal is being extended by the booster, and (b) the signal booster is registered with that provider. Wireless service providers will be permitted to shut down any signal booster that causes harmful interference to their operations or harms network performance.

The FCC extended the deadline because unexpected complexities in its rules, coupled with the government shutdown in October 2013, led to delays in finalizing test procedures for Consumer Signal Boosters. As a result, signal booster manufacturers could not finalize and submit equipment certification applications until those procedures were in place. The 60-day extension will allow for the review and testing of these devices so that they can be properly certified and offered for sale. Additionally, the FCC determined that making more compliant signal booster equipment available would reduce the use of legacy equipment.

FCC Denial of Construction Notice Waiver Request Demonstrates Importance of Meeting FCC Deadlines

Spartanburg County, South Carolina (Spartanburg) requested a waiver of the FCC's Rules to permit the late filing of a construction notification after the FCC had terminated the authorization through its "Term-Pending" process. The basis for Spartanburg's request was that, while it had timely constructed its facility, it had inadvertently overlooked filing the required construction notification. Unfortunately, because Spartanburg neither timely filed its construction notification nor sought reconsideration of the FCC's proposal to terminate its operational authority, the FCC denied Spartanburg's late-filed construction notification and deemed Spartanburg's operating authority terminated.

On November 28, 2012, the FCC issued a Public Notice notifying Spartanburg that the frequency 807.4625 MHz had been placed in a Termination Pending status due to non-construction. The Public Notice advised Spartanburg that it had 30 days (or until December 28, 2012) within which to file a petition for reconsideration demonstrating that it had timely constructed its facility. Spartanburg did not file a petition for reconsideration as required by the Public Notice. Having missed this deadline, Spartanburg instead requested a waiver of the FCC's Rules on January 30, 2013, pointing out that it had timely constructed its facility on the frequency before the October 25, 2012 deadline.

In reviewing Spartanburg's waiver request, the FCC noted that Spartanburg did not demonstrate why it failed to make its construction notification filing in a timely manner or to file the required Petition for Reconsideration. As a result, the FCC concluded that Spartanburg's authorization to operate the 807.4625 MHz transmitter automatically terminated.

GE Fluorescent Lighting Ballasts Cause Harmful Interference to Verizon Wireless' 700 MHz LTE Cell Site – Results in Citation to Building Owner

The FCC has cited the owner of the Ernst & Young Plaza building in Los Angeles, California, for violations of the FCC's Rules involving the use of GE fluorescent light ballasts that are causing interference to Verizon Wireless' nearby 700 MHz LTE cell site.

Upon receiving complaints from Verizon Wireless, the FCC's Enforcement Bureau notified the Property Manager that the GE light ballasts were causing interference to the Verizon Wireless 700 MHz LTE cell site. At the time of the visit, the FCC provided the Property Manager with a copy of a GE Product Bulletin that indicated that certain GE UltraMax ballasts could "produce unintentionally high-frequency radio emissions that have the potential to cause interference with certain types of radio communications." The Property Manager was directed to investigate the matter and provide an interim report within 30 days and a final report within 60 days. Almost seven months later, because the interference had not been addressed, the FCC investigated the matter further and determined that the light ballasts in use at the Ernst & Young Plaza were covered by the GE Product Bulletin.

Because the light ballasts are unintentional radiators of radio frequencies, they are regulated by the FCC. Light ballasts are regulated by the FCC as industrial, scientific or medical equipment (ISM) and are not permitted to cause harmful interference to any authorized radio service. In the event that harmful interference occurs, the FCC's rules require the operator/user of ISM equipment to promptly take whatever steps may be necessary (including the termination of operations) to eliminate the interference.

In the event that an ISM device causes harmful interference, the operator (or manufacturer of the ISM device) is required to investigate the interference claim and provide an interim report to the FCC's field office within 30 days of being notified of the harmful interference, followed by a final report 30 days later. Here, it appears that the Property Manager neither resolved the interference issue nor provided the FCC with the required reports, despite being directed to do so almost a year ago.

The FCC's decision to issue a citation puts the FCC in a position to fine the property owner if it remains uncooperative. The Commission does not have to issue a separate citation before fining entities already regulated by the FCC (e.g., licensees).

Tower Owner Draws $10,000 Fine For Not Having Lights On During Daylight

The FCC continues to issue fines for tower lighting violations. On February 21, 2014, the FCC issued a $10,000 Notice of Apparent Liability for Forfeiture against Ohana Media Group, LLC, the owner of a 96-meter antenna tower in Anchorage, Alaska. The proposed fine is due to Ohana's failure to display the required flashing white light at the top of the tower during daytime hours. Ohana elected to use daytime white strobe lighting so that it could avoid painting the tower; that lighting is critical during the day so that the unpainted tower remains conspicuous to aircraft pilots.

In the course of its investigation, the FCC determined not only that the top-mounted white light had failed, but also that Ohana Media failed to report the light outage to the FAA within 30 minutes, as required by the FCC's Rules.

For those of our clients who own antenna towers with obstruction marking and lighting, it is extremely important that you verify the operation of the obstruction lighting daily. Should there be a failure that cannot be corrected within 30 minutes, it is important that the FAA be promptly notified so that a Notice to Airmen (NOTAM) can be issued. You should note that NOTAMs are of limited duration and must be renewed if the outage is not fixed within the required time frame.

FCC Fines Directlink $20,000 for Operating a Transmitter without a License and in Violation of Part 15

The FCC has issued a $20,000 fine against Directlink for operating a transmitter without a license and in a manner that violated Part 15 of the FCC's Rules, which allow the operation of certain unlicensed devices.

In response to an interference complaint from the FAA, the FCC used direction-finding equipment to determine that interference on the frequency 5630 MHz was coming from a U-NII system operated by Directlink, LLC. The FCC's investigation determined that Directlink's device was authorized to operate within a frequency range of 5745 to 5825 MHz, and that it was improperly operating out of the authorized range on a center frequency of 5630 MHz. Once Directlink adjusted the device's operating frequency from 5630 MHz to 5785 MHz, the interference to the FAA's Denver Terminal Doppler Weather Radar (TDWR) was resolved.

In order to prevent the potential for interference to the FAA's TDWR installations, the FCC requires operators of U-NII devices in the 5.25 – 5.35 GHz and 5.47 – 5.725 GHz bands to have Dynamic Frequency Selection (DFS) radar detection functionality, which allows the device to detect radar systems and prevent co-channel operations with radar systems. During the FCC's inspection, Directlink advised that it was not operating with the required DFS functionality.
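The DFS requirement turns on which band a U-NII device occupies. Here is a minimal sketch of that band test using the frequencies in the story; the function name is ours, and for simplicity it checks only the center frequency, ignoring channel bandwidth:

```python
# Bands (in MHz) where the FCC requires DFS radar detection for
# U-NII devices, per the rules described above.
DFS_BANDS_MHZ = [(5250, 5350), (5470, 5725)]

def dfs_required(center_mhz: float) -> bool:
    """True if a U-NII center frequency falls in a DFS-required band.

    Simplified: checks the center frequency only, not the full
    occupied channel bandwidth.
    """
    return any(low <= center_mhz <= high for low, high in DFS_BANDS_MHZ)

# Directlink's unauthorized operation vs. its corrected frequency.
print(dfs_required(5630))  # True  — inside 5470-5725 MHz, DFS mandatory
print(dfs_required(5785))  # False — within its authorized 5745-5825 MHz range
```

This is exactly why Directlink's 5630 MHz operation interfered with the TDWR: it sat in a DFS-mandatory band without the required radar detection, while the corrected 5785 MHz channel does not.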

In originally proposing a $25,000 fine in the NALF, the FCC noted that the base fine for both violations is $15,000. However, the FCC applied an upward adjustment of $10,000 due to the circumstances and the public safety risks posed by Directlink's operation of an unauthorized system that created interference to the FAA's TDWR radar system at Denver International Airport. Nonetheless, because Directlink demonstrated a good history of regulatory compliance, the FCC reduced the fine by $5,000, to $20,000.

This newsletter is not intended to provide legal advice. Those interested in more information should contact the firm. For additional information, please contact Hal Mordkofsky at 202-828-5520 or .

black line

Voluntary Newsletter Supporters By Donation

black line

Kansas City


Premium Newsletter Supporter


black line

gcs logo

Premium Newsletter Supporter

black line

Canyon Ridge Communications

canyon ridge

Premium Newsletter Supporter

(Above and beyond the call of duty.)

black line

ProPage Inc.


Newsletter Supporter

black line


The Premium Supporters have made repeated, and generous donations to help keep the newsletter going.

black line

Le Réseau Mobilité Plus
Montreal, Quebec


Newsletter Supporter

black line

Communication Specialists

communication specialists

Newsletter Supporter

black line

Cook Paging

cook paging

Premium Newsletter Supporter

black line



Premium Newsletter Supporter

black line

Citipage Ltd.
Edmonton, Alberta


Newsletter Supporter

black line

black line

The recent "How Quantum is the D-Wave Machine?" Shin paper

Posted on February 4, 2014 by Geordie

Generally I try to avoid commenting on ongoing scientific debates. My view is that good explanations survive scrutiny, and bad explanations do not, and that our role in bringing quantum computers kicking and screaming into the world is to, well, build quantum computers. If people love what we build, and we do everything we can to adjust our approach to make better and better gear for the people who love our computers, we have succeeded and I sleep well.

I am going to make an exception here. Many people have asked me specifically about the recent Shin et al. paper, and I'd like to give you my perspective on it.

Science is about good explanations

In my world view, science is fundamentally about good explanations.

What does this mean? David Deutsch eloquently describes this point of view in this TED talk. There is a transcript here. He proposes and defends the idea that progress comes from discovering good explanations for why things are the way they are. You would not be reading this right now if we had not come up with good explanations for what electrons are.

We can and should directly apply these ideas to the question of whether D-Wave processors are quantum or classical. From my perspective, if the correct explanation were 'it's classical', that would be critical to know as quickly as possible, because we could then identify why this was so, and attempt to fix whatever was going wrong. That's kind of my job. So I need to really understand this sort of thing.

Here are two competing explanations for experiments performed on D-Wave processors.

Explanation #1. D-Wave processors are inherently quantum mechanical, and described by open quantum systems models where the energy scale of the noise is much less than the energy scale of the central quantum system.

Explanation #2. D-Wave processors are inherently classical, and can be described by a classical model with no need to invoke quantum mechanics.

The Shin et al. paper claims that Explanation #2 is a correct explanation of D-Wave processors. Let's examine that claim.

Finding good explanations for experimental results

It is common practice that whenever an experiment is reported demonstrating quantum mechanical (or in general non-classical) effects, researchers look for classical models that can provide the same results. A successful theory, however, needs to explain all existing experimental results and not just a few select ones. For example, the classical model of light with the assumption of ether could successfully explain many experiments at the beginning of the 20th century. Only a few unexplained experiments were enough to lead to the emergence of special relativity.

In the case of finding good explanations for the experimental results available for D-Wave hardware, there is a treasure trove of experimental data available. Here is just a small sample. There are experimental results available on single qubits (Macroscopic Resonant Tunneling & Landau-Zener), two qubits (cotunneling), and multiple qubits, now up to about 500 (the eight-qubit Nature paper, entanglement, results at 16 qubits, the Boixo paper).

Let's see what we get when we apply our two competing explanations of what's going on inside D-Wave processors to all of this data.

If we assume Explanation #1, we find that a single simple quantum model perfectly describes every single experiment ever done. In the case of the simpler data sets, experimental results agree with quantum mechanics with no free parameters, as it is possible to characterize every single term in the system's Hamiltonian, including the noise terms.

Explanation #2, however, completely fails on every single experiment listed above, except for the Boixo data (I'll explain why shortly). In particular, the eight-qubit quantum entanglement measured in Lanting et al. can never be explained by such a model, which rules it out as an explanation of the underlying behavior of the device. Note that this is a stronger result than the model simply being a bad explanation: the model proposed in Shin et al. makes a prediction about an experiment that you can easily perform on D-Wave processors, and that prediction contradicts what is observed.

Why the model proposed works in describing the Boixo data

Because the Shin et al. model makes predictions that contradict the experimental data for most of the experiments that have been performed on D-Wave chips, it is clearly not a correct explanation of what's going on inside the processors. So what's the explanation for the agreement in the case of the Boixo paper? Here's a possibility, which we can test.

The experiment performed in the Boixo et al. paper considered a specific use of the processors. This use involved solving a specifically chosen type of problem. It turns out that for this type of problem, multi-qubit quantum dynamics, and therefore entanglement, are not necessary for the hardware to reach good solutions. In other words, for this experiment, a Bad Explanation (a classical model) can be concocted that matches the results of a fully quantum system.

To be more specific, the Shin et al. model replaces terms like J_{ij} \sigma^z_i \sigma^z_j with J_{ij} <\sigma^z_i><\sigma^z_j>, where \sigma^z_i is a Pauli matrix and <\sigma^z_i> is the quantum average of \sigma^z_i. Since all quantum correlations are gone after such averaging, you can model <\sigma^z_i> as a classical magnetic moment in a 2D plane. But now it is clear that any experiment relying on multi-qubit quantum correlation and entanglement cannot be explained by this simple model.
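The failure of this averaging on entangled states can be seen in a toy two-qubit calculation (my own illustrative sketch, not code from the paper): for a Bell state, the true correlation <\sigma^z_1 \sigma^z_2> equals 1, while the product of single-qubit averages is 0.

```python
from math import sqrt

# Two-qubit Bell state (|00> + |11>)/sqrt(2); amplitudes keyed by (bit1, bit2).
amp = {(0, 0): 1 / sqrt(2), (1, 1): 1 / sqrt(2)}

def z(bit):
    # sigma_z eigenvalue: |0> -> +1, |1> -> -1
    return 1 if bit == 0 else -1

# True quantum correlation <sigma_z^1 sigma_z^2>
corr = sum(abs(a) ** 2 * z(b1) * z(b2) for (b1, b2), a in amp.items())

# Mean-field replacement: <sigma_z^1><sigma_z^2>
m1 = sum(abs(a) ** 2 * z(b1) for (b1, b2), a in amp.items())
m2 = sum(abs(a) ** 2 * z(b2) for (b1, b2), a in amp.items())

print(round(corr, 6))     # 1.0 (the qubits are perfectly correlated)
print(round(m1 * m2, 6))  # 0.0 (the product of averages misses the correlation)
```

Any experiment whose outcome depends on that lost correlation will therefore separate the two models.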

I've proposed an explanation for the agreement between the Shin model and this particular experiment — that the hardware is fundamentally quantum, but for the particular problem type run, this won't show up because the problem type is 'easy' (in the sense that good solutions can be found without requiring multi-qubit dynamics, and an incorrect classical model can be proposed that nevertheless agrees with the experimental data).

How do we test this explanation? We change the problem type to one where a fundamental difference in experimental outcome between the processor hardware and any classical model is expected. If the Shin et. al. model continues to describe what is observed in that situation, then we have a meaningful result that disagrees with the 'hardware is quantum' explanation. If it disagrees with experiment, that supports the 'hardware is quantum' and the 'type of problem originally studied is expected to show the same experimental results for quantum and classical models so it's just a bad choice if that's your objective' explanations.

So a very important test to help determine what is truly going on is to make this change, measure the results and see what's up. I believe that some of the folks working on our systems are doing this now. Looking forward to seeing the results!

The best explanation we have now is that D-Wave processors are beautifully quantum mechanical

The explanation that D-Wave processors are fundamentally quantum mechanical beautifully explains every single experiment that has ever been performed on them. The degree of agreement is astonishing. The results on the smallest systems, such as the individual qubits, are like nothing I've ever seen in terms of agreement of theory and experiment. Some day these will be in textbooks as examples of open quantum systems.

No classical model has ever been proposed that simultaneously explains all of the experiments listed above.

The specific model proposed in Shin et al. focuses on only one experiment, for which there was no expectation of an experimental difference between quantum and classical models, and completely (and from my perspective disingenuously) ignores the entire remainder of the mountains of experimental data on the device.

For these reasons, the Shin results have no validity and no importance.

As an aside, I was disappointed when I saw what they were proposing. I had heard through the grapevine that Umesh Vazirani was preparing some really cool classical model that described the data referred to above and I was actually pretty excited to see it.

When I saw how trivially wrong it was it was like opening a Christmas present and getting socks.

Geordie Rose

Burnaby, BC

I'm the chief technology officer of D-Wave and 2010 NAGA Brazilian jiu-jitsu light heavyweight world champion.

Source: D-Wave

black line

Friends & Colleagues

black line

Ira Wiesenfeld, P.E.

black line

Complete Technical Services For The Communications and Electronics Industries
Design • Installation • Maintenance • Training • Engineering • Licensing • Technical Assistance

black line

Ira Wiesenfeld, P.E.
Consulting Engineer
Registered Professional Engineer

Tel/Fax: 972-960-9336
Cell: 214-707-7711
7711 Scotia Dr.
Dallas, TX 75248-3112

black line

Ira Wiesenfeld, P.E.

black line

subscribe free

black line

Wireless Network Planners

black line

Wireless Network Planners
Wireless Specialists

R.H. (Ron) Mercer
217 First Street South
East Northport, NY 11731
ron mercer

Cellphone: 631-786-9359

black line

Wireless Network Planners

black line

black line

Prism Paging

black line

white line


white line


  • VoIP telephone access — eliminate interconnect expense
  • Call from anywhere — Prism SIP Gateway allows calls from PSTN and PBX
  • All the Features for Paging, Voice-mail, Text-to-Pager, Wireless and DECT phones
  • Prism Inet, the new IP interface for TAP, TNPP, SNPP, SMTP — Industry standard message input
  • Direct Connect to NurseCall, Assisted Living, Aged Care, Remote Monitoring, Access Control Systems

black line

D-Wave, disentangled: Google explains the present and future of quantum computing

By Joel Hruska on February 26, 2014 at 12:00 pm

The performance and quantum nature of the D-Wave 2 processor continues to be a topic of discussion with every new data release, and the performance figures that Google released in late January were no exception. The company has now followed up these figures with a second blog post that describes its own interpretation of the results, what it intends to test next, and what the future of the program is likely to be.

The key question at the heart of the D-Wave enigma is whether or not the system is actually performing quantum annealing . In theory, the D-Wave 2 processor could be an excellent simulation of a quantum computer — possibly the best simulation ever built — but still, ultimately, an approximation of what the real thing would offer. The only way to determine whether or not the D-Wave performs true quantum annealing is to find a test case in which the D-Wave 2 outperforms even the best classical (meaning, standard) computers.

Google's last set of data indicated that while the D-Wave 2 outperformed off-the-shelf classical software by huge margins, hand-tuned classical computer configurations running on Nvidia GPUs were capable of competing with the quantum computer in a number of specific benchmarks. According to Google's engineers, this close performance is an artifact of the primitive state of current quantum annealers.

The D-Wave 2 is limited by what's called "sparse connectivity," as shown below.

Note that while each sub-group of eight qubits is tightly linked to its adjacent partners, the blocks themselves connect in far fewer places. This limits the performance of the quantum annealer because it limits the number of states that the quantum computer can test in order to find the ideal solution to the problem. This is a separate problem from the number of qubits in the system (up to 509 out of a possible 512 in this machine) — it's an issue of how interconnected the 509 functional qubits are.
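A rough coupler count makes the sparseness concrete. The sketch below assumes the ideal 8×8 Chimera layout of K4,4 unit cells (ignoring the three defective qubits; the cell counts and coupler arithmetic here are my own back-of-the-envelope figures) and compares it with a hypothetical fully connected 512-qubit chip.

```python
# Ideal 8x8 Chimera layout: 64 unit cells, each a K_{4,4} bipartite block.
cells_per_side = 8
cells = cells_per_side ** 2            # 64 unit cells
qubits = cells * 8                     # 512 qubits

intra = cells * 16                     # 4x4 = 16 couplers inside each cell
# Each adjacent pair of cells (horizontally or vertically) shares 4 couplers.
inter = 2 * cells_per_side * (cells_per_side - 1) * 4
chimera_couplers = intra + inter

# Hypothetical fully connected chip, for comparison.
full_couplers = qubits * (qubits - 1) // 2

print(chimera_couplers)  # 1472
print(full_couplers)     # 130816
```

Under these assumptions the chip has fewer than 2% of the couplers a fully connected 512-qubit device would have, which is the "sparse connectivity" limitation described above.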

According to Google, it's this sparse connectivity that's allowing classical computers to keep pace with D-Wave's quantum system. The company writes that, "For each solver, there are problems for which the classical solver wins or at least achieves similar performance. But the inverse is also true. For each classical solver, there are problems for which the hardware does much better."

Echoes of the past

The current debate over the merits of quantum annealing, and the relative performance of classical computers versus D-Wave's system, is somewhat similar to the debates over digital vs. analog computing of the mid-20th century. From this end of history, it may look as though digital technology was an unstoppable wave that simply buried older, more primitive methods of computation, but this glosses over historical fact.

Doctor Frances Baurer with the analog computer (Project Cyclone)
Image: Popular Electronics

Electronic analog computers were initially far faster than their digital counterparts. They operated in parallel, whereas digital systems performed operations sequentially. They were capable of higher levels of precision (0.1% as compared to the 1% margin of error within the first digital systems). In the end, digital computers won out over analog — but the two types of systems co-existed at various levels for several decades.

Just as early digital systems were matched or outperformed in many respects by well-developed analog computers, it's possible that D-Wave's first quantum computing efforts can be matched or exceeded by well-tuned classical systems. In fact, given the hundreds of billions of dollars poured into the development of modern computers, it would be astonishing if scientists invented a new computing solution capable of beating conventional equipment in all respects in just a handful of years.

Source: ExtremeTech

black line

black line

WiPath Communications

black line

wipath header

Intelligent Solutions for Paging & Wireless Data

WiPath manufactures a wide range of highly unique and innovative hardware and software solutions in paging and mobile data for:

  • Emergency Mass Alert & Messaging
  • Emergency Services Communications
  • Utilities Job Management
  • Telemetry and Remote Switching
  • Fire House Automation
  • Load Shedding and Electrical Services Control

black line

PDT3000 Paging Data Terminal

pdt 2000 image

  • Built-in POCSAG encoder
  • Huge capcode capacity
  • Parallel, 2 serial ports, 4 relays
  • Message & system monitoring

black line

Paging Controlled Moving Message LED Displays

welcom wipath

  • Variety of sizes
  • Indoor/outdoor
  • Integrated paging receiver

black line

PDR3000/PSR3000 Paging Data Receivers

paging data receiver

  • Highly programmable, off-air decoders
  • Message Logging & remote control
  • Multiple I/O combinations and capabilities
  • Network monitoring and alarm reporting

black line

Specialized Paging Solutions

paging data receiver

  • Emergency Mass Alerting
  • Remote telemetry switching & control
  • Fire station automation
  • PC interfacing and message management
  • Paging software and customized solutions
  • Message interception, filtering, redirection, printing & logging
  • Cross band repeating, paging coverage infill, store and forward
  • Alarm interfaces, satellite linking, IP transmitters, on-site systems

black line

Mobile Data Terminals & Two Way Wireless Solutions

mobile data terminal

radio interface

  • Fleet tracking, messaging, job processing, and field service management
  • Automatic vehicle location (AVL), GPS
  • CDMA, GPRS, ReFLEX, conventional, and trunked radio interfaces

black line

WiPath Communications LLC
4845 Dumbbarton Court
Cumming, GA 30040
Web site: left arrow CLICK
E-mail: left arrow CLICK
WiPath Communications

black line

black line

Hark Technologies

black line

hark logo

Wireless Communication Solutions

black line

USB Paging Encoder

paging encoder

  • Single channel up to eight zones
  • Connects to Linux computer via USB
  • Programmable timeouts and batch sizes
  • Supports 2-tone, 5/6-tone, POCSAG 512/1200/2400, GOLAY
  • Supports Tone Only, Voice, Numeric, and Alphanumeric
  • PURC or direct connect
  • Pictured version mounts in 5.25" drive bay
  • Other mounting options available
  • Available as a daughter board for our embedded Internet Paging Terminal (IPT)

black line

Paging Data Receiver (PDR)


  • Frequency agile—only one receiver to stock
  • USB or RS-232 interface
  • Two contact closures
  • End-user programmable w/o requiring special hardware
  • 16 capcodes
  • Eight contact closure version also available
  • Product customization available

black line

Other products

black line

Please see our web site for other products including Internet Messaging Gateways, Unified Messaging Servers, test equipment, and Paging Terminals.

Hark Technologies
717 Old Trolley Rd Ste 6 #163
Summerville, SC 29485
Tel: 843-821-6888
Fax: 843-821-6894
E-mail: left arrow CLICK
Web: left arrow CLICK

hark David George and Bill Noyes
of Hark Technologies.

black line

Hark Technologies

black line


Click on the logo above for more info.

black line

Quantum computing 101

Although quantum information has been around for a long time, we're starting to see more about it in the media. We hope this quick-start guide will answer some of the most common questions.

What is quantum computing?

Quantum computing is essentially harnessing and exploiting the amazing laws of quantum mechanics to process information. A traditional computer uses long strings of "bits," which encode either a zero or a one. A quantum computer, on the other hand, uses quantum bits, or qubits. What's the difference? Well, a qubit is a quantum system that encodes the zero and the one into two distinguishable quantum states. But, because qubits behave quantumly, we can capitalize on the phenomena of "superposition" and "entanglement."

Superposition and entanglement? Pardon?

It's OK to be a bit baffled by these concepts, since we don't experience them in our day-to-day lives. It's only when you look at the tiniest quantum particles — atoms, electrons, photons and the like — that you see intriguing things like superposition and entanglement.

Superposition is essentially the ability of a quantum system to be in multiple states at the same time — that is, something can be "here" and "there," or "up" and "down" at the same time.
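As a loose illustration (a statistical sketch, not a simulation of real hardware), the "here and there at once" of an equal superposition shows up when the same state is prepared and measured many times: each measurement collapses randomly, with the odds set by the amplitudes.

```python
import random
from math import sqrt

random.seed(0)  # fixed seed so the sketch is repeatable

# Equal superposition of |0> and |1>: both amplitudes are 1/sqrt(2).
amp0 = 1 / sqrt(2)
p0 = abs(amp0) ** 2          # Born rule: probability of observing 0

shots = 100_000
zeros = sum(random.random() < p0 for _ in range(shots))
print(zeros / shots)         # close to 0.5: half the measurements yield |0>
```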

Entanglement is an extremely strong correlation between quantum particles: two or more particles can be inextricably linked, remaining perfectly correlated even when separated by great distances. The particles are so intrinsically connected that they can be said to "dance" in instantaneous, perfect unison, even when placed at opposite ends of the universe. This seemingly impossible connection inspired Einstein to describe entanglement as "spooky action at a distance."

Why do these quantum effects matter?

First of all, they're fascinating. Even better, they'll be extremely useful to the future of computing and communications technology.

Thanks to superposition and entanglement, a quantum computer can process a vast number of calculations simultaneously. Think of it this way: whereas a classical computer works with ones and zeros, a quantum computer will have the advantage of using ones, zeros and "superpositions" of ones and zeros. Certain difficult tasks that have long been thought impossible (or "intractable") for classical computers will be achieved quickly and efficiently by a quantum computer.
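The scale of that parallelism is easy to state in code. For a 512-qubit register like the D-Wave 2's (a back-of-the-envelope count of state-space size, not a claim about what any particular machine computes), the number of amplitudes in a general superposition dwarfs the commonly cited ~10^80 atoms in the observable universe:

```python
n_qubits = 512
amplitudes = 2 ** n_qubits        # basis states a general superposition spans
atoms_in_universe = 10 ** 80      # common rough estimate

print(len(str(amplitudes)))               # 155 -- a 155-digit number
print(amplitudes > atoms_in_universe)     # True, by roughly 74 orders of magnitude
```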

What can a quantum computer do that a classical computer can't?

Factoring large numbers, for starters. Multiplying two large numbers is easy for any computer. But calculating the factors of a very large (say, 500-digit) number is considered practically impossible for any classical computer. In 1994, Peter Shor, a mathematician now at the Massachusetts Institute of Technology (MIT) who was working at AT&T at the time, showed that if a fully working quantum computer were available, it could factor large numbers easily.

But I don't want to factor very large numbers…

Nobody wants to factor very large numbers! That's because it's so difficult — even for the best computers in the world today. In fact, the difficulty of factoring big numbers is the basis for much of our present day cryptography. It's based on math problems that are too tough to solve. RSA encryption, the method used to encrypt your credit card number when you're shopping online, relies completely on the factoring problem. The website you want to purchase from gives you a large "public" key (which anyone can access) to encode your credit card information.

This key actually is the product of two very large prime numbers, known only to the seller. The only way anyone could intercept your information is to know those two prime numbers that multiply to create the key. Since factoring is very hard, no eavesdropper will be able to access your credit card number and your bank account is safe. Unless, that is, somebody has built a quantum computer and is running Peter Shor's algorithm!
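A toy version of the whole chain can be written in a few lines. The primes, exponent, and message below are textbook-sized illustrations (real RSA keys use primes hundreds of digits long, where brute-force factoring is hopeless); the point is simply that whoever can factor n can reconstruct the private key.

```python
# Textbook-sized RSA (illustration only; real keys use enormous primes).
p, q = 61, 53
n = p * q                      # public modulus (3233)
e = 17                         # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent (modular inverse; Python 3.8+)

message = 42
cipher = pow(message, e, n)    # what the shopper's browser would send

def factor(m):
    # Brute-force trial division; feasible here only because n is tiny.
    f = 2
    while m % f:
        f += 1
    return f, m // f

# An eavesdropper who factors n rebuilds the private key and reads the message.
fp, fq = factor(n)
d_cracked = pow(e, -1, (fp - 1) * (fq - 1))
print(pow(cipher, d_cracked, n))  # 42
```

Shor's algorithm would make the `factor` step fast even for full-sized keys, which is exactly why it threatens RSA.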

Wait… so a quantum computer will be able to hack into my private data? That's not good.

Don't worry — classical cryptography is not completely jeopardized. Although certain aspects of classical cryptography would be jeopardized by quantum computing, quantum mechanics also allows for a new type of highly secure cryptography.

Let's look at a common cryptographic protocol called the one-time pad: Say party A and party B (let's call them Alice and Bob) share a long string of random zeros and ones — the secret key. As long as they only use this key once and they are the only ones who know this key, they can transmit a secret message such that no eavesdropper (we'll call her Eve) will be able to decipher the message. The main difficulty with the one-time pad is the actual distribution of the secret key. In the past, governments sent people to exchange books full of random data to be used as keys. That, of course, is impractical and imperfect. This is where quantum mechanics comes in very handy once again: Quantum Key Distribution (QKD) allows for the distribution of completely random keys at a distance.
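The one-time pad itself is almost trivial to implement. The sketch below uses Python's standard secrets module for the key; the security rests entirely on the key being truly random, as long as the message, kept secret, and never reused.

```python
import secrets

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # random key, same length as the message

cipher = bytes(m ^ k for m, k in zip(message, key))   # Alice encrypts (XOR)
plain = bytes(c ^ k for c, k in zip(cipher, key))     # Bob decrypts with the same key

print(plain == message)  # True
```

XOR-ing with the key twice returns the original bytes, so Bob recovers the message exactly; without the key, every plaintext of the same length is equally plausible to Eve.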

How can quantum mechanics create these ultra-secret keys?

Quantum key distribution relies on another interesting property of quantum mechanics: any attempt to observe or measure a quantum system will disturb it.

The Institute for Quantum Computing (IQC) is home to one of the few QKD prototypes in the world. "Alice," a device located at IQC headquarters, receives one photon from each entangled (highly correlated) pair generated by a laser on the roof of a building at the University of Waterloo. "Bob" is housed at the nearby Perimeter Institute and receives the other photon of each pair.

Photons have a unique measurable property called polarization (which should sound familiar to any connoisseur of sunglasses).

Since the polarization of each individual photon is random, there's no way of knowing the unique properties of each photon in advance. But here is where entanglement becomes interesting: if Alice and Bob measure the polarization of the entangled photons they receive, their results will be the same (remember, "entangled" means the particles are highly correlated with each other, even at great distances). Depending on the polarization of each photon, Alice and Bob ascribe either a "one" or a "zero" to each photon they receive. Therefore, if Alice gets a string like 010110, Bob also gets a 010110. Unless, that is, an eavesdropper has been attempting to spy on the signal. This will disturb the system, and Alice and Bob will instantly notice that their keys don't match.

Alice and Bob keep receiving photons until their shared key is long enough and, presto, they've got an ultra-secure key for encrypting communications.
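A toy simulation captures the detection idea (this is a deliberately simplified model, not the actual QKD protocol or IQC's hardware): an intercept-and-resend eavesdropper randomizes some of Bob's outcomes, so when Alice and Bob compare a sample of their bits, mismatches reveal her presence.

```python
import random

random.seed(1)

def mismatch_rate(n, eavesdrop):
    bad = 0
    for _ in range(n):
        shared = random.randrange(2)   # entangled pair: both would read this bit
        alice, bob = shared, shared
        if eavesdrop and random.random() < 0.5:
            bob = random.randrange(2)  # Eve's measurement scrambles Bob's photon
        bad += alice != bob
    return bad / n

print(mismatch_rate(10_000, eavesdrop=False))  # 0.0 -- keys agree perfectly
print(mismatch_rate(10_000, eavesdrop=True))   # about 0.25 -- Eve is detectable
```

The 0.5 disturbance probability and the resulting ~25% error rate are illustrative choices for this toy model; real protocols derive their thresholds from the measurement bases used.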

So harnessing the quantum world can break and make codes. Anything else?

Plenty. For example, quantum computers will be able to efficiently simulate quantum systems, which is what famous physicist Richard Feynman proposed in 1982, effectively kick-starting the field. Simulation of quantum systems has been said to be a "holy grail" of quantum computing: it will allow us to study, in remarkable detail, the interactions between atoms and molecules. This could help us design new drugs and new materials, such as superconductors that work at room temperature. Searching through a space of potential solutions for the best solution is another of the many tasks for which a quantum computer is inherently faster than a classical computer. Researchers are constantly working on new quantum algorithms and applications. But the true potential of quantum computers likely hasn't even been imagined yet. The inventors of the laser surely didn't envision supermarket checkout scanners, CD players and eye surgery. Similarly, the future uses of quantum computers are bound only by imagination.

Sounds great! Where can I get a quantum computer?

Not so fast. While quantum computers have been theoretically demonstrated to have incredible potential, and scientists are working at IQC and around the world to realize that potential, there is much work to be done before quantum computers hit the market.

What is required to build a quantum computer?

Simply put: we need qubits that behave the way we want them to. These qubits could be made of photons, atoms, electrons, molecules or perhaps something else. Scientists at IQC are researching a large array of them as potential bases for quantum computers. But qubits are notoriously tricky to manipulate, since any disturbance causes them to fall out of their quantum state (or "decohere"). Decoherence is the Achilles heel of quantum computing, but it is not insurmountable. The field of quantum error correction examines how to stave off decoherence and combat other errors. Every day, researchers at IQC and around the world are discovering new ways to make qubits cooperate.
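The flavor of error correction can be shown with its classical ancestor, the three-bit repetition code (a classical analogue only; real quantum codes must also correct phase errors and cannot simply copy qubits, so this is just a sketch of the redundancy-plus-voting idea):

```python
import random

random.seed(2)

def encode(bit):
    return [bit] * 3               # three redundant copies

def noisy(bits, p):
    # each copy flips independently with probability p
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)     # majority vote corrects any single flip

p, trials = 0.1, 100_000
errors = sum(decode(noisy(encode(0), p)) != 0 for _ in range(trials))

print(errors / trials)  # about 0.028, versus 0.1 for an unprotected bit
```

Redundancy turns an error rate of p into roughly 3p² (two or more copies must flip), and quantum error correction pursues the same suppression while respecting the rules of quantum mechanics.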

So when will there be a real quantum computer?

It depends on your definition. There are quantum computers already, but not of sufficient power to replace classical computers. A team of researchers from IQC and MIT holds the current world record for the largest number of qubits used in an experiment (12). While practical quantum technologies are already emerging — including highly effective sensors, actuators and other devices — a true quantum computer that outperforms a classical computer is still years away. Theorists are continually figuring out better ways to overcome decoherence, while experimentalists are gaining more and more control over the quantum world through various technologies and instruments. The pioneering work being done today is paving the way for the coming quantum era.

So quantum technology is still years away?

No, quantum technologies are already in use! QKD is already commercially available, and will greatly benefit from new research (scientists at IQC are currently pursuing quantum encryption through free space via satellite). Although a fully functioning quantum computer is a longer-term goal, many fundamental and practical discoveries have been made in the name of quantum computing. Quantum sensors and actuators will allow scientists to navigate the nano-scale world with remarkable precision and sensitivity. Such tools will be invaluable to the development of true quantum information processors. The quantum revolution is already under way, and the possibilities that lie ahead are limitless.

Source: University of Waterloo — Institute for Quantum Computing

black line


black line

The Wireless Messaging News

Best regards,
brad's signature
Newsletter Editor

Brad Dye
P.O. Box 266
Fairfield, IL 62837 USA

mensa member animated gif

Skype: braddye
Twitter: @BradDye1
Telephone: 618-599-7869
Wireless: Consulting page
Paging: Home Page
Marketing & Engineering Papers
K9IQY: Ham Radio Page

Back To Paging
Still The Most Reliable Wireless Protocol For Emergencies!

wireless logo medium

black line


black line

He (or she) is Gone

You can shed a tear that he is gone,
or you can smile because he has lived.

You can close your eyes and pray he'll come back,
or you can open your eyes and see all he's left.

Your heart can be empty because you can't see him,
or you can be full of the love you shared.

You can turn your back on tomorrow and live for yesterday,
or you can be happy for tomorrow because of yesterday.

You can remember only that he's gone,
or you can cherish his memory and let it live on.

You can cry and close your mind, be empty and turn your back,
or you can do what he'd want: smile, open your eyes, love, and go on.

black line



* required field

If you would like to subscribe to the newsletter just fill in the blanks in the form above, and then click on the “Subscribe” bar.

black line

left arrow Newspapers generally cost $1.50 a copy and they hardly ever mention paging or wireless messaging. If you receive some benefit from this publication, maybe you would like to help support it financially? A donation of $50.00 would certainly help cover a one-year paid subscription. If you are willing and able, please click on the PayPal Donate button on the left. Any amount will be sincerely appreciated.

black line

black line

Home Page | Directory | Consulting | Newsletters
Products | Reference | Glossary | Send e-mail