Automotive and Technology

Nowadays, besides going to school, most children practice a sport, often a team sport. And just like so many things, this has its advantages and disadvantages in the physical, emotional, and mental spheres.

Whether the advantages or the disadvantages weigh more heavily depends on the point of view of the parents and of the children.

Emotional advantages:

– They will learn the value of teamwork
– They will learn how to work in a team
– They will develop their ability to trust others
– They will learn how to tell who is trustworthy
– They will make more friends

Emotional disadvantages:

– the pressure of being the best
– the possibility of not being the best
– the emotional fallout of not being the best, which can hurt a child's self-esteem

Mental advantages:

– physical activity stimulates the brain
– sport is a way of relieving the stress and pressure of school; after exercise, lessons become easier to understand and to learn

Mental disadvantages:

– competitive sport takes precedence over a child's education
– learning may be neglected
– lack of mental challenge

Physical advantages:

– they will learn the benefits of being fit, limber and strong
– it promotes strong and healthy growth of the child's body
– they will be protected from obesity

Physical disadvantages:

– the danger of long-term injury
– early overuse of joints, ligaments and muscles can lead to joint and tendon problems, arthritis, back and neck problems

These issues have to be weighed carefully before deciding on the type of sport that your child will be practicing.

Apart from the performance and features of the cell phone, nothing is more important than the life of the cellular phone battery. The longer the battery life, the more talking before it needs a recharge. Eventually your cellular phone battery will wear out: it will not hold its charge as long, and your talk time will be noticeably diminished. At that point it's time to get a new battery. Make sure you get the correct battery for your cell phone model; if it's the wrong one, it will not fit.

There are actually three different chemistries used in cellular phone batteries: nickel metal hydride, nickel cadmium, and lithium ion. The lithium ion cellular phone battery is the most common because it has the longest battery life, about 3 to 5 hours of talk time. It can be recharged around 200 times before you need to purchase a new one. When you replace your cellular phone battery, make sure to dispose of the old one at a proper hazardous waste facility or at a store that accepts old batteries for disposal.
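The 200-recharge figure above gives a rough way to estimate how long a battery will last. The arithmetic below is a minimal sketch under an assumed charging schedule, not a manufacturer's formula:

```python
def battery_lifespan_days(charge_cycles: int, days_per_charge: float) -> float:
    """Rough lifespan estimate: rated recharge cycles times days between recharges."""
    return charge_cycles * days_per_charge

# A lithium-ion pack rated for ~200 recharges, charged every other day:
lifespan = battery_lifespan_days(200, 2)  # 400 days, a little over a year
```

In practice capacity fades gradually rather than stopping at a fixed cycle count, so treat this as an upper-bound ballpark.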

The cellular phone battery should be stored at room temperature, as extremes of heat and cold can damage it. With nickel-based batteries, wait until the battery has completely run down before recharging. These batteries have a memory, which means that if one would normally give 2 hours of talk time and you recharged it at 1 hour of talk time, then that is the maximum charge the battery will accept from then on: you'll be stuck at 1 hour of talk time. Read the instructions on charging and recharging the cellular phone battery, and make sure you do not overcharge it either.

When purchasing a replacement cellular phone battery, I would suggest buying only OEM (original equipment manufacturer) parts. There have been problems with third-party cell phone batteries blowing up: they were sold and made specifically for a certain phone model, but a battery manufactured incorrectly can overheat and explode during charging. OEM batteries are more expensive, but I think they are well worth the cost.

You can charge up your cellular phone battery with an AC adapter (desktop charger) or a car adapter charger. You will generally have to buy the car adapter separately, but I have found it to be invaluable on trips. You can buy the car charger from the same store where you bought the cell phone. Make sure you check the make and model number so that you buy the correct one for your phone.

The most basic definition of gasification is that it is any chemical or heat process used to convert a substance to a gas.

Coal has been gasified ever since the Industrial Revolution to produce "town gas". This was once done in the local gas works, and every town had one. Heating the coal under controlled conditions with insufficient air for complete combustion produces a gaseous fuel known as syngas, which, once cooled, cleaned, and compressed, is what was known as town gas. As we all know, gas is a vastly more controllable fuel for so many jobs, and vastly preferred to coal.

Gasification technology is at the forefront of efforts to develop alternatives to conventional furnaces. It is of particular interest because it offers an opportunity to use the product fuel gas in integrated gasification combined-cycle (IGCC) electric power generation. Great hopes are pinned on IGCC as a highly efficient, low-emissions technology.

Gasification can also be fueled by materials that are not otherwise useful fuels, such as biomass or organic waste. In addition, it eases many concerns about air quality, because the high-temperature conversion reaction essential to the process also refines out corrosive ash elements such as chloride and potassium, allowing clean gas production.

Furthermore, many have reported that with their technology the product gas heating (calorific) value can be kept stable regardless of changes in feedstock type, ash content, or moisture content.

In some types of gasification plant, gasification acts on the three by-products of pyrolysis, using them to fuel a second reaction by concentrating the heat onto a bed of charcoal. These coals normally reach 1,800+ degrees F in the gasifier, which is hot enough to break the water vapor into hydrogen and the CO2 into carbon monoxide.

Gasification is extremely environmentally friendly in that if properly designed, gasification systems produce very minimal pollution even when processing dirty feedstocks, such as high sulfur coals. In addition, gasification can effect large volume reductions in solid wastes while producing an environmentally friendly inert slag-type byproduct.

Jan Becker, Technical Director of a US energy company growing fast on its gasification skills, added: "The gasifier is becoming an important factor in the race toward the 'greening of America', as there is more and more awareness that many of the substances that America throws away can be gasified and then made into useful products like electricity, ethanol, methanol, and bio-diesel."

The gas produced by gasifiers (mainly consisting of 15-25% carbon monoxide, 10-20% hydrogen, and 1-5% methane) is combusted in special burners for maximum efficiency. The best high-quality gasifier systems can be fed on what are otherwise just low-grade waste oils or tar oils and slurries. Some slurry-fed, O2-blown, entrained gasifiers operate at between 2,400 and 2,700 degrees F.
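Given a gas composition like the one quoted above, a rough heating value can be estimated as the volume-fraction-weighted sum of each component's heating value. The sketch below uses rounded literature figures for the component heating values; treat both those numbers and the example mix as illustrative assumptions, not measured data:

```python
# Approximate lower heating values in MJ per normal cubic metre
# (rounded literature figures, assumed for illustration only).
LHV = {"CO": 12.6, "H2": 10.8, "CH4": 35.8}

def syngas_lhv(fractions: dict) -> float:
    """Heating value of a gas mix as the volume-fraction-weighted sum."""
    return sum(LHV[gas] * frac for gas, frac in fractions.items())

# A producer gas near the middle of the ranges quoted above;
# the remaining volume is assumed to be inert N2 and CO2.
mix = {"CO": 0.20, "H2": 0.15, "CH4": 0.03}
value = syngas_lhv(mix)  # roughly 5.2 MJ/Nm3, a typical low-grade fuel gas
```

The low result compared with natural gas (~36 MJ/Nm3) is why such gas is burned in special burners rather than ordinary ones.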

In these latest high-technology systems, high-pressure steam is produced for internal and local CHP (Combined Heat and Power) use by cooling the syngas in a radiant syngas cooler and then (in this case) in two parallel fire-tube convective syngas coolers.

That is the top of the range in technological development. At the bottom, most basic, level the gasifier is really simple. Simple wood-burning gasifier stoves can be built from freely available template designs, almost entirely from junk parts found in various trash bins.

Gasifiers are now also available that are intended for processing biomass and organic waste, and this has been found viable at current oil prices, when it is considered that the numerical calculations are based on low-grade coal. It has also been shown that the process can be both stable and controllable. New designs can be assessed in advance using the numerically derived analysis produced by RESORT software, which predicts the physical and chemical processes in the gasifier. The Euler-Lagrange approach is employed for the gas and particle phases, and the Navier-Stokes equations are solved by the finite volume method.

This module focuses on the basics of Knowledge Mapping, its importance, principles, and methodologies.

Key Questions

  • What is a K-map?
  • What does a K-map show, and what do we map?
  • Why is K-mapping so important?
  • What are some of the key principles, methods, and questions for K-mapping?
  • How do we create a K-map?

Background

Each of the past few centuries has been dominated by a single technology. The eighteenth century was the time of the great mechanical systems of the Industrial Revolution. The nineteenth century was the age of the steam engine. Since then, the key technology has been information gathering, processing, and distribution. Among other developments, the installation of worldwide telephone networks, the invention of radio and television, the birth and unprecedented growth of the computer industry, and the launching of communication satellites are significant. Now people have started to realize that information alone is not enough; what matters is knowledge. So there has been a shift from information to knowledge.

A bit of information without context and interpretation is data, such as numbers or symbols.

Information is a set of data with context and interpretation. Information is the basis for knowledge.

Knowledge is a set of data and information to which expert opinion and experience are added, resulting in a valuable asset that can be used or applied to aid decision making. Knowledge may be explicit and/or tacit, individual and/or collective.

The term "Knowledge Mapping" seems relatively new, but it is not. We have been practicing it in our everyday lives; what we have not been doing is documenting it systematically. Knowledge Mapping is all about keeping a record of the information and knowledge you need: where you can get it, who holds it, whose expertise it is, and so on. Say you need to find something at your home or in your room; you can find it in no time, because you have almost all the information and knowledge about "what is where" and "who knows what" at your home. It is a sort of map of your home set in your mind. But to set such a map of your organization and its knowledge in your mind is almost impossible. This is where the K-map becomes handy: it shows details of every bit of knowledge that exists within the organization, including location, quality, and accessibility, as well as the knowledge required to run the organization smoothly, making you able to find the knowledge you need easily and efficiently.

Below are some of the definitions:

It's an ongoing quest within an organization (including its supply and customer chain) to help discover the location, ownership, value, and use of knowledge artifacts, to learn the roles and expertise of people, to identify constraints to the flow of knowledge, and to highlight opportunities to leverage existing knowledge.

Knowledge mapping is an important practice consisting of survey, audit, and synthesis. It aims to track the acquisition and loss of information and knowledge. It explores personal and group competencies and proficiencies. It illustrates or "maps" how knowledge flows through an organization. Knowledge mapping helps an organization to appreciate how the loss of staff influences intellectual capital, to assist with the selection of teams, and to match technology to knowledge needs and processes.

– Denham Gray

Knowledge mapping is about making knowledge that is available within an organization transparent, and is about providing the insights into its quality.

– Willem-Olaf Huijsen, Samuel J. Driessen, Jan WM Jacobs

Knowledge mapping is a process by which organizations can identify and categorize knowledge assets within their organization – people, processes, content, and technology. It allows an organization to fully leverage the existing expertise resident in the organization, as well as identify barriers and constraints to fulfilling strategic goals and objectives. It is constructing a roadmap to locate the information needed to make the best use of resources, independent of source or form.

-W. Vestal, APQC, 2002

(American Productivity & Quality Center)

A Knowledge Map describes what knowledge is used in a process and how it flows around the process. It is the basis for determining knowledge commonality, or areas where similar knowledge is used across multiple processes. Fundamentally, a process knowledge map contains information about the organization's knowledge. It describes who has what knowledge (tacit), where the knowledge resides (infrastructure), and how the knowledge is transferred or disseminated (social).

-IBM Global Services

How are the Knowledge Maps created?

Knowledge maps are created by transferring tacit and explicit knowledge into graphical formats that are easy to understand and interpret by the end users, who may be managers, experts, system developers, or anyone.

Basic steps in creating K-maps:

Basic steps – creating K-maps for a specific task

  • The outcomes of the entire process, and their contributions to the key organizational activities
  • Logical sequences of all the activities needed to achieve the goal
  • Knowledge required for each activity {gives the knowledge gap}
  • Human resource required to undertake each activity {shows if recruitment is needed}
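The steps above can be sketched as a simple gap analysis: list the knowledge each activity needs, compare it with what staff already hold, and the difference is the knowledge gap. Every activity and skill name below is hypothetical:

```python
# Minimal sketch of the task-based steps above; all names are illustrative.
activities = [
    {"name": "collect requirements", "knowledge": {"interviewing", "domain basics"}},
    {"name": "draft process map",    "knowledge": {"process modelling", "domain basics"}},
]
available = {"interviewing", "domain basics"}  # what staff already hold

def knowledge_gap(activities, available):
    """Knowledge needed by any activity that nobody currently holds."""
    required = set().union(*(a["knowledge"] for a in activities))
    return required - available

gap = knowledge_gap(activities, available)  # -> {'process modelling'}
```

A non-empty gap is exactly the signal the list above mentions in braces: it tells you where training or recruitment is needed.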

What do we map?

The following are the objects we map:

  • Explicit knowledge
    • Subject
    • Purpose
    • Location
    • Format
    • Ownership
    • Users
    • Access right
  • Tacit knowledge
    • Expertise
    • Skill
    • Experience
    • Location
    • Accessibility
    • Contact address
    • Relationships / networks
  • Tacit organic process knowledge
    • The people with the internal processing knowledge
  • Explicit organizational process knowledge
    • Codified organizational process knowledge
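The explicit-knowledge attributes listed above map naturally onto a simple record structure. The sketch below is one possible schema, not a standard; every field name and example value is illustrative:

```python
from dataclasses import dataclass, field

# One record per explicit-knowledge asset, mirroring the attributes
# listed above; the schema and sample values are made up.
@dataclass
class ExplicitKnowledge:
    subject: str
    purpose: str
    location: str
    format: str
    ownership: str
    users: list = field(default_factory=list)
    access_right: str = "internal"

report = ExplicitKnowledge(
    subject="watershed survey report",
    purpose="baseline data for planning",
    location="archive room, shelf 4",
    format="printed report",
    ownership="research unit",
    users=["planners"],
)
```

A K-map then becomes a searchable collection of such records, with a parallel record type for the tacit attributes (expertise, skill, contact address, and so on).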

What do the knowledge maps show?

A knowledge map shows the sources, flows, constraints, and sinks of knowledge within an organization. It is a navigational aid to both explicit information and tacit knowledge, showing the importance of, and the relationships between, knowledge stores and their dynamics. The following list is illustrative in this regard:

  • Available knowledge resources
  • Knowledge clusters and communities
  • Who uses what knowledge resources
  • The paths of knowledge exchange
  • The knowledge lifecycle
  • What we know we don't know (knowledge gap)

Activity: 1

>> Can you create your personal knowledge map which shows the types and location of knowledge resources you use, the channels you use to access knowledge?

Where does knowledge reside?

Knowledge can be found in

  • Correspondence, internal documents
  • Library
  • Archives (past project documents, proposals)
  • Meetings
  • Best practices
  • Experience
  • Corporate memory

Activity: 2

>> What are the other places where you can find knowledge?

What are the other things to be mapped?

Benefits of K-mapping

In many organizations there is a lack of transparency of organization-wide knowledge. Valuable knowledge often goes unused because people do not know it exists, and even if they know it exists, they may not know where. These issues lead to knowledge mapping. The following are some of the key reasons for doing knowledge mapping:

  • To find key sources of knowledge creation
  • To encourage reuse and prevent reinvention
  • To find critical information quickly
  • To highlight islands of expertise
  • To provide an inventory and evaluation of intellectual and intangible assets
  • To improve decision making and problem solving by providing applicable information
  • To provide insights into corporate knowledge

The map also serves as a continually evolving organizational memory, capturing and integrating the key knowledge of an organization. It enables employees to learn through intuitive navigation and interpretation of the information in the map, and to create new knowledge through the discovery of new relationships. Simply speaking, a K-map gives employees not only "know-what" but also "know-how".

Key principles of Knowledge Mapping

  • Because of their power, scope, and impact, the creation of an organizational-level knowledge map requires senior management support as well as careful planning
  • Share your knowledge about identifying, finding, and tracking knowledge in all forms
  • Recognize and locate knowledge in a wide variety of forms: tacit, explicit, formal, informal, codified, personalized, internal, external, and permanent
  • Knowledge is found in processes, relationships, policies, people, documents, conversations, links and context, and even with partners
  • It should be up-to-date and accurate

K-mapping – key questions

Knowledge map provides an assessment of existing and required knowledge and information in the following categories:

  • What knowledge is needed for work?
  • Who needs what?
  • Who has it?
  • Where does it reside?
  • Is the knowledge tacit or explicit?
  • What issues does it address?
  • How to make sure that the K-mapping will be used in an organization?

Note:

  • K-maps should be easily accessible to all in the organization
  • It should be easy to understand, update and evolve
  • It should be regularly updated
  • It should be an ongoing process since knowledge landscapes are continuously shifting and evolving

Offline Readings:

  • K-mapping tools
  • K-mapping tool selection
  • Creating knowledge maps by exploiting dependent relationships
  • Creating knowledge structure maps
  • White pages
  • KM jargon and glossary

Online Resource: http://www.voght.com/cgi-bin/pywiki?KnowledgeMapping

K-mapping Tools:

  • MindMapping
  • Inspiration
  • IHMC (cmap.ihmc.us/) (needs the .NET Framework and Java Runtime installed on your computer)

(Learn more about KM tool selection at http://www.voght.com/cgi-bin/pywiki?KmToolSelection )
________________________________________

Categorised K-mapping

Social Network Mapping:

This shows networks of knowledge and patterns of interaction among members, groups, organizations, and other social entities: who knows whom, who goes to whom for help and advice, where information enters and leaves the groups or organization, and which forums and communities of practice are operational and generating new knowledge.
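A social network map of this kind can be represented as a simple directed graph. The sketch below, with made-up names, records who goes to whom for advice and picks out the most-consulted person, a likely knowledge hub:

```python
# Toy "who goes to whom for advice" network; all names are invented.
advice = {
    "ana":   ["raj"],
    "ben":   ["raj", "ana"],
    "carol": ["raj"],
    "raj":   [],
}

def most_consulted(network):
    """Return the person named most often as a source of help or advice."""
    counts = {}
    for sources in network.values():
        for person in sources:
            counts[person] = counts.get(person, 0) + 1
    return max(counts, key=counts.get)

hub = most_consulted(advice)  # -> 'raj'
```

In a real mapping exercise the same structure also reveals where knowledge enters and leaves the group: people no one consults, or people everyone depends on.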

Competency Mapping:

With this kind of mapping, one can create a competency profile with the skills, positions, and even career path of an individual. This can also be turned into "organizational yellow pages", which enable employees to find needed expertise in people within the organization.

Process-based Knowledge Mapping:

This shows knowledge and sources of knowledge for internal as well as external organizational processes and procedures. It includes tacit knowledge (knowledge in people, such as know-how and experience) and explicit knowledge (codified knowledge, such as that in documents).

Conceptual Knowledge Mapping:

Also sometimes called "taxonomy", this is a method of hierarchically organizing and classifying content. It involves labeling pieces of knowledge and the relationships between them. A concept can be defined as any unit of thought, any idea that forms in our mind [Gertner, 1978]. Often, nouns are used to refer to concepts [Roche, 2002]. Relations form a special class of concepts [Sowa, 1984]: they describe connections between other concepts. One of the most important relationships between concepts is the hierarchical relation (subsumption), in which one concept (the superconcept) is more general than another (the subconcept), as with Natural Resource Management and Watershed Management. This mapping should be able to relate similar kinds of projects and workshops conducted by two different departments, making them more integrated.
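The subsumption relation described above can be modeled as a small tree in which each concept points to its superconcept. The sketch below reuses the document's own example pair and adds one invented subconcept for illustration:

```python
# Tiny taxonomy: each concept maps to its superconcept (None = root).
# "Soil Conservation" is an invented example subconcept.
parent = {
    "Natural Resource Management": None,
    "Watershed Management": "Natural Resource Management",
    "Soil Conservation": "Watershed Management",
}

def subsumes(superconcept, subconcept, parent):
    """True if superconcept is an ancestor of subconcept in the hierarchy."""
    node = parent.get(subconcept)
    while node is not None:
        if node == superconcept:
            return True
        node = parent.get(node)
    return False
```

Because subsumption is transitive, walking up the parent chain is enough to relate a project filed under one subconcept to departments working anywhere above it in the hierarchy.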

Knowledge is power; broadly accessible, understandable, and shared knowledge is even more powerful!

It’s not unusual, in fact it’s becoming the norm in MLB and other professional sports, for players to consult and work with “mental performance coaches.” I feel I can speak for the majority of these athletes when I say 99% of them wish they had known about and worked on these mental skills much earlier in their careers.

So why did these supreme athletes ignore the benefits of mental performance enhancement, imagery, and focus training, to name a few? I don’t believe they ignored it… they weren’t aware of it, and that’s our fault, not theirs.

Many coaches are mired in the rut of thinking mental performance training shouldn’t be considered until players are much older and only at very high skill levels of play.

I wonder how many young players were never able to reach their full potential due to mental performance issues. And for those who were able to attain higher levels of play in their sport, how many spent more agonizing time than was necessary battling performance issues which could have been alleviated through mental training?

Even if the kid obviously doesn’t possess the skills to reach high level competition in the sport, the experience will help him in other aspects of his life.

So why would a parent paying big bucks for private lessons, a player putting in exhaustive hours of practice, and a coach poring over films and stats ignore, or at least not put forth the effort to learn and understand, the use of and need for sports psychology?

I believe they just don’t know how to overcome a few mental obstacles of their own when dealing with sports psychology. For instance:

Misconception #1 “I don’t need it.”

When players are playing well, in the zone as it’s known, the idea of needing any help, especially mental help, is the farthest thing from their minds. However, a more mature player would know that every player has ups and downs, and that if they could identify and harness the emotions and thoughts they feel while playing well, they could call on them when the time comes that they’re not playing so well.

Misconception #2 “I’m not a freak or something.”

It’s quite unfortunate that mental training has somehow become linked or synonymous with being “mentally weak,” as in being unable to endure tough times or control your emotions. We know this is a totally false assumption, as many elite athletes, tremendously mentally strong, still use sports psychologists on a regular basis.

The issue should be viewed from a different perspective. Why do MLB players continuously work with a batting coach? Their game is not “weak or broken”. They work with a coach so they can continue to improve, and to maintain their competitive edge.

So it is with athletes who seek the services of sports psychologists: they want to improve their mental skills.

Misconception #3 “I don’t get nervous- I’m mentally prepared at all times.”

There are athletes who actually don’t get nervous, but that’s no indication they are mentally prepared. They may suffer in other areas, such as dwelling on mistakes, lack of focus, not playing as well as they practiced, frustration issues, and being “prepared.”

So nerves aren’t an issue; big deal. Most players who say they are mentally prepared could not, if asked, list the steps they take before a game or any specific techniques they use. While an athlete may think he’s prepared, he often has no concrete plan to deal with both positive and negative events.

Misconception #4 “I already talk to my kid about thinking positively, why do I need someone else to do that?”

Congratulations for realizing the importance of positive thinking, and it is imperative that parents be involved in helping formulate the techniques their kids are learning; however, kids often tend to listen to other people before their parents. Sorry, but that’s the way of the world.

To soothe your feelings, remember there is no single technique or modality that works well for all athletes and all issues. Just as the field of medicine has various specialties to address the various issues that patients present, sports psychology is similar. A professional has learned an array of interventions that can be customized to the wide variety of psychological issues athletes face at every level of the game.

The car navigation system was first invented for military use many years ago, but now it has entered the lives of ordinary people. People welcome this device and rely on it, for it brings them convenience and relief.

People used to rely on paper maps to find their way, which are hard to read and make mistakes easy. Stopping the car on a busy road to find a place is very dangerous. People also download road information from search engines; however, that information is inconvenient and may be out of date. This device makes the difficult easy. People who do not have a sense of direction, especially women, can have the opportunity to drive cars. They can even go to places they have never been before, because they trust it. The device also suits people traveling in their own cars: a car navigation system can tell them where a gas station or restaurant is. People will feel as if a professional tour guide is with them.

Some systems come with a touch screen so that people can easily handle the device and keep their attention on driving. Advanced ones can also be controlled by voice.

The entire device consists of two parts. One is the global positioning system receiver installed in the car, which displays directions to the user; the other is the computer control center. The two parts are connected to each other via the positioning satellites. The computer control center is authorized by the vehicle management authority, which is responsible for monitoring the movements of vehicles and the traffic situation in a specified district. Car navigation therefore has two main functions. One is tracing: the position of the car can be pinpointed through the computer control center if the coded global positioning system receiver is properly installed. When users are lost, they contact the central controller, which searches for the car by its code; the lost car can be found within 9 seconds. The other function is guidance: electronic maps of different places are stored on disc, and the traffic situation on the road, or the route to the destination, is displayed on the screen.
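The tracing function described above amounts to a lookup from a receiver code to a last-reported position. The sketch below is purely illustrative; the codes and coordinates are invented:

```python
# Hypothetical sketch of the tracing function: the control center keeps
# the last reported (latitude, longitude) for each coded receiver.
positions = {
    "CAR-042": (31.23, 121.47),  # made-up values
    "CAR-107": (39.90, 116.40),
}

def trace(code):
    """Look up a car's last known position by its receiver code."""
    return positions.get(code)  # None if the code is unknown

where = trace("CAR-042")
```

In a real system the table would be updated continuously from satellite fixes; the lookup itself is what lets the controller find a lost car by its code.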

All in all, the car navigation system is becoming an indispensable part of a car. It will play an ever greater role in the lives of many different people. One integrated circuit related to car navigation is the DS18B20.

It seems that just a few years ago air bags in cars were exotic. Today, some models carry up to 10 of them. At first glance, an airbag is a simple bag made of a smooth, elastic synthetic material. But the fabric of an airbag needs to be very thin and strong, and during an accident even the smallest stitches must not be in the area that contacts the driver's face and body. The cushion also shouldn't become a trampoline: it must be able to release the gas pressure in time. In fact, many manufacturers have stopped sewing cushions, instead using fabric that tears to let the air out. When packing the airbag they add talc powder to prolong the cushion's life.

The number of embedded airbags is growing rapidly. Just recently a car with two airbags was considered a luxury, and now a dozen would not surprise the average driver. All bags work on the same principle, but there are many differences. For example, the driver's cushion ranges between 60 and 80 liters, while the passenger's needs a much larger 130-160 liter volume. Side impacts are often no less dangerous than frontal ones. Naturally, car manufacturers could not leave this unnoticed, and over the past 10 years many cars have gained side-bags. These are much smaller than the frontal airbags, with volumes from 15 to 25 liters. They also come in different forms. Along with the conventional «mushroom type», there are extended "long rollers" in the shoulder area and "inflatable curtains" that reliably protect the heads of the driver and rear passengers. More and more often you can find a cushion under the front panel or on the floor, since the driver's and passenger's legs should also be protected. And the latest invention of "Toyoda Gosei" is cushions to protect pedestrians: two rollers fired from the radiator grille and from the slit between the hood and the windshield, designed to minimize injuries to adult and child pedestrians. Airbags for motorcycles and scooters are already installed at the customer's request, in Italy for example, and soon they will be widely used on all motorcycles.

Interesting facts about airbags:

* The passenger airbag is usually twice as big as the driver's, due to the greater distance from the dashboard.
* To fill up to its full volume, an airbag needs 25-50 ms (for comparison, an eye blink takes about 100 ms).
* Airbag opening speed reaches 320 km/h.
* The "Renault" car maker installs a small airbag in the front part of the driver's seat to prevent the driver from diving under the seat belt.
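The figures above mix units, so a couple of quick conversions help put them in perspective. A minimal sketch, using only the deployment time and opening speed quoted above:

```python
def kmh_to_ms(kmh: float) -> float:
    """Convert kilometres per hour to metres per second."""
    return kmh * 1000 / 3600

# The 320 km/h opening speed quoted above, in metres per second:
opening_speed = kmh_to_ms(320)  # about 88.9 m/s

# Fastest deployment (25 ms) versus an eye blink (~100 ms):
fastest_ratio = 100 / 25  # the bag can be full four times faster than a blink
```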

A gaming computer, gaming rig, or gaming PC is specifically designed for playing demanding and complex video games. Gaming PCs are quite similar to conventional personal computers; the specific differences include performance-oriented components, such as video cards, chosen for playing games. The term 'enthusiast computing' is often used in association with gaming computers, as the interests and genres overlap.

However, for a layman to understand the difference between gaming and enthusiast PCs, it is important to know that gaming PCs are put together to achieve specific performance targets in actual video game play, while an enthusiast PC is simply built to maximize and optimize performance, using gaming as a benchmark. Cost also amplifies the difference between the two: gaming PCs span a wide range of low, mid, and high-end segments, while enthusiast PCs are by definition high-end and quite expensive.

There is a popular myth or misconception that computer gaming is intertwined with expensive enthusiast computing; however, it is interesting to note that gaming video card manufacturers earn most of their revenue through their low and medium range offerings.

Gaming computers differ broadly because of the complex variety of parts that go into assembling them; they are invariably custom assembled rather than pre-manufactured. Most are put together by gaming or hardware enthusiasts, though some companies that specialize in manufacturing gaming machines also build them. These companies create interest among computer enthusiasts by producing 'boutique' models that allow the enthusiasts themselves to complete the design through aesthetic choices made in conjunction with the hardware in the machine.

Although gaming computers are distinctly different from conventional PCs, the evolution toward better output began with improving graphics, color fidelity, display systems, and so on for the mass market. One particular addition that has since been integrated into motherboards is the sound card, now an invisible built-in component of today's PCs.

Gaming took off aggressively in the 1980s, with several non-IBM PCs gaining popularity due to advanced sound and graphical capabilities. At that time, video game manufacturers and developers started out on these platforms before porting their games to more common PCs and other platforms such as Apple's.

Custom-built gaming computers became increasingly popular in 2012, allowing more flexibility in budgets, control, and upgrades. Basic components required when assembling a gaming computer, such as the motherboard, memory, video cards, solid-state drives, and CPUs, are optimized for performance by gaming enthusiasts, who turn to independent benchmarks during hardware selection. Such benchmarks include ratings for PC components, helping to protect equipment from built-in hazards like excessive heat output.

ABSTRACT

Seamless high-speed Internet access is the expectation of recent technology trends. While technologies such as High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), and Long Term Evolution (LTE) are promising and largely meet those expectations, a 'digital divide' still exists when it comes to reaching rural areas in a seamless and cost-effective way.

One solution to this situation is to channel broadband Internet over the electricity supply, so that networking is carried out on the power mains. Distributing Internet data over power lines is known as Broadband over Power Lines (BPL); the related in-home technology is called HomePlug.

Electric Broadband is an innovation among recent technology trends. It is an encouraging, infrastructure-cost-effective model for offering high-speed broadband Internet access, with the potential to penetrate even rural areas, since nearly every home in the world is served by power lines.

INTRODUCTION

The communications landscape has changed rapidly since the inception of the Internet. Broadband Internet, as commonly understood, is a data transmission mechanism over high-bandwidth channels, through cables or over the air. The dominant wireline broadband technology is Asymmetric Digital Subscriber Line (ADSL), while the emerging wireless broadband technologies are Mobile WiMAX and LTE-Advanced. However, all of these technologies require substantial infrastructure investment to serve the general public. They are therefore mostly limited to urban areas, and the digital divide persists, with the Internet still failing to reach the masses in rural geographies.

WHAT IS ELECTRIC BROADBAND?

In contrast to these technology barriers, an innovative technology called 'Electric Broadband' promises to reach even rural areas with little additional infrastructure cost by carrying Internet data over relatively medium- to high-frequency electric signals. ADSL works on a similar principle: low-frequency electric signals carry ordinary phone calls while higher-frequency signals carry Internet data, and electronic filters separate the two, routing the low frequencies to your telephone and the higher frequencies to your Internet modem. The principle behind Electric Broadband is equally simple: because electrical power occupies only the low-frequency portion of a power line's spectrum, data packets can be streamed over the higher frequencies.
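The frequency separation described above can be demonstrated numerically. This is an illustrative sketch only: a 50 Hz "mains" waveform and a 5 kHz "data" carrier share one line, and a simple FFT-based high-pass filter recovers the data band. Real BPL couplers use analog filters and far higher carrier frequencies; the numbers here are arbitrary.

```python
import numpy as np

fs = 20_000                                  # sample rate in Hz
t = np.arange(fs) / fs                       # one second of samples
power = np.sin(2 * np.pi * 50 * t)           # mains waveform at 50 Hz
data = 0.1 * np.sin(2 * np.pi * 5_000 * t)   # data carrier at 5 kHz
line = power + data                          # both signals on the same wire

def high_pass(signal, cutoff_hz, fs):
    """Zero every frequency bin below cutoff_hz and invert the FFT."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    spectrum[freqs < cutoff_hz] = 0
    return np.fft.irfft(spectrum, n=len(signal))

recovered = high_pass(line, 1_000, fs)
# The 50 Hz mains component is gone; what remains matches the carrier.
print(np.max(np.abs(recovered - data)))
```

Because the two signals occupy disjoint frequency bands, the filter separates them almost perfectly, which is exactly why power delivery and data transmission can coexist on one wire.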

HOW ELECTRIC BROADBAND WORKS

The key technical concept behind Electric Broadband data transmission is radio frequency (RF) energy bundled onto the same line that carries electric current. Because the RF signal and the electricity oscillate at different frequencies, the two do not interfere, and packets transmitted over RF are not lost to the electrical current. An Electric Broadband system uses only part of the complete power grid. Power generating plants transmit power to substations, which distribute the current over high-voltage transmission lines of 155,000 to 765,000 volts; these lines are not suitable for packet or RF transmission. The Electric Broadband solution is therefore to bypass the substations and high-voltage wires and concentrate on the medium-voltage lines, which typically carry around 7,200 volts, before transformers step the current down to the 240 volts supplied to households.

Put more simply, standard fiber optic lines, designed specifically for Internet transmission, carry the data to the medium-voltage lines. Repeaters are installed at these junction points to regenerate the data and boost the strength of the transmission. Couplers, specialized bypass devices, are installed at the transformers to provide a data link around them. From there, the digital data is carried down the 240-volt line to the residential or office building's electrical outlets, which become the final distribution points for the data.
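The role of the repeaters in that path can be illustrated with a toy model. Each medium-voltage segment attenuates the signal; a repeater restores it to full strength before the next hop. The attenuation figures and the usability threshold below are invented purely for illustration:

```python
# Toy model of the BPL signal path: lossy line segments, with optional
# repeaters that regenerate the signal after each segment.
FULL_STRENGTH = 1.0
MIN_USABLE = 0.2  # below this, assume the data is unrecoverable

def traverse(segment_losses, repeater_after_each=True):
    """Return signal strength at the far end, or 0.0 if the data is lost."""
    strength = FULL_STRENGTH
    for loss in segment_losses:
        strength *= (1 - loss)          # segment attenuates the signal
        if strength < MIN_USABLE:
            return 0.0                  # lost before reaching the outlet
        if repeater_after_each:
            strength = FULL_STRENGTH    # repeater regenerates the data
    return strength

losses = [0.6, 0.6, 0.6]  # three lossy medium-voltage segments
print(traverse(losses, repeater_after_each=True))   # → 1.0
print(traverse(losses, repeater_after_each=False))  # → 0.0
```

Without regeneration the signal falls below the usable threshold after two hops, which is why repeaters sit at the fiber-to-medium-voltage junction points.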

At this juncture, residents and enterprises have two options for Internet connectivity. They can use wireless transmitters that receive the signal and relay the data to computer stations, or they can use Broadband over Power Lines modems for data filtering: the modem screens out power line noise, lets only the data through, and then sends it onward to the stations. The wireless transmitter or Electric Broadband modem can reach end-user computer stations either wirelessly (which may require WLAN-capable devices) or over wires (which requires computers connected to the transmitter or Electric Broadband modem via Ethernet cables).

TECHNOLOGY BENEFITS & BUSINESS CASE

Because electricity is so widely spread across the global landscape, including rural areas, Electric Broadband stands to be a penetrating technology, reaching rural areas and breaking the digital divide in the communications space.

Many benefits can be foreseen from deploying this technology. It is affordable: it uses existing electrical wiring and outlets, avoiding expensive data cabling pulls and saving up to 75% of the infrastructure spend. It is very convenient for end-users, since every electric outlet in every room becomes Internet-enabled. It is easy to use: no software is necessary, simply "plug and play." And the technology is reliable, unlike wireless solutions that suffer from hit-and-miss coverage; it offers universal coverage with data transmission speeds of up to 6 million bits per second.
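To put the quoted 6 Mbit/s figure in perspective, a quick calculation shows what it means for a download. The 30 MB file size is an example of my choosing, and real throughput would be somewhat below the raw line rate:

```python
# How long would a 30 MB file take at the quoted 6 Mbit/s line rate?
line_rate_bits_per_s = 6_000_000           # 6 million bits per second
file_size_bytes = 30 * 1_000_000           # 30 MB (decimal megabytes)
file_size_bits = file_size_bytes * 8       # 8 bits per byte

seconds = file_size_bits / line_rate_bits_per_s
print(f"{seconds:.0f} seconds")  # → 40 seconds
```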

One of the strongest business cases is power grid management, which becomes far more effective once Electric Broadband is realized. Utilities can manage their systems better by having telemetry streamed to them over the power lines themselves. Because this benefit bears directly on the management of electricity, there is a high likelihood of electric utilities investing more money in Electric Broadband. Being able to monitor the electricity grid over the grid's own network would create a virtual workforce and require far fewer man-hours.

TECHNOLOGY CHALLENGES

While this technology has many advantages, there are challenges as well. RF interference is the most serious challenge it currently faces. The technology is opposed by amateur (ham) radio operators and the Federal Emergency Management Agency (FEMA), who are concerned that Electric Broadband will reduce the number of radio frequencies available to ham and short-wave radio operators, and that RF transmission over unshielded medium-voltage lines will interfere with already-assigned frequencies. Another challenge is the considerable delay in ratifying technology standards: transmission standards for Electric Broadband are still emerging, with draft versions yet to be released, which further hampers adoption of the technology by service providers.

CONCLUSION

On a final note, widespread Electric Broadband is still at least two years away. However, Google research into the vendors involved suggests that Electric Broadband activity is already running at roughly $10 million annually, and the technology serves a much larger audience than any of its competitors. With that kind of potential, it should be able to sustain a growth rate two to three times that of either the cable or telephone companies.

Having your handheld device repaired can be better and more cost-effective than paying your mobile phone insurance deductible. If you look at the price of a mobile device out of contract, you'll see that these little devices aren't cheap at all; many of them cost over $500 USD! That isn't the price you paid for your phone, though, is it? That's because the stores that sell mobile phones get paid for every contract they sign you up for, and if you terminate your contract early they'll still get their money. Handheld device insurance can be a good investment if you lose your device, but with sites like eBay, Craigslist, and Amazon, it can be cheaper to get a replacement handheld device than to make an insurance claim. So, is cell phone repair worth it?

Did you break your digitizer by dropping your phone? Every day, people drop their phones on the pavement, in the toilet, and in places that are way beyond me. Getting your digitizer repaired by a cell phone repair specialist will cost you under $120 USD on an iPhone 4. The cost to have your insurance replace your iPhone with a refurbished device is $180 for an iPhone 4 if your insurance is through Asurion. That doesn't include the monthly premium you've been paying through your carrier, and they make money off of that too; I've found that the average monthly premium is around $10 USD, even for an iPhone 4. A little bit of elementary math will show you that having your digitizer replaced by a mobile device repair specialist is cheaper, and a broken screen is something Apple will not cover under their warranty.
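That "elementary math" works out as follows, using the figures above: a $120 out-of-pocket repair versus a $180 insurance replacement fee on top of a $10/month premium over a year of coverage:

```python
# Repair vs. insurance, using the article's iPhone 4 figures.
repair_cost = 120          # USD, specialist digitizer repair
replacement_fee = 180      # USD, insurance replacement fee
monthly_premium = 10       # USD per month
months_insured = 12        # one year of coverage

insurance_total = replacement_fee + monthly_premium * months_insured
print(insurance_total)                 # → 300
print(insurance_total - repair_cost)   # → 180: repair saves $180
```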

I know you must be thinking that mobile device insurance is a rip-off, and it can be, depending on how you view the situation. These phone insurance companies buy broken phones in bulk and repair them, then ship one of those refurbished phones to you. I have taken apart a couple of those refurbished phones, and some are missing screws, show signs of water damage, and the list goes on. If you lost your cell phone, you could get a used one from one of the sites mentioned above for about the same price as your deductible. With handheld device repair becoming more prominent in larger cities, it will become easier to find a cell phone repair center near you. A lot of these device repair agencies seem to pop up overnight, though, so be careful about who you choose.

Do your research before you need this type of service, so you aren't scrambling to pick a company. I would follow these rules when choosing a mobile device repair company. Do they list their prices on their website? If they do, they are less likely to change their pricing on a regular basis, and they should know their market. Do they offer a warranty? Most of the handheld device repair companies I have found offer a warranty of at least 90 days. Do they have parts in stock? Any company that has been around for a while will have parts in stock for the more popular phones they service, because no one wants to wait. Do they take mail-in phones? The strongest of these companies are ready to accept phones from anywhere in the world, and most of the time they can get your phone back to you in less time than your insurance can.

We can all hope that we never drop our phones, run them over, or take them for a swim. Honestly, though, the chances of that happening are greater than your finding $5 on the sidewalk. We may all need a great cell phone repair service one day, but we don't all need cell phone insurance. It is largely a waste of money, even though it is only $10 a month: ten dollars a month over a year is the same as the price of having your screen replaced. And if you're a habitual cell phone fumble-and-miss offender, get an OtterBox!