9P

What does 9P mean?

9P is a network protocol developed at Bell Labs that connects the components of a Plan 9 system. Plan 9 is a distributed operating system designed as a research platform. It represents all system interfaces through the file system: files are the key objects, used to represent windows, network connections, processes and user interfaces.

This term is also known as Plan 9 File System Protocol, 9P2000 or Styx.

The 9P protocol provides a means to access and manipulate resources and applications transparently in a distributed environment. It carries messages between clients and servers: the client transmits requests as T-messages, and the server replies with R-messages. One request together with its reply is known as a transaction. Each message type corresponds to a protocol entry point that any 9P server must implement.
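As a rough sketch of what a T-message looks like on the wire, the snippet below packs a 9P2000 Tversion request (the message a client sends first to negotiate the protocol version). The field layout — a 4-byte little-endian size, 1-byte type, 2-byte tag, then type-specific fields — follows the 9P2000 specification; the type code 100 for Tversion and the NOTAG value 0xFFFF are taken from the Plan 9 manual pages.

```python
import struct

TVERSION = 100   # 9P2000 message type code for Tversion
NOTAG = 0xFFFF   # version messages use the special "no tag" value

def pack_tversion(msize: int, version: str = "9P2000") -> bytes:
    """Pack a Tversion request: size[4] type[1] tag[2] msize[4] version[s].

    'size' counts the whole message including itself; strings ('s' fields)
    are a 2-byte length followed by UTF-8 bytes. All integers little-endian.
    """
    v = version.encode("utf-8")
    body = struct.pack("<BHI", TVERSION, NOTAG, msize)
    body += struct.pack("<H", len(v)) + v
    return struct.pack("<I", 4 + len(body)) + body

msg = pack_tversion(8192)
# 4 (size) + 1 (type) + 2 (tag) + 4 (msize) + 2 + 6 ("9P2000") = 19 bytes
```

A server answering this request would send back an Rversion (type 101) with the same layout, confirming or lowering the negotiated msize.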


Sandboxing

What does Sandboxing mean?

Sandboxing is a computer security term for isolating a program from other programs in a separate environment, so that if errors or security issues occur, they will not spread to other areas of the computer. Programs run in their own sequestered area, where they can be worked on without posing a threat to other programs.

Sandboxes can look like a regular operating environment, or they can be much more bare bones. Virtual machines are often used for what are referred to as runtime sandboxes.

There are also ways to use sandboxing within applications. For example, questionable code can be executed safely using proof-carrying code: a machine-checkable "proof" must accompany the code to establish that it is safe to execute. This is loosely analogous to the keys used in encryption to establish a trusted exchange channel.
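A very minimal runtime sandbox can be sketched by running untrusted code in a separate process with operating-system resource limits. The example below is a crude illustration only, assuming a Unix-like system (preexec_fn and the resource module are POSIX-only); a real sandbox would also restrict file system and network access.

```python
import resource
import subprocess
import sys

def limit_resources():
    # Runs in the child just before exec: cap CPU time at 2 seconds so a
    # runaway loop in the sandboxed code is killed by the kernel.
    resource.setrlimit(resource.RLIMIT_CPU, (2, 2))

def run_sandboxed(code: str) -> str:
    """Run untrusted Python code in a separate, resource-limited process."""
    proc = subprocess.run(
        [sys.executable, "-I", "-c", code],  # -I: isolated mode, ignores env and site dirs
        capture_output=True, text=True,
        timeout=5,                            # wall-clock backstop in the parent
        preexec_fn=limit_resources,           # Unix-only hook
    )
    return proc.stdout
```

Even if the child crashes or exhausts its CPU allowance, the failure stays inside the child process; the parent simply observes a non-zero exit status.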


AMD Virtualization

What does AMD Virtualization (AMD-V) mean?

AMD virtualization (AMD-V) is a virtualization technology developed by Advanced Micro Devices.

AMD-V technology takes some of the tasks that virtual machine managers perform through software emulation and simplifies them through enhancements to the processor's instruction set. In other words, it uses hardware to do work that virtual machine managers would otherwise do in software, by incorporating virtualization extensions into the processor's instruction set.


Streams And Iteration In A Single Assignment Language

What does Streams And Iteration In A Single Assignment Language (SISAL) mean?

Streams and Iteration in a Single Assignment Language (SISAL) is a single-assignment functional programming language featuring strict semantics, efficient array handling and implicit parallelism. SISAL compilers can output a dataflow graph in Intermediate Form 1 (IF1).

The name was chosen by searching the Unix dictionary file /usr/dict/words for a word containing "sal," an abbreviation of "Single Assignment Language."

In 1983, James McGraw and colleagues at the University of Manchester, Colorado State University, Lawrence Livermore National Laboratory (LLNL) and Digital Equipment Corporation (DEC) defined SISAL. The language was first revised in 1985, and the first compiled implementation appeared in 1986. Thanks to its automatic, highly effective parallelization, SISAL's performance is competitive with C and FORTRAN.


Intercloud

What does Intercloud mean?

Intercloud is a term used in IT to refer to a theoretical model for cloud computing services. The idea of the intercloud relies on models that have already been shown to be effective in cases like the global Internet and the 3G and 4G wireless networks of various national telecom providers.

Experts sometimes refer to the intercloud as a cloud of clouds.

The idea behind an intercloud is that a single common functionality would combine many different individual clouds into one seamless mass in terms of on-demand operations. To understand how this works, it’s helpful to think about how existing cloud computing setups are designed.


Predictive Dialer

What does Predictive Dialer mean?

Predictive dialers are outbound call processing systems that are designed to maintain high levels of activity and provide cost efficiency in contact centers. These dialers are capable of calling a list of telephone numbers automatically, screening unnecessary calls such as answering machines and busy signals, and connecting waiting representatives with customers.

These software-based solutions help companies avoid expensive telephony boards and other associated hardware, which carry high maintenance costs. Predictive dialers are easy to install and configure, and are therefore widely used in telemarketing, payment collection, service follow-ups, surveys and appointment confirmation.
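The "predictive" part of a predictive dialer is its pacing algorithm: because only a fraction of dialed numbers reach a live person, the dialer launches more calls than there are free agents. The sketch below illustrates one simple pacing rule; the function name, the overdial factor and the exact formula are illustrative assumptions, not a description of any particular product.

```python
import math

def calls_to_launch(agents_free: int, answer_rate: float,
                    overdial: float = 1.0) -> int:
    """Estimate how many numbers to dial so that, given the historical
    answer rate, roughly one live answer arrives per free agent.

    A higher overdial factor raises agent utilization at the cost of more
    abandoned calls (answered calls with no agent available).
    """
    if agents_free <= 0 or answer_rate <= 0:
        return 0
    return math.floor(agents_free * overdial / answer_rate)

# With 5 free agents and a 25% historical answer rate, dial about 20 numbers.
```

Real dialers continuously re-estimate the answer rate and average call length, and regulators typically cap the permissible abandoned-call rate, which bounds how aggressive the overdial factor may be.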


Electronic Data Capture

What does Electronic Data Capture (EDC) mean?

Electronic data capture (EDC) is the computerized collection and management of clinical trial data from patients and subjects. An EDC system uses technology to streamline the collection and transmission of clinical trial data from the patient to the research laboratory. The process reduces data errors to provide researchers with improved data quality. In addition, it speeds up the entire clinical trial process, thus reducing research costs.

EDC solutions are widely used for clinical trials and research purposes by clinical research organizations, biotechnology and pharmaceutical industries, and also for safety surveillance activities.

Clinical trial data may first be recorded on paper at the source and then entered into the electronic case report form (eCRF), or fed directly into the eCRF. Another entry route is the interactive voice response (IVR) system, in which the patient reports information over the telephone or at a point-of-contact data collection system. Patient-entered data of this kind is known as electronic patient-reported outcomes (ePRO) and may also be captured using devices such as tablets or digital pens.
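One reason EDC reduces data errors is that the eCRF can validate entries at capture time, instead of catching problems during later manual review. The sketch below shows the idea with a hypothetical schema (the field names and ranges are invented for illustration, not from any real trial protocol):

```python
def validate_ecrf_record(record: dict, schema: dict) -> list:
    """Return validation errors for one eCRF record.

    schema maps field name -> (required, allowed (low, high) range or None).
    """
    errors = []
    for field, (required, value_range) in schema.items():
        value = record.get(field)
        if value is None:
            if required:
                errors.append(f"missing required field: {field}")
            continue
        if value_range is not None and not (value_range[0] <= value <= value_range[1]):
            errors.append(f"{field}={value} outside range {value_range}")
    return errors

# Hypothetical schema: subject ID is free-form, blood pressure is range-checked.
SCHEMA = {
    "subject_id": (True, None),
    "systolic_bp": (True, (70, 250)),
}
```

Flagging an out-of-range or missing value the moment it is entered lets site staff correct it while the patient is still present, which is a large part of the quality and speed gain described above.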


Beer and Pretzels

What does Beer and Pretzels mean?

Beer and pretzels is a slang term used to refer to a video game that is considered to be easy in terms of strategy and rules, but is still entertaining. Beer-and-pretzel games are designed to be completed in a short time (less than a day) and generally involve multiple players. These games get their name from the custom of drinking beer and eating pretzels while taking part in such games.

Beer-and-pretzel video games marked a sharp departure from the previous trend in which games were getting longer and more immersive as technological capabilities increased. Immersive games still make up the majority of games produced, but beer-and-pretzel games have become a popular subgenre within gaming. Moreover, some immersive games have introduced beer-and-pretzel elements so they can be enjoyed in groups or by a single player. These elements include time trials, mini-games, kill-count challenges and so on.


Beehive Forum

What does Beehive Forum mean?

Beehive Forum is an open-source software tool for creating Web forums. It is coded in PHP and uses a MySQL database. Its design was, in some ways, based on Delphi programming concepts. It requires PHP 4.1.0 and MySQL 3.5.

One unique aspect of Beehive Forum is how it places the components of a forum on the screen. In Beehive Forum, discussion titles are on the left and posts are on the right. This allows for a more expansive view of individual posts, with their own tags and avatars, etc.


Big Data

What does Big Data mean?

Big data refers to a processing approach used when traditional data mining and handling techniques cannot uncover the insights and meaning in the underlying data. Data that is unstructured, time sensitive or simply very large cannot be processed by relational database engines. This type of data requires a different approach, called big data, which applies massive parallelism on readily available hardware.

Quite simply, big data reflects the changing world we live in. The more things change, the more the changes are captured and recorded as data. Take weather as an example. For a weather forecaster, the amount of data collected around the world about local conditions is substantial. Logically, it would make sense that local environments dictate regional effects and regional effects dictate global effects, but it could well be the other way around.
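The "massive parallelism" mentioned above usually takes the map/reduce form: split the data into independent chunks, process each chunk separately, then combine the partial results. The sketch below shows the pattern on a toy word count; here the map step runs sequentially, but because each chunk is independent, a big data engine can run the same maps on thousands of machines.

```python
from collections import Counter
from functools import reduce

def map_count(chunk: list) -> Counter:
    """Map step: count words in one chunk. Each chunk is independent,
    so in a real system every map can run on a separate worker."""
    return Counter(word for line in chunk for word in line.split())

def word_count(lines: list, n_chunks: int = 4) -> Counter:
    """Split-apply-combine word count in the map/reduce style."""
    size = max(1, len(lines) // n_chunks)
    chunks = [lines[i:i + size] for i in range(0, len(lines), size)]
    partials = map(map_count, chunks)                 # parallelizable maps
    return reduce(lambda a, b: a + b, partials, Counter())  # combine step
```

Because the combine step (merging counters) is associative, partial results can themselves be merged in parallel, which is what lets this pattern scale across clusters of commodity hardware.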