Search results for: graphical user interference
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2906

2006 Study and Analysis of Optical Intersatellite Links

Authors: Boudene Maamar, Xu Mai

Abstract:

Optical Intersatellite Links (OISLs) are wireless communication links that use optical signals to interconnect satellites. They are expected to form the next generation of wireless communication technology because of their inherent characteristics: increased bandwidth, high data rates, secure data transmission, immunity to interference, and an unregulated spectrum. Optical space links are a better choice than classical communication schemes owing to their distinctive properties of high carrier frequency, small antenna diameter, and low transmitted power, which are critical factors in defining a space communication link. This paper discusses the development of free-space optical technology and analyses the parameters and factors required to establish a reliable intersatellite link that uses an optical signal to exchange data between satellites.
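
As an illustration of the kind of link analysis the abstract refers to, the sketch below estimates the received optical power of an intersatellite link from transmit power, telescope diameters, wavelength, and range. The numerical values and efficiency factors are assumptions chosen only for demonstration, not parameters from the paper.

```python
import math

def fso_link_budget(p_tx_w, d_tx_m, d_rx_m, wavelength_m, range_m,
                    eta_tx=0.8, eta_rx=0.8, pointing_loss_db=1.0):
    """Rough free-space optical link budget for idealised diffraction-limited optics."""
    # Telescope gains for diffraction-limited apertures
    g_tx = (math.pi * d_tx_m / wavelength_m) ** 2
    g_rx = (math.pi * d_rx_m / wavelength_m) ** 2
    # Free-space path loss factor
    l_fs = (wavelength_m / (4 * math.pi * range_m)) ** 2
    p_rx_w = p_tx_w * eta_tx * eta_rx * g_tx * g_rx * l_fs
    return 10 * math.log10(p_rx_w * 1e3) - pointing_loss_db  # received power in dBm

# Assumed example: 1 W laser at 1550 nm, 10 cm telescopes, 4000 km link
print(f"received power: {fso_link_budget(1.0, 0.10, 0.10, 1550e-9, 4000e3):.1f} dBm")
```

The result (on the order of -30 dBm for these assumed values) shows why small apertures and low transmit power can still close an optical link over thousands of kilometres.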

Keywords: optical intersatellite links, optical wireless communications, free space optical communications, next generation wireless communication

Procedia PDF Downloads 437
2005 Experimenting with Error Performance of Systems Employing Pulse Shaping Filters on a Software-Defined-Radio Platform

Authors: Chia-Yu Yao

Abstract:

This paper presents experimental results on testing the symbol-error-rate (SER) performance of quadrature amplitude modulation (QAM) systems employing symmetric pulse-shaping square-root (SR) filters designed by minimizing the roughness function and by minimizing the peak-to-average power ratio (PAR). The device used in the experiments is the 'bladeRF' software-defined-radio platform. PAR is a well-known measurement, whereas the roughness function is a concept for measuring the jitter-induced interference. The experimental results show that the system employing minimum-roughness pulse-shaping SR filters outperforms the system employing minimum-PAR pulse-shaping SR filters in the sense of SER performance.
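
The roughness function is specific to the authors' filter design and is not reproduced here, but PAR is a standard quantity. Below is a minimal sketch of how the peak-to-average power ratio of a pulse-shaped QAM baseband signal could be estimated; the filter taps are a placeholder standing in for a designed square-root pulse-shaping filter.

```python
import numpy as np

def par_db(signal):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    power = np.abs(signal) ** 2
    return 10 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(0)
n_sym, sps = 10_000, 8                      # symbols and samples per symbol

# Random 16-QAM symbols (unit average power)
levels = np.array([-3, -1, 1, 3])
sym = (rng.choice(levels, n_sym) + 1j * rng.choice(levels, n_sym)) / np.sqrt(10)

# Upsample and apply a pulse-shaping filter; 'taps' stands in for the
# designed square-root filter (here just a truncated sinc for illustration).
up = np.zeros(n_sym * sps, dtype=complex)
up[::sps] = sym
t = np.arange(-4 * sps, 4 * sps + 1) / sps
taps = np.sinc(t)
shaped = np.convolve(up, taps, mode="same")

print(f"PAR of shaped signal: {par_db(shaped):.2f} dB")
```

Swapping in two different designed SR filters and comparing the resulting PAR values (and, on hardware, the measured SER) mirrors the comparison reported in the abstract.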

Keywords: pulse-shaping filters, FIR filters, jittering, QAM

Procedia PDF Downloads 330
2004 Enhanced Model for Risk-Based Assessment of Employee Security with Bring Your Own Device Using Cyber Hygiene

Authors: Saidu I. R., Shittu S. S.

Abstract:

As the trend of personal devices accessing corporate data continues to rise through Bring Your Own Device (BYOD) practices, organizations recognize the potential cost reduction and productivity gains. However, the associated security risks pose a significant threat to these benefits. Often, organizations adopt BYOD environments without fully considering the vulnerabilities introduced by human factors in this context. This study presents an enhanced assessment model that evaluates the security posture of employees in BYOD environments using cyber hygiene principles. The framework assesses users' adherence to best practices and guidelines for maintaining a secure computing environment, employing scales and the Euclidean distance formula. By utilizing this algorithm, the study measures the distance between users' security practices and the organization's optimal security policies. To facilitate user evaluation, a simple and intuitive interface for automated assessment is developed. To validate the effectiveness of the proposed framework, design science research methods are employed, and empirical assessments are conducted using five artifacts to analyze user suitability in BYOD environments. By addressing the human factor vulnerabilities through the assessment of cyber hygiene practices, this study aims to enhance the overall security of BYOD environments and enable organizations to leverage the advantages of this evolving trend while mitigating potential risks.
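
A minimal sketch of the distance idea described above is shown below: each user's cyber-hygiene practices are scored on a common scale and compared with the organization's optimal policy vector via the Euclidean distance. The practice names and scores are hypothetical illustrations, not the study's actual artifacts.

```python
import math

# Hypothetical cyber-hygiene dimensions, scored 0 (never) to 5 (always)
optimal_policy = {"password_strength": 5, "device_encryption": 5,
                  "os_updates": 5, "public_wifi_avoidance": 4,
                  "screen_lock": 5}

user_practices = {"password_strength": 3, "device_encryption": 5,
                  "os_updates": 2, "public_wifi_avoidance": 1,
                  "screen_lock": 4}

def hygiene_distance(user, policy):
    """Euclidean distance between a user's practices and the optimal policy."""
    return math.sqrt(sum((user[k] - policy[k]) ** 2 for k in policy))

d = hygiene_distance(user_practices, optimal_policy)
max_d = math.sqrt(sum(v ** 2 for v in optimal_policy.values()))  # worst case: all zeros
print(f"distance = {d:.2f}, normalised risk score = {d / max_d:.2f}")
```

A larger normalised score indicates a user whose practices sit further from the organization's optimal security policy and who therefore poses a higher BYOD risk.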

Keywords: security, BYOD, vulnerability, risk, cyber hygiene

Procedia PDF Downloads 61
2003 Importance of Mathematical Modeling in Teaching Mathematics

Authors: Selahattin Gultekin

Abstract:

Today, in engineering departments, mathematics courses such as calculus, linear algebra, and differential equations are generally taught by mathematicians. Consequently, during mathematicians' classroom teaching there are few or no applications of the concepts to real-world problems. Most of the time, students do not know whether the concepts or rules taught in these courses will be used extensively in their majors or not. This situation holds true for all engineering and science disciplines, and the general attitude toward these mathematics courses is not positive. The real-life application of mathematics will be appreciated by students when the mathematical modeling of real-world problems is tackled; students do not like abstract mathematics, and instead prefer a solid application of the concepts to daily-life problems. The author strongly recommends that mathematical modeling be taught starting in high schools all over the world. In this paper, some mathematical concepts such as the limit, derivative, integral, Taylor series, differential equations, and the mean value theorem are chosen, and their applications to real problems, with graphical representations, are emphasized.
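
As one illustrative example of the kind of real-world application advocated here (not taken from the paper itself), the derivative and a first-order Taylor expansion turn the exact pendulum equation into the linear model actually used in engineering practice:

```latex
\ddot{\theta} + \frac{g}{L}\sin\theta = 0
\;\longrightarrow\;
\ddot{\theta} + \frac{g}{L}\,\theta = 0
\quad\text{(using } \sin\theta \approx \theta\text{, the first-order Taylor term)},
\qquad
\theta(t) = \theta_0 \cos\!\Big(\sqrt{\tfrac{g}{L}}\,t\Big),
\qquad
T = 2\pi\sqrt{\tfrac{L}{g}}.
```

Here the limit (small angles), the derivative, the Taylor series, and a differential equation work together to predict the period of a real pendulum, which students can verify with a stopwatch.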

Keywords: applied mathematics, engineering mathematics, mathematical concepts, mathematical modeling

Procedia PDF Downloads 308
2002 Tetracycline as Chemosensor for Simultaneous Recognition of Al³⁺: Application to Bio-Imaging for Living Cells

Authors: Jesus Alfredo Ortega Granados, Pandiyan Thangarasu

Abstract:

The antibiotic tetracycline occurs as a micro-contaminant in fresh water, wastewater, and soils, causing environmental and health problems. In this work, tetracycline (TC) has been employed as a chemosensor for the recognition of Al³⁺ without interference from other ions, and the results show that it enhances the fluorescence intensity for Al³⁺ while no interference is observed from other coexisting cations (Cd²⁺, Ni²⁺, Co²⁺, Sr²⁺, Mg²⁺, Fe³⁺, K⁺, Sm³⁺, Ag⁺, Na⁺, Ba²⁺, Zn²⁺, and Mn²⁺). Upon addition of Cu²⁺ to [TC-Al³⁺], the fluorescence intensity appears to be quenched. Other combinations of metal ions added to TC do not change the fluorescence behavior. The stoichiometry of the interaction of TC with Al³⁺, determined by Job's plot, was found to be 1:1. Importantly, the detection of Al³⁺ was successfully employed in real samples such as living cells, and TC was found to perform efficiently as a fluorescent probe for the Al³⁺ ion in living systems, especially in Saccharomyces cerevisiae; this was confirmed by confocal laser scanning microscopy.

Keywords: chemo-sensor, recognition of Al³⁺ ion, Saccharomyces cerevisiae, tetracycline

Procedia PDF Downloads 172
2001 Adaptive Filtering in Subbands for Supervised Source Separation

Authors: Bruna Luisa Ramos Prado Vasques, Mariane Rembold Petraglia, Antonio Petraglia

Abstract:

This paper investigates MIMO (Multiple-Input Multiple-Output) adaptive filtering techniques for the application of supervised source separation in the context of convolutive mixtures. From the observation that there is correlation among the signals of the different mixtures, an improvement in the NSAF (Normalized Subband Adaptive Filter) algorithm is proposed in order to accelerate its convergence rate. Simulation results with mixtures of speech signals in reverberant environments show the superior performance of the proposed algorithm with respect to the performances of the NLMS (Normalized Least-Mean-Square) and conventional NSAF, considering both the convergence speed and SIR (Signal-to-Interference Ratio) after convergence.
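
For readers unfamiliar with the baseline algorithms compared above, the sketch below shows the fullband NLMS update that both NSAF and the proposed algorithm build on; in NSAF the same normalised update is applied independently in each subband after analysis filtering. The signals here are synthetic placeholders, not the speech mixtures from the paper.

```python
import numpy as np

def nlms(x, d, num_taps=32, mu=0.5, eps=1e-8):
    """Fullband NLMS adaptive filter: adapts w so that w * x approximates d."""
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        x_vec = x[n - num_taps + 1:n + 1][::-1]          # most recent samples first
        y = w @ x_vec                                    # filter output
        e[n] = d[n] - y                                  # error signal
        w += mu * e[n] * x_vec / (x_vec @ x_vec + eps)   # normalised update
    return w, e

rng = np.random.default_rng(1)
x = rng.standard_normal(20_000)
h_true = rng.standard_normal(32) * np.exp(-0.2 * np.arange(32))  # unknown system
d = np.convolve(x, h_true)[:len(x)]
w, e = nlms(x, d)
print("final MSE:", np.mean(e[-2000:] ** 2))
```

The subband variants split x and d with an analysis filter bank, run this normalised update per subband at a decimated rate, and recombine the adapted weights, which is what accelerates convergence for correlated (e.g., speech) inputs.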

Keywords: adaptive filtering, multi-rate processing, normalized subband adaptive filter, source separation

Procedia PDF Downloads 420
2000 Single Carrier Frequency Domain Equalization Design to Cope with Narrow Band Jammer

Authors: So-Young Ju, Sung-Mi Jo, Eui-Rim Jeong

Abstract:

In this paper, based on the conventional single carrier frequency domain equalization (SC-FDE) structure, we propose a new SC-FDE structure to cope with a narrowband jammer. In the conventional SC-FDE structure, channel estimation is performed in the time domain. When a narrowband jammer exists, time-domain channel estimation is very difficult because of the high-power jamming interference, which degrades receiver performance. To relieve this problem, a new SC-FDE frame is proposed to enable channel estimation under narrowband jamming environments. The proposed modified SC-FDE structure performs channel estimation in the frequency domain, and its performance is verified via computer simulation.
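
To make the receiver structure concrete, the following sketch shows the core SC-FDE operation the paper builds on: a cyclic-prefixed block is transformed with an FFT, equalised with one complex tap per bin, and transformed back. The channel and the per-bin MMSE equaliser are simulated placeholders; the paper's actual contribution, frequency-domain channel estimation under jamming, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
N, cp = 64, 16
h = np.array([1.0, 0.5, 0.2]) * np.exp(1j * rng.uniform(0, 2 * np.pi, 3))  # toy channel
snr_db = 20

# Transmit one QPSK block with cyclic prefix
bits = rng.integers(0, 2, (N, 2))
s = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
tx = np.concatenate([s[-cp:], s])

# Channel plus AWGN
rx = np.convolve(tx, h)[:len(tx)]
noise_var = 10 ** (-snr_db / 10)
rx += np.sqrt(noise_var / 2) * (rng.standard_normal(len(rx)) + 1j * rng.standard_normal(len(rx)))

# SC-FDE receiver: remove CP, FFT, one-tap MMSE equalisation per bin, IFFT
r = rx[cp:cp + N]
H = np.fft.fft(h, N)                 # assumes the channel is known (estimation not shown)
R = np.fft.fft(r)
s_hat = np.fft.ifft(R * np.conj(H) / (np.abs(H) ** 2 + noise_var))
print("symbol MSE after equalisation:", np.mean(np.abs(s_hat - s) ** 2))
```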

Keywords: channel estimation, jammer, pilot, SC-FDE

Procedia PDF Downloads 466
1999 A Parallel Implementation of Artificial Bee Colony Algorithm within CUDA Architecture

Authors: Selcuk Aslan, Dervis Karaboga, Celal Ozturk

Abstract:

The Artificial Bee Colony (ABC) algorithm is one of the most successful swarm-intelligence-based metaheuristics. It has been applied to a number of constrained and unconstrained numerical and combinatorial optimization problems. In this paper, we present a parallelized version of the ABC algorithm, adapting the employed and onlooker bee phases to the Compute Unified Device Architecture (CUDA) platform, a graphics processing unit (GPU) programming environment from NVIDIA. The execution speed and the results obtained by the proposed approach and by the sequential version of the ABC algorithm are compared on functions that are typically used as benchmarks for optimization algorithms. Tests on standard benchmark functions with different colony sizes and numbers of parameters showed that the proposed parallelization approach decreases the total execution time consumed by the employed and onlooker bee phases and achieves similar or better result quality compared to the standard sequential implementation of the ABC algorithm.
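
The employed/onlooker bee update that the paper parallelises on the GPU is, in serial form, a simple per-dimension perturbation with greedy selection. A minimal CPU sketch of that update is given below (the CUDA kernel itself is not reproduced); the sphere objective is a standard benchmark used only for illustration.

```python
import numpy as np

def sphere(x):
    return np.sum(x ** 2)

def employed_bee_phase(food_sources, fitness, trials, bounds, rng):
    """One ABC employed-bee pass: v_ij = x_ij + phi * (x_ij - x_kj), greedy selection."""
    n, dim = food_sources.shape
    for i in range(n):
        k = rng.choice([idx for idx in range(n) if idx != i])  # random partner source
        j = rng.integers(dim)                                  # random dimension
        phi = rng.uniform(-1, 1)
        candidate = food_sources[i].copy()
        candidate[j] += phi * (food_sources[i, j] - food_sources[k, j])
        candidate[j] = np.clip(candidate[j], *bounds)
        f = sphere(candidate)
        if f < fitness[i]:                                     # greedy selection
            food_sources[i], fitness[i], trials[i] = candidate, f, 0
        else:
            trials[i] += 1

rng = np.random.default_rng(3)
bounds, colony, dim = (-5.0, 5.0), 20, 10
sources = rng.uniform(bounds[0], bounds[1], (colony, dim))
fitness = np.array([sphere(x) for x in sources])
trials = np.zeros(colony, dtype=int)
for _ in range(200):
    employed_bee_phase(sources, fitness, trials, bounds, rng)
print("best objective after 200 passes:", fitness.min())
```

Because each food source is updated independently, the loop over i maps naturally onto one GPU thread per source, which is the parallelisation idea described above.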

Keywords: Artificial Bee Colony algorithm, GPU computing, swarm intelligence, parallelization

Procedia PDF Downloads 364
1998 An Absolute Femtosecond Rangefinder for Metrological Support in Coordinate Measurements

Authors: Denis A. Sokolov, Andrey V. Mazurkevich

Abstract:

In the modern world, there is an increasing demand for highly precise measurements in various fields, such as aircraft, shipbuilding, and rocket engineering. This has resulted in the development of measuring instruments that are capable of measuring the coordinates of objects within a range of up to 100 meters with an accuracy of up to one micron. The calibration process for such optoelectronic measuring devices (trackers and total stations) involves comparing their measurement results to a reference measurement on a linear or spatial basis. The reference used in such measurements could be a reference base or a reference rangefinder (EDM) with the capability to measure angle increments; the base would serve as a set of reference points for this purpose. The concept of an EDM for replicating the unit of measurement has been implemented on a mobile platform, which allows angular changes in the direction of the laser radiation in two planes. To determine the distance to an object, a high-precision interferometer of our own design is employed. The laser radiation travels to corner reflectors, which form a spatial reference with precisely known positions. When the femtosecond pulses from the reference arm and the measuring arm coincide, an interference signal is created, repeating at the frequency of the laser pulses. The distance between reference points determined by the interference signals is calculated in accordance with the recommendations of the International Bureau of Weights and Measures for the indirect measurement of the time of light passage, according to the definition of the meter. This distance is D/2 = c/(2nF), approximately 2.5 meters, where c is the speed of light in a vacuum, n is the refractive index of the medium, and F is the femtosecond pulse repetition frequency. The achieved Type A uncertainty of the measured distance to reflectors located 64 m away (N·D/2, where N is an integer) and spaced 1 m apart from each other does not exceed 5 microns. The angular uncertainty is calculated theoretically, since standard high-precision ring encoders will be used and are not a focus of this study. The Type B uncertainty components are not taken into account either, as the components that contribute most do not depend on the selected coordinate measuring method. This technology is being explored in the context of laboratory applications under controlled environmental conditions, where it is possible to achieve an advantage in terms of accuracy. In general, the EDM tests showed high accuracy, and theoretical calculations and experimental studies on an EDM prototype have shown that the Type A uncertainty of distance measurements to the reflectors can be less than 1 micrometer. The results of this research will be used to develop a highly accurate mobile absolute rangefinder designed for the calibration of high-precision laser trackers and laser rangefinders, as well as other equipment, using a 64 m laboratory comparator as a reference.
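
A quick numerical check of the distance relation quoted above, D/2 = c/(2nF), is sketched below. The repetition frequency and refractive index are assumed values chosen so that the result matches the approximately 2.5 m mentioned in the abstract; they are not the instrument's actual parameters.

```python
c = 299_792_458.0      # speed of light in vacuum, m/s
n = 1.000_27           # assumed refractive index of air
F = 60e6               # assumed femtosecond-laser pulse repetition rate, Hz

half_D = c / (2 * n * F)        # spacing at which reference and measuring pulses coincide
print(f"D/2 = {half_D:.4f} m")  # about 2.5 m for these assumed values

# Distances to reflectors then come in integer multiples N * D/2
for N in (1, 10, 25):
    print(f"N = {N:2d} -> {N * half_D:.3f} m")
```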

Keywords: femtosecond laser, pulse correlation, interferometer, laser absolute range finder, coordinate measurement

Procedia PDF Downloads 47
1997 The Role of Libraries in the Context of Indian Knowledge Based Society

Authors: Sanjeev Sharma

Abstract:

We are living in the information age. Information is important not only to individuals but also to researchers, scientists, academicians, and all others working in their respective fields. The 21st century, also known as the electronic era, has brought several changes to the way libraries operate. In the present scenario, the acquisition of information resources and the implementation of new strategies have brought a revolution in libraries' structures and principles. In the digital era, the role of the library has become more important, as new information arrives every minute and the knowledge society wants to access information at its desk. Libraries are constantly managing electronic services and web-based information sources in a democratic way. The basic objective of every library is to save the user's time, which depends on the quality and user-orientation of its services. With the advancement of information and communication technology, libraries should pay more attention to the development trends of the information society, which would help them adjust their development strategies to the information needs of the knowledge society. The knowledge-based society demands a redefinition of the position and objectives of all institutions that work with information, knowledge, and culture. The situation in the era of digital India is changing at a fast pace. Everyone wants information 24x7, and libraries have been recognized as one of the key elements of open access to information, which is crucial not only to individuals but also to a democratic, knowledge-based information society. Libraries are especially important nowadays, as the whole concept of education focuses more and more on independent e-learning and acting. This is where libraries enter the stage: the essential function of libraries is to acquire, organize, store, preserve, and retrieve for use publicly available material, in print as well as non-print form, in such a way that, when it is needed, it can be found and put to use.

Keywords: knowledge, society, libraries, culture

Procedia PDF Downloads 133
1996 A Statistical Analysis on Relationship between Temperature Variations with Latitude and Altitude regarding Total Amount of Atmospheric Carbon Dioxide in Iran

Authors: Masoumeh Moghbel

Abstract:

Nowadays, carbon dioxide produced by human activities is considered the main factor in the occurrence of global warming. Given the role of CO₂ and its ability to trap heat, the main objective of this research is to study the effect of atmospheric CO₂ (as recorded at Mauna Loa) on variations of temperature parameters (daily mean temperature, minimum temperature, and maximum temperature) at 5 meteorological stations in Iran, selected according to latitude and altitude, over a 40-year statistical period. Firstly, the trend of the temperature parameters was studied using regression and the non-graphical Mann-Kendall method. Then, the relation between temperature variations and CO₂ was studied using correlation analysis. Also, the impact of the CO₂ amount on temperature at different atmospheric levels (850 and 500 hPa) was analyzed. The results illustrate that the correlation coefficient between temperature variations and CO₂ at low latitudes and high altitudes is more significant than in other regions. It is important to note that altitude, as one of the main geographic factors, has a limited influence on temperature variations, so that the correlation coefficient between these two parameters at 850 hPa (r = 0.86) is more significant than at 500 hPa (r = 0.62).
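
For reference, the non-graphical Mann-Kendall test mentioned above can be sketched as follows. The no-ties variance formula is used, and the temperature series is a synthetic placeholder, not the Iranian station data.

```python
import math
import numpy as np

def mann_kendall_z(series):
    """Mann-Kendall trend statistic S and its standard normal score Z (no ties)."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = int(sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n)))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

rng = np.random.default_rng(4)
years = np.arange(40)
temps = 15 + 0.03 * years + rng.normal(0, 0.4, 40)  # synthetic 40-year warming trend
s, z = mann_kendall_z(temps)
print(f"S = {s}, Z = {z:.2f}  (|Z| > 1.96 -> significant trend at the 5% level)")
```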

Keywords: altitude, atmospheric carbon dioxide, latitude, temperature variations

Procedia PDF Downloads 391
1995 Statistical and Land Planning Study of Tourist Arrivals in Greece during 2005-2016

Authors: Dimitra Alexiou

Abstract:

During the last 10 years, in spite of the economic crisis, the number of tourists arriving in Greece has increased, particularly during the tourist season from April to October. In this paper, the number of annual tourist arrivals is studied to explore tourists' preferences with regard to the month of travel and the selected destinations, as well as the amount of money spent. The collected data are processed with statistical methods, yielding numerical and graphical results. From the computation of statistical parameters and the forecasting with exponential smoothing, useful conclusions are reached that can be used by the Greek tourism authorities, as well as by tourist organizations, for planning purposes in the coming years. The results of this paper and the computed forecast can also be used for decision making by private tourist enterprises that are investing in Greece. With regard to the statistical methods, simple exponential smoothing of the time series of data is employed, and the search for the best forecast for 2017 and 2018 determines the value of the smoothing coefficient. Microsoft Excel is used for all statistical computations and graphics.
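
A minimal sketch of the simple exponential smoothing procedure described above is given below. The arrival figures are illustrative placeholders rather than the actual Greek arrival statistics, and the smoothing coefficient is chosen by a simple grid search over the one-step-ahead error, mirroring the search for the best forecast mentioned in the abstract.

```python
def ses_forecast(series, alpha):
    """Simple exponential smoothing; returns the one-step-ahead forecast series."""
    forecasts = [series[0]]                 # initialise with the first observation
    for x in series[:-1]:
        forecasts.append(alpha * x + (1 - alpha) * forecasts[-1])
    return forecasts

def sse(series, alpha):
    return sum((x - f) ** 2 for x, f in zip(series, ses_forecast(series, alpha)))

# Illustrative annual tourist arrivals (millions), 2005-2016 -- placeholder values
arrivals = [14.8, 16.0, 16.2, 15.9, 14.9, 15.0, 16.4, 15.5, 17.9, 22.0, 23.6, 24.8]

best_alpha = min((a / 100 for a in range(1, 100)), key=lambda a: sse(arrivals, a))
smoothed = ses_forecast(arrivals, best_alpha)
next_forecast = best_alpha * arrivals[-1] + (1 - best_alpha) * smoothed[-1]
print(f"alpha = {best_alpha:.2f}, forecast for the next year = {next_forecast:.1f} million")
```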

Keywords: tourism, statistical methods, exponential smoothing, land spatial planning, economy

Procedia PDF Downloads 247
1994 A Green Method for Selective Spectrophotometric Determination of Hafnium(IV) with Aqueous Extract of Ficus carica Tree Leaves

Authors: A. Boveiri Monji, H. Yousefnia, M. Haji Hosseini, S. Zolghadri

Abstract:

A clean spectrophotometric method for the determination of hafnium using a green reagent, an acidic extract of Ficus carica tree leaves, is developed. In 6 M hydrochloric acid, hafnium reacts with this reagent to form a yellow product. The formed product shows maximum absorbance at 421 nm with a molar absorptivity of 0.28 × 10⁴ L mol⁻¹ cm⁻¹, and the method is linear in the 2-11 µg mL⁻¹ concentration range. The detection limit was found to be 0.312 µg mL⁻¹. Except for zirconium and iron, the selectivity is good, and most ions do not show any significant spectral interference at concentrations up to several hundred times that of hafnium. The proposed method is green, simple, low cost, and selective.
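
To connect the reported figures, a short sketch of how an absorbance reading converts to a hafnium concentration through the Beer-Lambert law is given below. The path length and the absorbance readings are assumptions used only for illustration; the molar absorptivity is the value quoted in the abstract.

```python
epsilon = 0.28e4        # molar absorptivity, L mol^-1 cm^-1 (from the abstract)
path_cm = 1.0           # assumed cuvette path length, cm
M_hf = 178.49           # molar mass of hafnium, g/mol

def absorbance_to_ug_per_ml(A):
    """Beer-Lambert law: A = epsilon * l * c  ->  mass concentration in ug/mL."""
    c_mol_per_l = A / (epsilon * path_cm)
    return c_mol_per_l * M_hf * 1000.0      # g/L -> mg/L, numerically equal to ug/mL

for A in (0.05, 0.10, 0.15):                # assumed absorbance readings at 421 nm
    print(f"A = {A:.2f} -> {absorbance_to_ug_per_ml(A):.2f} ug/mL Hf")
```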

Keywords: spectrophotometric determination, Ficus carica tree leaves, synthetic reagents, hafnium

Procedia PDF Downloads 191
1993 Mathematical Modeling and Analysis of Forced Vibrations in Micro-Scale Microstretch Thermoelastic Simply Supported Beam

Authors: Geeta Partap, Nitika Chugh

Abstract:

The present paper deals with the flexural vibrations of a homogeneous, isotropic, generalized micropolar microstretch thermoelastic thin Euler-Bernoulli beam resonator subjected to an exponentially time-varying load. Both axial ends of the beam are assumed to be simply supported. The governing equations have been solved analytically by applying the Laplace transform technique twice, with respect to the time and space variables, respectively. The inversion of the Laplace transform in the time domain has been performed using the calculus of residues to obtain the deflection. The analytical results have been evaluated numerically with the help of MATLAB software for a magnesium-like material. Graphical representations and interpretations are discussed for the deflection of the beam under the simply supported boundary condition and for distinct values of time and space. The obtained results are easy to apply in the engineering analysis and design of resonators (sensors), modulators, and actuators.

Keywords: microstretch, deflection, exponential load, Laplace transforms, residue theorem, simply supported

Procedia PDF Downloads 299
1992 DocPro: A Framework for Processing Semantic and Layout Information in Business Documents

Authors: Ming-Jen Huang, Chun-Fang Huang, Chiching Wei

Abstract:

With the recent advances in deep neural networks, we observe new applications of NLP (natural language processing) and CV (computer vision) powered by deep neural networks for processing business documents. However, creating a real-world document processing system requires integrating several NLP and CV tasks rather than treating them separately. There is a need for a unified approach to processing documents containing textual and graphical elements with rich formats, diverse layout arrangements, and distinct semantics. In this paper, a framework that fulfills this unified approach is presented. The framework includes a representation model definition for holding the information generated by the various tasks and specifications defining the coordination between these tasks. The framework is a blueprint for building a system that can process documents with rich formats, styles, and multiple types of elements. The flexible and lightweight design of the framework can help build a system for diverse business scenarios, such as contract monitoring and reviewing.
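
The paper does not publish its representation model, but the idea of a unified structure holding both layout and semantic information can be sketched roughly as below. The field names and element types are hypothetical illustrations, not the framework's actual definitions.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class DocElement:
    """One detected element carrying both layout (bounding box) and semantic information."""
    element_type: str                        # e.g. "paragraph", "table", "figure", "signature"
    bbox: Tuple[float, float, float, float]  # (x0, y0, x1, y1) on the page
    text: Optional[str] = None               # OCR / extracted text, if any
    semantic_label: Optional[str] = None     # e.g. "contract_party", "effective_date"
    confidence: float = 0.0

@dataclass
class DocumentModel:
    """Shared container that a pipeline of NLP and CV tasks can read from and write to."""
    pages: List[List[DocElement]] = field(default_factory=list)

    def elements_by_label(self, label: str) -> List[DocElement]:
        return [e for page in self.pages for e in page if e.semantic_label == label]

# Example: a layout-analysis task writes elements, a later NLP task adds semantic labels
doc = DocumentModel(pages=[[
    DocElement("paragraph", (50, 700, 550, 760), text="This Agreement is made on 1 May 2020..."),
]])
doc.pages[0][0].semantic_label = "effective_date_clause"
print(doc.elements_by_label("effective_date_clause"))
```

Passing one such container between tasks, rather than separate per-task outputs, is what makes the coordination described in the abstract possible.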

Keywords: document processing, framework, formal definition, machine learning

Procedia PDF Downloads 201
1991 Social Media Resignation the Only Way to Protect User Data and Restore Cognitive Balance, a Literature Review

Authors: Rajarshi Motilal

Abstract:

The birth of the Internet and the rise of social media marked an important chapter in the history of humankind. Often termed the fourth scientific revolution, the Internet has changed human lives and cognisance. The birth of Web 2.0, followed by the launch of social media and social networking sites, added another milestone to these technological advancements, in which connectivity and the influx of information became dominant. With billions of individuals using the Internet and social media sites in the 21st century, “users” became “consumers”, and orthodox marketing reshaped itself into digital marketing. Furthermore, organisations started using sophisticated algorithms to predict consumer purchase behaviour and manipulate it to sustain themselves in such a competitive environment. The rampant storage and analysis of individual data became the new normal, raising many questions about data privacy. The excessive usage of the Internet brought further problems: individuals becoming addicted to it, scavenging for societal approval and instant gratification, and subsequently suffering from a collective dualism, isolation, and, finally, depression. This study aims to determine the relationship between social media usage in the modern age and the rise of psychological and cognitive imbalances in human minds. The literature review is a timely addition to the existing work at a moment when the world is constantly debating whether social media resignation is the only way to protect user data and restore the decaying cognitive balance.

Keywords: social media, digital marketing, consumer behaviour, internet addiction, data privacy

Procedia PDF Downloads 65
1990 User-Centered Design in the Development of Patient Decision Aids

Authors: Ariane Plaisance, Holly O. Witteman, Patrick Michel Archambault

Abstract:

Upon admission to an intensive care unit (ICU), all patients should discuss their wishes concerning life-sustaining interventions (e.g., cardiopulmonary resuscitation (CPR)). Without such discussions, interventions that prolong life at the cost of decreasing its quality may be used without appropriate guidance from patients. We employed user-centered design to adapt an existing decision aid (DA) about CPR to create a novel wiki-based DA adapted to the context of a single ICU and tailored to individual patients' risk factors. During Phase 1, we conducted three weeks of ethnography of the decision-making context in our ICU to identify clinician and patient needs for a decision aid. During this time, we observed five dyads of intensivists and patients discussing their wishes concerning life-sustaining interventions. We also conducted semi-structured interviews with the attending intensivists in this ICU. During Phase 2, we conducted three rounds of rapid prototyping involving 15 patients and 11 other allied health professionals. We recorded discussions between intensivists and patients and used a standardized observation grid to collect patients' comments and sociodemographic data. We applied content analysis to the field notes, verbatim transcripts, and completed observation grids. Each round of observations and rapid prototyping iteratively informed the design of the next prototype. We also used the programming architecture of a wiki platform to embed the GO-FAR prediction rule programming code, which we linked to risk graphics software to better illustrate the calculated outcome risks. During Phase 1, we identified the need to add a section to our DA concerning invasive mechanical ventilation in addition to CPR, because both life-sustaining interventions were often discussed together by physicians. During Phase 2, we produced a context-adapted decision aid about CPR and mechanical ventilation that includes a values clarification section, questions about the patient's functional autonomy prior to admission to the ICU and the functional decline that they would judge acceptable upon hospital discharge, the risks and benefits of CPR and invasive mechanical ventilation, population-level statistics about CPR, a synthesis section to help patients come to a final decision, and an online calculator based on the GO-FAR prediction rule. Even though the three rounds of rapid prototyping led to simplifying the information in our DA, 60% (n = 3/5) of the patients involved in the last cycle still did not understand the purpose of the DA. We also identified gaps in the discussion and documentation of patients' preferences concerning life-sustaining interventions (e.g., CPR, invasive mechanical ventilation). The final version of our DA and our online wiki-based GO-FAR risk calculator, built with the IconArray.com risk graphics software, are available online at www.wikidecision.org and are ready to be adapted to other contexts. Our results inform producers of decision aids on the use of wikis and user-centered design to develop DAs that are better adapted to users' needs. Further work is needed on the creation of a video version of our DA. Physicians will also need training to use our DA and to develop shared decision-making skills about goals of care.

Keywords: ethnography, intensive care units, life-sustaining therapies, user-centered design

Procedia PDF Downloads 341
1989 Analysis of Urban Rail Transit Station's Accessibility Reliability: A Case Study of Hangzhou Metro, China

Authors: Jin-Qu Chen, Jie Liu, Yong Yin, Zi-Qi Ju, Yu-Yao Wu

Abstract:

Increases in travel fares and station failures can have a huge impact on passengers' travel. In this paper, the accessibility reliability of Urban Rail Transit (URT) stations under increased travel fares and station failures is analyzed. Firstly, passengers' travel paths are reconstructed based on stochastic user equilibrium and Automatic Fare Collection (AFC) data. Secondly, station importance is calculated by combining the LeaderRank algorithm with the Ratio of Station Affected Passenger Volume (RSAPV), and station accessibility evaluation indicators are then proposed based on an analysis of passengers' travel characteristics. Thirdly, station accessibility under different scenarios is measured, and the rate of accessibility change is proposed as the station accessibility reliability indicator. Finally, the accessibility of Hangzhou metro stations is analyzed with the formulated models. The results show that Jinjiang station and Liangzhu station are, respectively, the most important and the most convenient stations in the Hangzhou metro. Station failure, and the combination of a fare increase with station failure, have a huge impact on station accessibility, whereas a fare increase alone does not. Stations on Hangzhou metro Line 1 have relatively poor accessibility reliability, and Fengqi Road station's accessibility reliability is the weakest. For the Hangzhou metro operating department, constructing new metro lines around Line 1 and preferentially protecting Line 1's stations can effectively improve the accessibility reliability of the Hangzhou metro.
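
For readers unfamiliar with it, the LeaderRank step used for station importance can be sketched as below: a ground node is linked bidirectionally to every station, a random walk is iterated to convergence, and the ground node's score is finally redistributed evenly. The small example network is made up for illustration; the combination with RSAPV is not reproduced.

```python
import numpy as np

def leaderrank(adj, tol=1e-10, max_iter=10_000):
    """LeaderRank scores for a directed network given as an adjacency matrix."""
    n = adj.shape[0]
    # Add a ground node connected to and from every node
    a = np.zeros((n + 1, n + 1))
    a[:n, :n] = adj
    a[:n, n] = 1.0
    a[n, :n] = 1.0
    out_deg = a.sum(axis=1)
    s = np.ones(n + 1)
    s[n] = 0.0
    for _ in range(max_iter):
        s_new = a.T @ (s / out_deg)          # each node passes its score along out-links
        if np.abs(s_new - s).max() < tol:
            s = s_new
            break
        s = s_new
    return s[:n] + s[n] / n                  # redistribute the ground node's score

# Toy 4-station directed network (adjacency made up for illustration)
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
print(leaderrank(adj))
```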

Keywords: automatic fare collection data, AFC, station’s accessibility reliability, stochastic user equilibrium, urban rail transit, URT

Procedia PDF Downloads 121
1988 Visualization of Energy Waves via Airy Functions in Time-Domain

Authors: E. Sener, O. Isik, E. Eroglu, U. Sahin

Abstract:

The main idea is to solve the system of Maxwell's equations in accordance with the causality principle in order to obtain the energy quantities, via Airy functions, in a hollow rectangular waveguide. We use the evolutionary approach to electromagnetics, which is an analytical time-domain method. The boundary-value problem for the system of Maxwell's equations is reformulated in transverse and longitudinal coordinates. A self-adjoint operator is obtained, and the complete set of eigenvectors of the operator forms an orthonormal basis of the solution space. Hence, the sought electromagnetic field can be presented in terms of this basis. Within this presentation, the scalar coefficients are governed by the Klein-Gordon equation. Ultimately, in this study, the time-domain waveguide problem is solved analytically in accordance with the causality principle. Moreover, graphical results are presented for the case in which the energy and the surplus of energy of the time-domain waveguide modes are represented via Airy functions.

Keywords: Airy functions, Klein-Gordon equation, Maxwell's equations, surplus of energy, wave boundary operators

Procedia PDF Downloads 354
1997 Signal Restoration Using Neural Network Based Equalizer for Nonlinear Channels

Authors: Z. Zerdoumi, D. Benatia, D. Chicouche

Abstract:

This paper investigates the application of artificial neural networks to the problem of nonlinear channel equalization. The difficulties caused by channel distortions such as intersymbol interference (ISI) and nonlinearity can be overcome by nonlinear equalizers employing neural networks. It has been shown that multilayer perceptron based equalizers significantly outperform linear equalizers. We present a multilayer perceptron based equalizer with decision feedback (MLP-DFE) trained with the back-propagation algorithm, and evaluate its capacity to deal with nonlinear channels. The simulation results show that the MLP based DFE significantly improves the restored signal quality, the steady-state mean square error (MSE), and the minimum bit error rate (BER) when compared with its conventional counterpart.

Keywords: artificial neural network, signal restoration, nonlinear channel equalization, equalization

Procedia PDF Downloads 486
1986 A Hybrid MAC Protocol for Delay Constrained Mobile Wireless Sensor Networks

Authors: Hanefi Cinar, Musa Cibuk, Ismail Erturk, Fikri Aggun, Munip Geylani

Abstract:

Mobile Wireless Sensor Networks (MWSNs) carry heterogeneous data traffic with different urgency and quality of service (QoS) requirements. Many studies in the literature address energy efficiency, bandwidth, and communication methods, but delay, throughput, and utility parameters are not well considered. The increasing demand for real-time data transfer makes these parameters more important. In this paper, we design a new delay-constrained MAC protocol that aims to improve the delay, utility, and throughput performance of the network and to find solutions to collision and interference problems. The protocol improves the QoS provided by using hybrid TDMA, FDM, and OFDMA communication methods with multi-channel communication.

Keywords: MWSN, delay, hybrid MAC, TDMA, FDM, OFDMA

Procedia PDF Downloads 469
1985 Multi-Band, Polarization Insensitive, Wide Angle Receptive Metamaterial Absorber for Microwave Applications

Authors: Lincy Stephen, N. Yogesh, G. Vasantharajan, V. Subramanian

Abstract:

This paper presents the design and simulation of a five-band metamaterial absorber at microwave frequencies. The absorber unit cell consists of squares and strips arranged as the top layer and a metallic ground plane as the bottom layer on a dielectric substrate. Simulation results show five near-perfect absorption bands at 3.15 GHz, 7.15 GHz, 11.12 GHz, 13.87 GHz, and 16.85 GHz with absorption magnitudes of 99.68%, 99.05%, 96.98%, 98.36%, and 99.44%, respectively. Further, the proposed absorber exhibits polarization insensitivity and wide-angle receptivity. A surface current analysis is presented to explain the mechanism of absorption in the structure. With these preferable features, the proposed absorber can be an excellent choice for potential applications such as electromagnetic interference (EMI) shielding and radar cross section reduction.

Keywords: electromagnetic absorber, metamaterial, multi-band, polarization insensitive, wide angle receptive

Procedia PDF Downloads 332
1984 Calibration of a Large Standard Step Height with Low Sampled Coherence Scanning Interferometry

Authors: Dahi Ghareab Abdelsalam Ibrahim

Abstract:

Scanning interferometry is commonly used for measuring the three-dimensional profile of surfaces. Here, we used a scanning stage calibrated with standard gauge blocks to measure a standard step height of 200 μm. The stage precisely measures the envelope of interference at the platen and at the surface of the step height; from the difference between the two envelopes, we measured the step height of the sample. Experimental measurements show that the measured value matches the nominal value of the step height well. A light beam at 532 nm from a tungsten lamp is collimated and made incident on the interferometer. By scanning, two envelopes are produced. The envelope at the platen surface and the envelope at the object surface were determined precisely by a purpose-written program, and the difference between them was then measured from the calibrated scanning stage. The difference was estimated to be in the range of 198 ± 2 μm.

Keywords: optical metrology, digital holography, interferometry, phase unwrapping

Procedia PDF Downloads 60
1983 Voltage Profile Enhancement in the Unbalanced Distribution Systems during Fault Conditions

Authors: K. Jithendra Gowd, Ch. Sai Babu, S. Sivanagaraju

Abstract:

Electric power systems are exposed daily to service interruptions, mainly due to faults and accidental human interference. Short-circuit currents are responsible for several types of disturbances in power systems: the fault currents are high and the voltages are reduced at the time of the fault. This paper presents two suitable methods, consideration of the fault resistance and of Distributed Generation, which are implemented and analyzed for the enhancement of the voltage profile during fault conditions. Fault resistance is a critical parameter of electric power system operation due to its stochastic nature; if not considered, this parameter may interfere with fault analysis studies and protection scheme efficiency. The effect of a Distributed Generator is also considered. The proposed methods are tested on the IEEE 37-bus test system, and the results are compared.

Keywords: distributed generation, electrical distribution systems, fault resistance

Procedia PDF Downloads 503
1982 A Systematic Snapshot of Software Outsourcing Challenges

Authors: Issam Jebreen, Eman Al-Qbelat

Abstract:

Outsourcing software development projects can be challenging, and there are several common challenges that organizations face. A study was conducted with a sample of 46 papers on outsourcing challenges, and the results show that there are several common challenges faced by organizations when outsourcing software development projects. Poor outsourcing relationship was identified as the most significant challenge, with 35% of the papers referencing it. Lack of quality was the second most significant challenge, with 33% of the papers referencing it. Language and cultural differences were the third most significant challenge, with 24% of the papers referencing it. Non-competitive price was another challenge faced by organizations, with 21% of the papers referencing it. Poor coordination and communication were also identified as a challenge, with 21% of the papers referencing it. Opportunistic behavior, lack of contract negotiation, inadequate user involvement, and constraints due to time zone were also challenges faced by organizations. Other challenges faced by organizations included poor project management, lack of technical capabilities, vendor employee high turnover, poor requirement specification, IPR issues, poor management of budget, schedule, and delay, geopolitical and country instability, the difference in development methodologies, failure to manage end-user expectations, and poor monitoring and control. In conclusion, outsourcing software development projects can be challenging, but organizations can mitigate these challenges by selecting the right outsourcing partner, having a well-defined contract and clear communication, having a clear understanding of the requirements, and implementing effective project management practices.

Keywords: software outsourcing, vendor, outsourcing challenges, quality model, continent, country, global outsourcing, IT workforce outsourcing

Procedia PDF Downloads 79
1981 Perceptions of Tunisian EFL Students toward Their Writing Difficulties

Authors: Salwa Enneifer

Abstract:

This research investigates Tunisian students' own perceptions of the difficulties they encounter in the writing task. To achieve this objective, a questionnaire was administered to students enrolled in the 'Faculty of Letters Arts and Humanities' in Kairouan, Tunisia. Students were classified into three groups: first-, second-, and third-year students. The researcher used 120 questionnaires filled in by the students as data for this study; moreover, 30 students participated in a semi-structured interview to complete the data. The questionnaire results revealed that Tunisian EFL students faced spelling and grammar difficulties. ANOVA also revealed that the first-year students did not recognise that Arabic and English differ greatly in their respective punctuation systems, whereas the second-year class was fully aware of this difference. Additionally, the interview shed light on other difficulties experienced by students in writing: a severe lack of vocabulary, Arabic language interference, the organisation of the essay (especially the academic essay), and difficulty with writing an argumentative essay.

Keywords: difficulties, writing, Tunisian, EFL students

Procedia PDF Downloads 231
1980 Technology Maps in Energy Applications Based on Patent Trends: A Case Study

Authors: Juan David Sepulveda

Abstract:

This article reflects the current stage of progress in the project 'Determining technological trends in energy generation'. Initially, the project was oriented towards identifying those trends by employing tools that the scientometrics community has proved and accepted as effective for obtaining reliable results. Because a documented methodological guide for this purpose could not be found, the decision was made to reorient the scope and aim of the project, changing the degree of interest in pursuing the original objectives. It was therefore decided to propose and implement a novel guide based on the elements and techniques found in the available literature. This article begins by explaining the elements and considerations taken into account when implementing and applying this methodology, and the tools that led to the implementation of a software application for patent review. Univariate analysis helped identify the technological leaders in the field of energy and paved the way for a multivariate analysis of this sample, which allowed a graphical description of the techniques of mature technologies, as well as the detection of emerging technologies. The article ends with a validation of the methodology as applied to the case of fuel cells.

Keywords: energy, technology mapping, patents, univariate analysis

Procedia PDF Downloads 465
1979 Analysis of Fertilizer Effect in the Tilapia Growth of Mozambique (Oreochromis mossambicus)

Authors: Sérgio Afonso Mulema, Andrés Carrión García, Vicente Ernesto

Abstract:

This paper analyses the effect of fertilizer (organic and inorganic) on the growth of tilapia. An experiment was carried out at the Aquapesca Company of Mozambique, in which four different treatments were considered: the two types of fertilizer were applied in two of these treatments, feed was supplied in the third treatment, and the fourth was taken as the control. The weight and length of the tilapia were used as growth parameters, and physico-chemical parameters were recorded to measure water quality. The results show that the weight and length differed for tilapia cultivated under different treatments. These differences were most evident in the organic and feed treatments, which showed the largest and the smallest values of these parameters, respectively. In order to verify that these differences were caused only by the applied treatment, without interference from the aquatic environment, a Fisher discriminant analysis was applied, which confirmed that the treatments were exposed to the same environmental conditions.

Keywords: fertilizer, tilapia, growth, statistical methods

Procedia PDF Downloads 216
1978 Term Creation in Specialized Fields: An Evaluation of Shona Phonetics and Phonology Terminology at Great Zimbabwe University

Authors: Peniah Mabaso-Shamano

Abstract:

The paper evaluates Shona terms that were created to teach Phonetics and Phonology courses at Great Zimbabwe University (GZU). The phonetics and phonology terms discussed in this paper were created using different processes and strategies such as translation, borrowing, neologising, compounding, transliteration, and circumlocution, among many others. Most phonetics and phonology terms are alien to Shona, and as a result, there are no suitable Shona equivalents. The lecturers and students for these courses have a mammoth task of creating terminology for the different modules offered in Shona and other Zimbabwean indigenous languages. Most linguistic reference books are written in English; as such, lecturers and students translate information from English to Shona, a task which is proving to be very difficult for them. A term creation workshop was held at GZU to try to address the problem of the lack of terminology in indigenous languages, in which indigenous language practitioners from different tertiary institutions convened for two days. Due to the 'specialized' nature of phonetics and phonology, it was too difficult to come up with 'proper' indigenous terms. The researcher will consult tertiary institution lecturers who teach linguistics courses, as well as linguistics students, to get their views on the created terms. The people consulted will not be the ones who took part in the term creation workshop held at GZU. The selected participants will be asked to evaluate and back-translate some of the terms; in instances where they feel the terms created are not suitable or user-friendly, they will be asked to suggest other terms. Since the researcher is also a linguistics lecturer, her observations and views will be important. From her experience in using some of the terms in teaching phonetics and phonology courses to undergraduate students, the researcher noted that most of the terms created have shortcomings, since they are not user-friendly. These shortcomings include terms longer than the English terms, as some terms are translated into Shona as whole statements. Most of these terms are neologisms, compound neologisms, transliterations, circumlocutions, and blends. The paper will show that there is an overuse of transliterated terms due to the lack of Shona equivalents for English terms. Most single English words were translated into compound neologisms or phrases after attempts to reduce them to one-word terms failed. In other instances, circumlocution led to terms longer than the original, and as a result, the terms are not user-friendly. The paper will discuss and evaluate the different phonetics and phonology terms created and the different strategies and processes used in creating them.

Keywords: blending, circumlocution, term creation, translation

Procedia PDF Downloads 136
1977 A Low-Cost Experimental Approach for Teaching Energy Quantization: Determining the Planck Constant with Arduino and LED

Authors: Gastão Soares Ximenes de Oliveira, Richar Nicolás Durán, Romeo Micah Szmoski, Eloiza Aparecida Avila de Matos, Elano Gustavo Rein

Abstract:

This article presents an experimental method for determining Planck's constant by measuring the cut-off potential V₀ of LEDs with different wavelengths. The experiment is designed with an Arduino as the central tool in order to make the activity more engaging and attractive for students through the use of digital technologies. From the characteristic curve of each LED, graphical analysis was used to obtain the cut-off potential, and, knowing the corresponding wavelength, it was possible to calculate Planck's constant. The constant was also obtained from a linear fit of the cut-off potential against the frequency of each LED. Given the relevance of Planck's constant in physics, it is believed that this experiment can offer teachers the opportunity to approach concepts from modern physics, such as the quantization of energy, in a more accessible and applied way in the classroom. This will not only enrich students' understanding of the fundamental nature of matter but also encourage deeper engagement with the principles of quantum physics.
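
The final calculation described above, Planck's constant from a linear fit of the cut-off potential against LED frequency, can be sketched as follows. The wavelength and voltage values are placeholder readings of the kind the Arduino circuit would produce, not the authors' measurements.

```python
import numpy as np

e = 1.602e-19          # elementary charge, C
c = 2.998e8            # speed of light, m/s

# Placeholder (wavelength in nm, measured cut-off potential in V) pairs, one per LED
readings = [(620, 1.78), (590, 1.90), (525, 2.18), (470, 2.48), (405, 2.90)]

wavelengths_nm, v0 = np.array(readings).T
freq = c / (wavelengths_nm * 1e-9)          # LED emission frequency, Hz

# e * V0 = h * f - W  ->  V0 is linear in f with slope h / e
slope, intercept = np.polyfit(freq, v0, 1)
h_est = slope * e
print(f"estimated Planck constant: {h_est:.3e} J s (accepted value: 6.626e-34 J s)")
```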

Keywords: physics teaching, educational technology, modern physics, Planck constant, Arduino

Procedia PDF Downloads 63