Search results for: software defined radio
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7842

6492 A Unified Approach for Naval Telecommunication Architectures

Authors: Y. Lacroix, J.-F. Malbranque

Abstract:

We present a chronological evolution of naval telecommunication networks. We distinguish periods: with or without multiplexers, with switch systems, with federative systems, with medium switching, and with medium switching with wireless networks. This highlights the introduction of new layers and technology into the architecture. These architectures are presented using layer models of transmission, in a unified way, which enables us to integrate pre-existing models. A ship of a naval fleet has internal communications (i.e., the applications' networks on board) and external communications (i.e., the use of the means of transmission between ships). We propose architectures, deduced from the layer model, which are the point of convergence between the networks on board and the HF, UHF radio, and satellite resources. This modelling allows us to consider end-to-end naval communications and, more globally, communications from the user on board to the user on shore, including the transmission and networks on the shore side. The new architectures must take care of quality of service for end-to-end communications, all the more so as remote control is developing considerably and will continue to do so. Naval telecommunications will become increasingly complex and will use increasingly advanced technologies; it will thus be necessary to establish clear global communication schemes to guarantee the consistency of the architectures. Our latest model has been implemented in a military naval situation and serves as the basic architecture for the RIFAN2 network.

Keywords: naval telecommunications, network architecture, layer model, quality of service

Procedia PDF Downloads 287
6491 Attitude of Youth Farmers to Climate Change Adaptation and Mitigation in Benue State, Nigeria

Authors: Cynthia E. Nwobodo, A. E. Agwu

Abstract:

The study was carried out in Benue State, Nigeria. A multi-stage sampling technique was used to select 120 respondents from two agricultural zones in the state. Data were collected using an interview schedule, and descriptive statistics were used in the data analysis. Findings showed that youth farmers in the area had a positive attitude to climate change adaptation and mitigation, as shown by their responses to a set of positive and negative statements including: the youth are very important stakeholders in climate change issues (M = 2.91), youths should be encouraged to be climate change conscious (M = 2.90), everybody should be involved in planting trees, not just the government (M = 2.89), and I will be glad to participate in climate change seminars (M = 2.89), among others. Findings on information-seeking behaviour indicate that the majority (80.8%) of the respondents sought climate change information from radio at an average of 19.78 times per month, 53.3% from friends and neighbours at an average of 12.55 times per month, and 42.5% from family members at an average of 12.55 times per month, among others. It was recommended that youth farmers be made important stakeholders in climate change policies and programmes, since they have a very positive attitude to climate change adaptation and mitigation.

Keywords: adaptation, mitigation, attitude, climate change, youth farmers

Procedia PDF Downloads 641
6490 Gender Stereotypes in Reproductive Medicine with Regard to Parental Age

Authors: Monika Michałowska, Anna Alichniewicz

Abstract:

Detrimental outcomes of advanced maternal age on the chances of fertilization and pregnancy, as well as on maternal and fetal health, have been recognized for several decades. Given that medical research has shown an intense and growing interest in later-age fatherhood for about a decade, it seemed worth investigating whether there is comparable awareness of the detrimental influence of late fatherhood on reproductive outcomes. To address this issue, a two-step study was conducted. First, we performed a review of the subject literature to answer the following questions: 1) What age is defined as advanced? 2) Is the same age defined as advanced in both genders? 3) What terminology concerning age issues is used? 4) Is the same age terminology used for both genders? The second part of our study was devoted to the views of medical students and comprised both quantitative and qualitative research. Opinions of medical students at one of the Polish medical universities were gathered on several issues connected with assisted reproduction technology (ART): 1) students' attitude to in vitro fertilization (IVF) for women over 40 and for postmenopausal women; 2) students' attitude to late fatherhood; 3) students' reasoning against the acceptability of the IVF procedure for each of these groups of patients. Our analyses revealed that: first, there is no universal definition of the term 'advanced age'; secondly, there is a general tendency to adopt different age limits depending on whether they refer to maternal or paternal age, but researchers provide no justification for setting different age limits for women and men; thirdly, the image of postponed fatherhood stands in stark contrast to that of postponed motherhood: while postponed fatherhood is frequently portrayed as a reasonable and conscious decision enabling a stable family environment for a child, the reasonableness of postponed motherhood is often questioned; finally, the bias regarding maternal versus paternal age is deeply embedded in medical students' attitudes to IVF for women over 40 and for postmenopausal women.

Keywords: gender stereotypes, reproductive medicine, maternal age, paternal age

Procedia PDF Downloads 260
6489 Impact of the Non-Energy Sectors Diversification on the Energy Dependency Mitigation: Visualization by the “IntelSymb” Software Application

Authors: Ilaha Rzayeva, Emin Alasgarov, Orkhan Karim-Zada

Abstract:

This study attempts to link management and computer science by developing software named "IntelSymb" as a demo application, to show through data analysis that diversification of non-energy fields positively influences the mitigation of countries' energy dependency. We analyzed 18 years of development across five economic sectors in 13 countries, identifying which patterns mostly prevailed and which could become dominant in the near future. To make our analysis solid and plausible, as future work we suggest developing a gateway or interface connected to all available online databases (WB, UN, OECD, U.S. EIA) for analysing countries by field. The sample data consist of energy statistics (TPES and energy import indicators) and non-energy industry statistics (Main Science and Technology Indicators, an Internet user index, and sales and production indicators) from 13 OECD countries over 18 years (1995-2012). Our results show that the diversification of non-energy industries can have a positive effect on decelerating energy sector dependency (energy consumption and import dependence on crude oil). These results can provide empirical and practical support for energy and non-energy industry diversification policies, such as promoting the efficiency and management of Information and Communication Technologies (ICTs), services, and innovative technologies, in other OECD and non-OECD member states with similar energy utilization patterns and policies. Industries, including the ICT sector, generate around 4 percent of total GHG emissions, but this figure is much higher, around 14 percent, if indirect energy use is included. The ICT sector itself (excluding broadcasting) contributes approximately 2 percent of global GHG emissions, at just under 1 gigatonne of carbon dioxide equivalent (GtCO2eq). This can therefore be a good example and lesson for both energy-dependent and energy-independent countries, mainly emerging oil-based economies, and can motivate non-energy industry diversification, so that they are ready for an energy crisis and able to face any economic crisis as well.

Keywords: energy policy, energy diversification, “IntelSymb” software, renewable energy

Procedia PDF Downloads 218
6488 Software Tool Design for Heavy Oil Upgrading by Hydrogen Donor Addition in a Hydrodynamic Cavitation Process

Authors: Munoz A. Tatiana, Solano R. Brandon, Montes C. Juan, Cierco G. Javier

Abstract:

Hydrodynamic cavitation is a process that exploits the energy released by fluids during phase changes. From this energy, local temperatures greater than 5000 °C are obtained, at which thermal cracking of the fluid molecules takes place. Applied to heavy oil, the process affects variables such as viscosity, density, and composition, which constitutes an important improvement in the quality of the crude. In this study, a software tool was designed by integrating mathematical models of mixing, cavitation, kinetics, and the reactor, allowing changes in the density, viscosity, and composition of a heavy crude oil to be modeled as the fluid passes through a hydrodynamic cavitation reactor. In order to evaluate the viability of this technique in industry, a heavy oil of 18° API gravity was simulated using naphtha as a hydrogen donor at concentrations of 1, 2 and 5% vol, where the simulation results showed API gravity increases of 0.77, 1.21 and 1.93°, respectively, and viscosity reductions of 9.9, 12.9 and 15.8%. The results obtained provide a favorable outlook for this technological development, an appropriate view of the innovative knowledge generated by this technique, and the techno-economic opportunity it offers the hydrocarbon sector dealing with heavy crude oil, which accounts for the largest share of world oil production.
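
The reported gravity shifts can be sanity-checked with the standard API-to-specific-gravity relation (API = 141.5/SG - 131.5); the short Python sketch below applies it to the figures quoted above. The base gravity and the per-donor deltas come from the abstract; everything else is just the standard formula.

```python
# Worked check of the reported API-gravity shifts (values from the abstract).
def api_to_sg(api):
    """Specific gravity at 60 degF from API gravity (standard definition)."""
    return 141.5 / (api + 131.5)

base_api = 18.0  # heavy crude in the study
for donor_pct, delta_api, visc_red in [(1, 0.77, 9.9), (2, 1.21, 12.9), (5, 1.93, 15.8)]:
    upgraded = base_api + delta_api
    print(f"{donor_pct}% naphtha: {base_api}->{upgraded:.2f} API "
          f"(SG {api_to_sg(base_api):.4f}->{api_to_sg(upgraded):.4f}), "
          f"viscosity -{visc_red}%")
```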

Keywords: hydrodynamic cavitation, thermal cracking, hydrogen donor, heavy oil upgrading, simulator

Procedia PDF Downloads 147
6487 The Development and Provision of a Knowledge Management Ecosystem, Optimized for Genomics

Authors: Matthew I. Bellgard

Abstract:

The field of bioinformatics has made, and continues to make, substantial progress and contributions to life science research and development. This paper contends, however, that a systems approach is needed to integrate bioinformatics activities for any project in a defined manner. The application of critical control points in this bioinformatics systems approach may be useful to identify and evaluate points in a pathway where specified activity risk can be reduced, monitored and quality enhanced.

Keywords: bioinformatics, food security, personalized medicine, systems approach

Procedia PDF Downloads 418
6486 Flood Predicting in Karkheh River Basin Using Stochastic ARIMA Model

Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh

Abstract:

Floods have huge environmental and economic impacts; flood prediction is therefore given a lot of attention. This study analysed the annual maximum streamflow (discharge) (AMS or AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, we used the Box-Jenkins approach, which comprises a four-stage method: model identification, parameter estimation, diagnostic checking and forecasting (prediction). The main tools used in the ARIMA modelling were the SAS and SPSS software packages. Model identification was done by visual inspection of the ACF and PACF. The SAS software computed the model parameters using the ML, CLS and ULS methods. The diagnostic checking tests (the AIC criterion and the RACF and RPACF graphs) were used to verify the selected model. In this study, the best ARIMA model for the Annual Maximum Discharge (AMD) time series was (4,1,1), with an AIC value of 88.87; the RACF and RPACF showed the residuals' independence. The model was then used to forecast AMD for 10 future years, demonstrating its ability to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R²).
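
For readers who prefer an open-source route to the SAS/SPSS workflow described above, the sketch below shows the same Box-Jenkins steps (fit an ARIMA(4,1,1), compare AIC, forecast 10 years) with Python's statsmodels. The discharge series here is synthetic placeholder data, not the Karkheh record.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# hypothetical annual maximum discharge series (m^3/s); the observed
# Karkheh record itself is not reproduced here
amd = pd.Series(np.random.default_rng(0).gamma(4.0, 250.0, size=50),
                index=pd.date_range("1961", periods=50, freq="YS"))

model = ARIMA(amd, order=(4, 1, 1)).fit()   # Box-Jenkins order from the paper
print(f"AIC: {model.aic:.2f}")              # compare candidate orders by AIC
print(model.forecast(steps=10))             # 10-year-ahead AMD predictions
```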

Keywords: time series modelling, stochastic processes, ARIMA model, Karkheh river

Procedia PDF Downloads 283
6485 Machine Learning Techniques in Seismic Risk Assessment of Structures

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this work is to evaluate the advantages and disadvantages of various machine learning techniques in two key steps of seismic hazard and risk assessment for different types of structures. The first step is the development of ground-motion models, which are used to forecast ground-motion intensity measures (IMs) given source characteristics, source-to-site distance, and local site conditions for future events. IMs such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5%-damped elastic pseudo-spectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Typically, linear regression-based models with pre-defined equations and coefficients are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, namely Artificial Neural Networks, Random Forests, and Support Vector Machines. The results indicate that these algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method, with Random Forest in particular outperforming the other algorithms; the conventional method remains the better tool when only limited data are available. Second, it is investigated how machine learning techniques could be beneficial for developing probabilistic seismic demand models (PSDMs), which provide the relationship between structural demand responses (e.g., component deformations, accelerations, internal forces, etc.) and the ground motion IMs. In the risk framework, such models are used to develop fragility curves estimating the probability of exceeding pre-defined limit states of damage, and they therefore control the reliability of the predictions in the risk assessment. In this study, machine learning algorithms such as artificial neural networks, random forests, and support vector machines are adopted and trained on the demand parameters to derive PSDMs. It is observed that such models can provide more accurate estimates in a relatively shorter amount of time compared to conventional methods. Moreover, they can be used for sensitivity analysis of fragility curves with respect to many modeling parameters without necessarily requiring more intensive numerical response-history analyses.
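
As a concrete illustration of the comparison described above, the hedged sketch below fits a linear model and a Random Forest to a synthetic ground-motion dataset (magnitude, log-distance, log-Vs30 predicting log-PGA). The functional form and coefficients of the synthetic generator are arbitrary assumptions, not a published ground-motion model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
M = rng.uniform(4.0, 7.5, n)        # magnitude
R = rng.uniform(5.0, 200.0, n)      # source-to-site distance (km)
vs30 = rng.uniform(180, 760, n)     # site stiffness proxy (m/s)
# synthetic log-PGA with noise; coefficients are illustrative only
log_pga = 1.1 * M - 1.6 * np.log(R) - 0.4 * np.log(vs30) + rng.normal(0, 0.5, n)

X = np.column_stack([M, np.log(R), np.log(vs30)])
Xtr, Xte, ytr, yte = train_test_split(X, log_pga, random_state=0)
for name, mdl in [("linear", LinearRegression()),
                  ("random forest", RandomForestRegressor(random_state=0))]:
    print(name, mdl.fit(Xtr, ytr).score(Xte, yte))  # R^2 on held-out data
```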

Keywords: artificial neural network, machine learning, random forest, seismic risk analysis, seismic hazard analysis, support vector machine

Procedia PDF Downloads 98
6484 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs

Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar

Abstract:

The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbons from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process; it finds application whenever one needs to make an estimate, forecast or decision where there is significant uncertainty. First, the project focuses on performing Monte Carlo simulation on a given data set using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. Further, a simulation algorithm has been developed in MATLAB; the program performs the simulation by prompting the user for the input distributions and the parameters associated with each distribution (i.e., mean, standard deviation, min., max., most likely, etc.), as well as for the desired probability at which reserves are to be calculated. The algorithm so developed and tested in MATLAB was then implemented in Python, where existing libraries for statistics and graph plotting were imported to generate better output. With PyQt Designer, code for a simple graphical user interface has also been written. The resulting plot is then validated against the results already available from the U.S. DOE MonteCarlo software.
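
A minimal Python sketch of the Monte Carlo reserves estimation described above is given below, using the standard volumetric equation (N = 7758·A·h·φ·(1-Sw)/Bo·RF). The distribution choices and parameter ranges are illustrative assumptions and do not reproduce the DOE tool's inputs; P90/P50/P10 are read off the simulated distribution.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
# illustrative input distributions (min, mode, max or mean, sd), not field data
area = rng.triangular(500, 800, 1200, n)      # drainage area, acres
thick = rng.triangular(20, 35, 60, n)         # net pay, ft
poro = rng.normal(0.18, 0.02, n)              # porosity, fraction
sw = rng.normal(0.30, 0.04, n)                # water saturation, fraction
bo = rng.uniform(1.1, 1.3, n)                 # formation volume factor, rb/stb
rf = rng.triangular(0.15, 0.25, 0.40, n)      # recovery factor, fraction

stb = 7758 * area * thick * poro * (1 - sw) / bo * rf  # stock-tank barrels
# P90 = value exceeded with 90% probability = 10th percentile, etc.
p90, p50, p10 = np.percentile(stb, [10, 50, 90])
print(f"P90={p90:.3e}  P50={p50:.3e}  P10={p10:.3e} STB")
```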

Keywords: simulation, probability, confidence interval, sensitivity analysis

Procedia PDF Downloads 375
6483 An Interactive Platform Displaying Mixed Reality Media

Authors: Alfred Chen, Cheng Chieh Hsu, Yu-Pin Ma, Meng-Jie Lin, Fu Pai Chiu, Yi-Yan Sie

Abstract:

This study attempts to construct a human-computer interactive platform system consisting mainly of an augmented hardware system, a software system, a display table, and mixed media. The system provides human-computer interaction services through an interactive platform for the tourism industry. A well-designed interactive platform, integrating augmented reality and mixed media, has the potential to enhance museum display quality and diversity. Besides, it will create a comprehensive and creative display mode for most museums and historical heritage sites. Therefore, it is essential to let the public understand what the platform is, how it functions, and, most importantly, how one builds an interactive augmented platform. Hence the authors elaborate the construction process of the platform in detail, considering three issues: 1) the theory and application of augmented reality, 2) the hardware and software applied, and 3) the mixed media presented. In order to describe how the platform works, the Courtesy Door of the Tainan Confucius Temple has been selected as a case study. As a result, the developed interactive platform is presented by showing the physical entity object along with virtual mixed media such as text, images, animation, and video. This platform provides diversified and effective information to its users.

Keywords: human-computer interaction, mixed reality, mixed media, tourism

Procedia PDF Downloads 483
6482 The Future of Insurance: P2P Innovation versus Traditional Business Model

Authors: Ivan Sosa Gomez

Abstract:

Digitalization has impacted the entire insurance value chain, and the growing movement towards P2P platforms and the collaborative economy is also beginning to have a significant impact. P2P insurance is defined as an innovation enabling policyholders to pool their capital, self-organize, and self-manage their own insurance. In this context, new InsurTech start-ups are emerging as peer-to-peer (P2P) providers, based on a model that differs from traditional insurance. As a result, although P2P platforms do not change the fundamental basis of insurance, they do enable potentially more efficient business models to be established in terms of ensuring the coverage of risk. It is therefore relevant to determine whether P2P innovation can have substantial effects on the future of the insurance sector. For this purpose, it is considered necessary to examine P2P innovation from a business perspective, as well as to build a comparison between a traditional model and a P2P model from an actuarial perspective. Objectives: The objectives are (1) to represent P2P innovation in the business model compared to the traditional insurance model and (2) to establish a comparison between a traditional model and a P2P model from an actuarial perspective. Methodology: The research design is defined as action research, in the sense of understanding and solving the problems of a community linked to an environment, applying theory and best practices according to the approach. The study is carried out through the participatory variant, which involves the collaboration of the participants, given that in this design participants are considered experts. Prolonged immersion in the field is carried out as the main instrument of data collection. Finally, an actuarial model is developed for the calculation of premiums that allows projections of future scenarios to be established and conclusions to be drawn between the two models. Main contributions: From an actuarial and business perspective, we aim to contribute by developing a comparison of the two models in the coverage of risk, in order to determine whether P2P innovation can have substantial effects on the future of the insurance sector.

Keywords: Insurtech, innovation, business model, P2P, insurance

Procedia PDF Downloads 87
6481 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment

Authors: Seun Mayowa Sunday

Abstract:

Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software; however, because static software operates only under strictly regulated rules, it cannot aid customers beyond those limitations. The application of machine learning (ML) techniques is required for loan prediction. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required ML libraries, the models are created and evaluated with the appropriate metrics. From the findings, the random forest performs with the highest accuracy, 80.17%, and was therefore implemented in the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the four biggest cloud computing providers. Hence, to the best of our knowledge, this research serves as the first academic paper that combines model development with the Django framework and deployment on the Alibaba cloud computing platform.
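
The four-model comparison described above can be sketched in a few lines of scikit-learn, as below. The dataset here is synthetic (make_classification), so the paper's 80.17% figure and its preprocessing are not reproduced, only the workflow of fitting the four models and ranking them by accuracy before serving the winner from Django.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# synthetic stand-in for a loan dataset (features: income, history, etc.)
X, y = make_classification(n_samples=2000, n_features=12, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

models = {"RF": RandomForestClassifier(random_state=0),
          "DT": DecisionTreeClassifier(random_state=0),
          "KNN": KNeighborsClassifier(),
          "LogReg": LogisticRegression(max_iter=1000)}
for name, mdl in models.items():
    acc = accuracy_score(yte, mdl.fit(Xtr, ytr).predict(Xte))
    print(f"{name}: {acc:.2%}")
# the best model can then be pickled and served from a Django view
```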

Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud

Procedia PDF Downloads 122
6480 Harnessing the Generation of Ferromagnetic and Silver Nanostructures from Tropical Aquatic Microbial Nanofactories

Authors: Patricia Jayshree Jacob, Mas Jaffri Masarudin, Mohd Zobir Hussein, Raha Abdul Rahim

Abstract:

Iron-based ferromagnetic nanoparticles (IONPs) and silver nanostructures (AgNPs) have found a wide range of applications in antimicrobial therapy, cell targeting, and environmental applications. As such, the design of well-defined monodisperse IONPs and AgNPs has become an essential tool in nanotechnology. Fabrication of these nanostructures using conventional methods is not environmentally conducive and weighs heavily on energy and outlays. Selected microorganisms possess the innate ability to reduce metallic ions in colloidal aqueous solution to generate nanoparticles; harnessing this potential is a way forward in constructing microbial nanofactories capable of churning out high yields of well-defined IONPs and AgNPs with physicochemical characteristics on par with the best synthetically produced nanostructures. In this paper, we report the isolation and characterization of bacterial strains, isolated from the tropical marine and freshwater ecosystems of Malaysia, that demonstrated facile and rapid generation of ferromagnetic nanoparticles and silver nanostructures when precursors such as FeCl₃·6H₂O and AgNO₃ were added to the cell-free bacterial lysate in colloidal solution. Characterization of these nanoparticles was carried out using FESEM, UV spectrophotometry, XRD, DLS and FTIR. This aerobic bioprocess was carried out at ambient temperature and humidity and has the potential to be developed into an environmentally friendly, cost-effective, large-scale production route for IONPs. A preliminary bioprocess study on harvesting time, incubation temperature and pH was also carried out to determine the pertinent abiotic parameters contributing to optimal production of these nanostructures.

Keywords: iron oxide nanoparticles, silver nanoparticles, biosynthesis, aquatic bacteria

Procedia PDF Downloads 279
6479 Implementation of Data Science in Field of Homologation

Authors: Shubham Bhonde, Nekzad Doctor, Shashwat Gawande

Abstract:

For the use and import of keys and ID transmitters, as well as body control modules with radio transmission, homologation is required in many countries, and the final deliverables of a product's homologation are certificates. In the world of homologation there are approximately 200 certificates per product, most of them in local languages. It is challenging to manually investigate each certificate and extract the relevant data, such as the expiry date, approval date, etc. It is also most important that the data extracted from a certificate be accurate, as inaccuracy may lead to missed re-homologation of certificates, which results in non-compliance. There is therefore scope for automating the reading of certificate data in the field of homologation, and we use deep learning as the tool for this automation. We first trained a model by providing every country's basic data; the model is trained only once, fed with PDF and JPG files through an ETL process, and will eventually give more accurate results over time. As an outcome, we obtain the expiry date and approval date of a certificate with a single click. This will eventually help to implement automation features at a broader level in the database where the certificates are stored, and will reduce human error to an almost negligible level.
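
As a point of contrast with the deep-learning pipeline described above, the hedged sketch below shows a purely rule-based extraction of the same target fields (approval and expiry dates) from already-OCR'd certificate text; the label list and date formats are illustrative assumptions.

```python
import re
from dateutil import parser  # pip install python-dateutil

DATE = r"(\d{1,2}[./-]\d{1,2}[./-]\d{2,4})"

def extract_dates(text):
    """Return candidate (label, date) pairs found near known labels."""
    found = []
    for label in ("approval", "expiry", "valid until", "issued"):  # assumed labels
        pattern = label + r"\D{0,15}" + DATE
        for m in re.finditer(pattern, text, flags=re.IGNORECASE):
            found.append((label, parser.parse(m.group(1), dayfirst=True).date()))
    return found

sample = "Approval date: 12.03.2019 ... valid until 11/03/2024"
print(extract_dates(sample))
```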

Keywords: homologation, re-homologation, data science, deep learning, machine learning, ETL (extract transform loading)

Procedia PDF Downloads 155
6478 Future Education: Changing Paradigms

Authors: Girish Choudhary

Abstract:

Education is in a state of flux. Not only does one need to acquire new skills to cope with a fast-changing global world; at the same time, explosive growth in technology is providing a new wave of teaching tools: computer-aided video instruction, hypermedia, multimedia, CD-ROMs, Internet connections, and collaborative software environments. The emerging technology incorporates the group qualities of interactive, classroom-based learning while giving individual students the flexibility to participate in an educational programme at their own time and place. The technology facilitating self-learning also seems to provide a cost-effective solution to the dilemma of delivering education to the masses. Online education is a unique learning domain that provides many-to-many communication as well; the computer conferencing software defines the boundaries of the virtual classroom. The changing paradigm provides access to instruction for a large proportion of society, promises a qualitative change in learning, and echoes a new way of thinking in educational theory that promotes active learning and opens up new learning approaches. Putting it into practice is challenging and may fundamentally alter the nature of educational institutions. The subsequent part of the paper addresses questions such as 'Do we need to radically re-engineer the curriculum and foster an alternative set of skills in students?' on the onward journey.

Keywords: on-line education, self learning, energy and power engineering, future education

Procedia PDF Downloads 324
6477 Some New Hesitant Fuzzy Sets Operator

Authors: G. S. Thakur

Abstract:

In this paper, four new operators (O1, O2, O3, O4) on hesitant fuzzy sets are proposed and defined, and new properties and identities based on them are studied. These operators are useful for different operations on hesitant fuzzy sets, and various theorems are proved using them. The study of the proposed operators opens a new area of research and applications.

Keywords: vague sets, hesitant fuzzy sets, intuitionistic fuzzy set, fuzzy sets, fuzzy multisets

Procedia PDF Downloads 280
6476 Skew Cyclic Codes over Fq + uFq + … + u^(k-1)Fq

Authors: Jing Li, Xiuli Li

Abstract:

This paper studies a special class of linear codes, called skew cyclic codes, over the ring R = Fq + uFq + … + u^(k-1)Fq, where q is a prime power. A Gray map ɸ from R to Fq and a Gray map ɸ' from R^n to Fq^n are defined, as well as an automorphism Θ over R. It is proved that the images of skew cyclic codes over R under the maps ɸ' and Θ are cyclic codes over Fq, and that they preserve the dual relation.

Keywords: skew cyclic code, gray map, automorphism, cyclic code

Procedia PDF Downloads 290
6475 Analysis of an Alternative Data Base for the Estimation of Solar Radiation

Authors: Graciela Soares Marcelli, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Claudineia Brazil, Rafael Haag

Abstract:

The sun is a source of renewable energy, and its use as both a source of heat and light is one of the most promising energy alternatives for the future. To design thermal or photovoltaic systems, a solar irradiation database is necessary. Brazil still has a reduced number of meteorological stations that provide frequent measurements, so, as an alternative, data platforms based on reanalysis systems are quite significant. ERA-Interim is a global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The data assimilation system used for the production of ERA-Interim is based on a 2006 version of the IFS (Cy31r2). The system includes a 4-dimensional variational analysis (4D-Var) with a 12-hour analysis window. The spatial resolution of the dataset is approximately 80 km, at 60 vertical levels from the surface up to 0.1 hPa. This work aims to make a comparative analysis between the ERA-Interim data and the data observed in the Solarimetric Atlas of the State of Rio Grande do Sul, to verify its applicability in the absence of an observed data network. The results were analysed for a study region to assess the solar energy potential of that region.

Keywords: energy potential, reanalyses, renewable energy, solar radiation

Procedia PDF Downloads 157
6474 Image Features Comparison-Based Position Estimation Method Using a Camera Sensor

Authors: Jinseon Song, Yongwan Park

Abstract:

In this paper, we propose a method that can estimate a user's position based on an image database built from a single camera. Previous positioning methods calculate distance from the arrival time of signals, as in GPS (Global Positioning System) or RF (Radio Frequency) systems. However, these methods have a weakness: they have a large error range owing to signal interference. Our solution is to estimate position with a camera sensor. A single camera, however, has difficulty obtaining relative position data, and a stereo camera has difficulty providing real-time position data because of the large amount of image data involved. First of all, in this research we build an image database of a space in which a positioning service can be provided with a single camera. Next, we judge similarity through image matching between the database images and the image transmitted by the user. Finally, we decide the position of the user from the position of the most similar database image. To verify the proposed method, we experimented in real environments, both indoor and outdoor. The proposed method has a wide positioning range and can determine not only the position of the user but also the direction.
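
The image-matching step described above can be sketched with OpenCV as below. The paper's keypoints are SURF features, which require the non-free opencv-contrib build, so this sketch substitutes ORB (a freely available detector with the same detect-describe-match workflow); the match-distance threshold is an arbitrary assumption.

```python
import cv2

orb = cv2.ORB_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def similarity(query_path, db_path):
    """Count good ORB matches between a user image and a database image."""
    q = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    d = cv2.imread(db_path, cv2.IMREAD_GRAYSCALE)
    _, q_desc = orb.detectAndCompute(q, None)
    _, d_desc = orb.detectAndCompute(d, None)
    if q_desc is None or d_desc is None:
        return 0
    matches = matcher.match(q_desc, d_desc)
    return sum(1 for m in matches if m.distance < 40)  # threshold assumed

# the database image with the most matches gives the estimated position:
# best = max(database_paths, key=lambda p: similarity("user.jpg", p))
```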

Keywords: positioning, distance, camera, features, SURF (Speeded-Up Robust Features), database, estimation

Procedia PDF Downloads 342
6473 The Impact of Information and Communication Technology on the Re-Engineering Process of Small and Medium Enterprises

Authors: Hiba Mezaache

Abstract:

The current study aimed to determine the impact of using information and communication technology on the process of re-engineering small and medium enterprises. The world has witnessed the rapid development of such technology in its fields of work and the diversity of its objectives and programs, which makes this process important for the growth and development of the institution, as well as for gaining the flexibility to face changes that may occur in the work environment. In order to determine the impact of information and communication technology on the success of this process, we prepared an electronic questionnaire that included 70 items, and we used the SPSS statistical package to analyze the data obtained. At the end of our study, we concluded that there was a positive correlation between the four dimensions of information and communication technology (hardware and equipment, software, communication networks, and databases) and the re-engineering process, in addition to the fact that the studied institutions attach great importance to formal communication for its positive advantages in reducing the time, effort, and cost of performing business. We can also say that communication technology contributes to formulating objectives related to the re-engineering strategy. Finally, we recommend empowering workers to use information and communication technology more in enterprises, integrating them further into the enterprise's activity by involving them in decision-making, and keeping pace with developments in software, hardware, and technological equipment.

Keywords: information and communication technology, re-engineering, small and medium enterprises, the impact

Procedia PDF Downloads 169
6472 A POX Controller Module to Collect Web Traffic Statistics in SDN Environment

Authors: Wisam H. Muragaa, Kamaruzzaman Seman, Mohd Fadzli Marhusin

Abstract:

Software Defined Networking (SDN) is a new networking paradigm designed to facilitate managing, measuring, debugging and controlling the network dynamically, and to make it suitable for modern applications. Generally, measurement methods can be divided into two categories: active and passive. An active measurement method injects test packets into the network in order to monitor their behaviour (the ping tool is an example), while a passive method monitors existing traffic for the purpose of deriving measurement values. Both are useful for collecting traffic statistics and monitoring network traffic. Although there has been work on measuring traffic statistics in an SDN environment, it was only meant for measuring packet and byte rates for non-web traffic. In this study, a feasible method is designed to measure the number of packets and bytes in a given time window and to facilitate obtaining statistics for both web traffic and non-web traffic. Web traffic refers to HTTP requests at the application layer, while non-web traffic refers to ICMP and TCP requests; this work is thus more comprehensive than previous works. With a module developed on the POX OpenFlow controller, information is collected from each active flow in the OpenFlow switch and presented on the Command Line Interface (CLI) and in the Wireshark interface. The statistics displayed on the CLI and in Wireshark include the type of protocol, the number of bytes and the number of packets, among others. The module also shows, in the same statistics list, the number of flows added to the switch whenever traffic is generated from and to hosts. To carry out this work effectively, our Python module sends a statistics request message to the switch every five seconds, requesting its current port and flow statistics, and the switch replies with the required information in a statistics reply message. Thus, the POX controller is notified of and updated with any changes across the entire network within a very short time. The aim of this study is therefore to prepare a list of the important statistics elements collected from the whole network, to be used in further research, particularly research dealing with the detection of network attacks, such as Distributed Denial of Service (DDoS), that cause a sudden rise in the number of packets and bytes.
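
A minimal sketch of such a POX module is given below, following the structure of the standard POX flow-statistics example: a recurring five-second timer sends statistics request messages, and a FlowStatsReceived handler splits the replies into web (TCP port 80) and non-web totals. The port-80 test and log format are simplifying assumptions; the actual module described above collects a richer statistics list.

```python
from pox.core import core
import pox.openflow.libopenflow_01 as of
from pox.lib.recoco import Timer

log = core.getLogger()

def _request_stats():
    # send an OpenFlow statistics request message to every connected switch
    for conn in core.openflow._connections.values():
        conn.send(of.ofp_stats_request(body=of.ofp_flow_stats_request()))

def _handle_flow_stats(event):
    web_b = web_p = other_b = other_p = 0
    for f in event.stats:  # one entry per active flow in the switch
        if f.match.tp_src == 80 or f.match.tp_dst == 80:  # HTTP as "web"
            web_b, web_p = web_b + f.byte_count, web_p + f.packet_count
        else:
            other_b, other_p = other_b + f.byte_count, other_p + f.packet_count
    log.info("web: %s B/%s pkts  non-web: %s B/%s pkts",
             web_b, web_p, other_b, other_p)

def launch():
    core.openflow.addListenerByName("FlowStatsReceived", _handle_flow_stats)
    Timer(5, _request_stats, recurring=True)  # statistics reply follows
```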

Keywords: mininet, OpenFlow, POX controller, SDN

Procedia PDF Downloads 229
6471 Prediction of Changes in Optical Quality by Tissue Redness after Pterygium Surgery

Authors: Mohd Radzi Hilmi, Mohd Zulfaezal Che Azemin, Khairidzan Mohd Kamal, Azrin Esmady Ariffin, Mohd Izzuddin Mohd Tamrin, Norfazrina Abdul Gaffur, Tengku Mohd Tengku Sembok

Abstract:

Purpose: The purpose of this study is to predict optical quality changes after pterygium surgery using tissue redness grading. Methods: Sixty-eight primary pterygium participants were selected from patients who visited an ophthalmology clinic. We developed a semi-automated computer program to measure pterygium fibrovascular redness from digital pterygium images. The output of this software is a continuous grading scale from 1 (minimum redness) to 3 (maximum redness). The region of interest (ROI) was selected manually using the software. Reliability was determined by repeat grading of all 68 images, and the association of the grading with the contrast sensitivity function (CSF) and visual acuity (VA) was examined. Results: The mean and standard deviation of the redness of the pterygium fibrovascular images were 1.88 ± 0.55. Intra- and inter-grader reliability estimates were high, with intraclass correlations ranging from 0.97 to 0.98. The new grading was positively associated with CSF (p<0.01) and VA (p<0.01), and it was able to predict 25% and 23% of the variance in the CSF and the VA, respectively. Conclusions: The new grading of pterygium fibrovascular redness can be reliably measured from digital images and shows a good correlation with CSF and VA. The redness grading can be used in addition to the existing pterygium grading.
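
The published software's algorithm is not reproduced here, but the general idea of scoring red dominance over a manually selected ROI can be sketched as below; the r/(r+g+b) index and the ROI handling are illustrative assumptions, and mapping such an index onto the 1-3 scale would require calibration against graded images.

```python
import numpy as np
import cv2

def redness_index(image_path, roi):
    """Mean red dominance over a manually chosen ROI; roi = (x, y, w, h)."""
    img = cv2.imread(image_path).astype(float)
    x, y, w, h = roi
    b, g, r = cv2.split(img[y:y + h, x:x + w])  # OpenCV loads images as BGR
    return float(np.mean(r / (r + g + b + 1e-6)))

# idx = redness_index("pterygium.jpg", (120, 80, 200, 150))  # hypothetical ROI
```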

Keywords: contrast sensitivity, pterygium, redness, visual acuity

Procedia PDF Downloads 506
6470 Durability Analysis of a Knuckle Arm Using VPG System

Authors: Geun-Yeon Kim, S. P. Praveen Kumar, Kwon-Hee Lee

Abstract:

A steering knuckle arm is the component that connects the steering system and the suspension system. Structural performances such as stiffness, strength, and durability are considered in its design process. A former study suggested a lightweight design of a knuckle arm considering the structural performances and using metamodel-based optimization: six shape design variables were defined, the optimum design was calculated by applying the kriging interpolation method, and the finite element method was utilized to predict the structural responses. The suggested knuckle was made of the aluminum alloy Al6082, and its weight was reduced by about 60% in comparison with the base steel knuckle while satisfying the design requirements. We then investigated its manufacturability by performing forging analysis; the forging was done as a hot process, and the product was made through two-step forging. As the final step of the development process, the durability is investigated using the flexible dynamic analysis software LS-DYNA and the pre- and post-processor eta/VPG. Generally, a carmaker does not share all of its information with the part manufacturer, so the part manufacturer is limited in predicting durability performance at the full-car level. eta/VPG provides libraries of commonly used parts such as suspensions, tires, and roads, which makes full-car modeling possible. First, the full car is modeled by referencing the following information: overall length 3,595 mm, overall width 1,595 mm, CVW (curb vehicle weight) 910 kg, front suspension MacPherson strut, rear suspension torsion beam axle, tire 235/65R17. Second, the road is selected as cobblestone; the road condition of cobblestone is almost 10 times more severe than that of a usual paved road. Third, dynamic finite element analysis using LS-DYNA is performed to predict the durability performance of the suggested knuckle arm. The life of the suggested knuckle arm is calculated as 350,000 km, which satisfies the design requirement set by the part manufacturer. In this study, the overall design process of a knuckle arm is suggested, and it can be seen that the developed knuckle arm satisfies the durability requirement at the full-car level. The VPG analysis is performed successfully even though it is not an exact prediction, since the full-car model is a rough one; thus, this approach can be used effectively when details of the full car are not given.

Keywords: knuckle arm, structural optimization, Metamodel, forging, durability, VPG (Virtual Proving Ground)

Procedia PDF Downloads 414
6469 Coding Structures for Seated Row Simulation of an Active Controlled Vibration Isolation and Stabilization System for Astronaut’s Exercise Platform

Authors: Ziraguen O. Williams, Shield B. Lin, Fouad N. Matari, Leslie J. Quiocho

Abstract:

Simulation of the seated row exercise was a continuation of work assisting NASA in analyzing a one-dimensional vibration isolation and stabilization system for an astronaut exercise platform. Feedback delay and signal noise were added to the model, as was previously done in the simulation of the squat exercise. Simulation runs for this study were conducted in two tools, Trick and MBDyn, software simulation environments developed at the NASA Johnson Space Center. The exciter force in the simulation was calculated from motion capture of an exerciser during a seated row exercise. The simulation runs include passive control, active control using a Proportional-Integral-Derivative (PID) controller, and active control using a Piecewise Linear Integral Derivative (PWLID) controller. Output parameters include the displacements of the exercise platform, the exerciser, and the counterweight; the force transmitted to the wall of the spacecraft; and the actuator force on the platform. The simulation results showed excellent force reduction in the actively controlled system, far greater than that achieved by the passively controlled system.
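
A one-dimensional toy version of the PID-controlled case can be sketched as below: a mass-spring-damper platform driven by a sinusoidal stand-in for the motion-captured exciter force, with a PID actuator force regulating platform displacement. All masses, gains, and the forcing profile are illustrative assumptions, far simpler than the Trick/MBDyn models.

```python
import numpy as np

m, k, c = 200.0, 8000.0, 150.0     # platform mass, stiffness, damping (assumed)
kp, ki, kd = 5e4, 1e3, 2e3         # PID gains (tuning assumed)
dt, t_end = 1e-3, 10.0
x = v = integ = prev_err = 0.0

for step in range(int(t_end / dt)):
    t = step * dt
    f_ex = 400.0 * np.sin(2 * np.pi * 0.5 * t)  # stand-in seated-row exciter force
    err = -x                                     # drive platform displacement to zero
    integ += err * dt
    f_act = kp * err + ki * integ + kd * (err - prev_err) / dt  # PID actuator force
    prev_err = err
    a = (f_ex + f_act - k * x - c * v) / m       # platform dynamics
    v += a * dt
    x += v * dt

print(f"final displacement: {x:.2e} m")          # transmitted force ~ k*x + c*v
```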

Keywords: control, counterweight, isolation, vibration

Procedia PDF Downloads 130
6468 Data Clustering in Wireless Sensor Network Implemented on Self-Organization Feature Map (SOFM) Neural Network

Authors: Krishan Kumar, Mohit Mittal, Pramod Kumar

Abstract:

A wireless sensor network is one of the most promising communication networks for monitoring remote environmental areas. In such a network, all the sensor nodes communicate with each other via radio signals. The sensor nodes have the capability of sensing, data storage, and processing, and they collect information and relay it through neighboring nodes to a particular node using data aggregation techniques. For data aggregation in the sensor network, a clustering technique is implemented by means of a self-organizing feature map (SOFM) neural network. Some of the sensor nodes are selected as cluster head nodes; information is aggregated from the non-cluster-head nodes to the cluster head nodes and then transferred to the base station (or sink node). The aim of this paper is to manage the huge amount of data with the help of the SOFM neural network: clustered data are selected for transfer to the base station instead of the whole body of information aggregated at the cluster head nodes. This reduces battery consumption in managing the huge volume of data, and the network lifetime is enhanced to a great extent.
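
A hedged sketch of the SOFM clustering step, using the open-source MiniSom library on synthetic sensor readings, is shown below; mapping each node to its best-matching SOM cell yields the clusters, and the choice of the first member as cluster head is a placeholder for the paper's actual head-selection rule.

```python
import numpy as np
from minisom import MiniSom  # pip install minisom

rng = np.random.default_rng(7)
readings = rng.random((100, 3))  # e.g., (temperature, humidity, light) per node

som = MiniSom(3, 3, input_len=3, sigma=1.0, learning_rate=0.5, random_seed=7)
som.train_random(readings, 500)

# map each sensor node to its best-matching SOM cell => up to 9 clusters
clusters = {}
for node_id, vec in enumerate(readings):
    clusters.setdefault(som.winner(vec), []).append(node_id)

for cell, members in clusters.items():
    # first member as placeholder cluster head; only cluster heads would
    # forward aggregated data to the base station
    print(cell, "head:", members[0], "size:", len(members))
```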

Keywords: artificial neural network, data clustering, self organization feature map, wireless sensor network

Procedia PDF Downloads 507
6467 Predicting Success and Failure in Drug Development Using Text Analysis

Authors: Zhi Hao Chow, Cian Mulligan, Jack Walsh, Antonio Garzon Vico, Dimitar Krastev

Abstract:

Drug development is resource-intensive, time-consuming, and increasingly expensive with each developmental stage. The success rates of drug development are also relatively low, and the resources committed are wasted with each failed candidate. As such, a reliable method of predicting the success of drug development is in demand. The hypothesis was that some failed drug candidates are pushed through developmental pipelines based on false confidence and may possess common linguistic features identifiable through sentiment analysis. Here, the concept of using text analysis to discover such features in research publications and investor reports as predictors of success was explored. RStudio was used to perform text mining and lexicon-based sentiment analysis to identify affective phrases and determine their frequency in each document, and SPSS was then used to determine the relationship between our defined variables and the accuracy of predicting outcomes. A total of 161 publications were collected and categorised into 4 groups: (i) cancer treatment, (ii) neurodegenerative disease treatment, (iii) vaccines, and (iv) others (containing all other drugs that do not fit into the first three categories). Text analysis was then performed on each document, using 2 separate lexicons (BING and AFINN) within each category of drugs, to determine the frequency of positive or negative phrases in each document. Relative positivity and negativity values were then calculated by dividing the frequency of such phrases by the word count of each document. Regression analysis was then performed with the SPSS statistical software on each dataset (values obtained using the BING or AFINN lexicon during text analysis), using a random selection of 61 documents to construct a model; the remaining documents were used to determine the predictive power of the models. The model constructed from BING predicts the outcome of drug performance in clinical trials with an overall accuracy of 65.3%. The AFINN model had a lower accuracy in predicting outcomes than the BING model, at 62.5%, and was not effective at predicting the failure of drugs in clinical trials. Overall, the study did not show significant efficacy of the models at predicting the outcomes of drugs in development; many improvements may need to be made in later iterations to sufficiently increase the accuracy.
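
The lexicon-scoring step can be reproduced outside R; the sketch below uses NLTK's opinion_lexicon, which packages the same Bing Liu word list as the "bing" lexicon named above, and computes the relative positivity and negativity exactly as defined above (matched-word frequency divided by word count). The sample sentence is invented.

```python
import re
import nltk
from nltk.corpus import opinion_lexicon

nltk.download("opinion_lexicon", quiet=True)
pos = set(opinion_lexicon.positive())  # Bing Liu positive word list
neg = set(opinion_lexicon.negative())  # Bing Liu negative word list

def relative_sentiment(text):
    """(relative positivity, relative negativity) = lexicon hits / word count."""
    tokens = re.findall(r"[a-z']+", text.lower())
    n = len(tokens) or 1
    return (sum(t in pos for t in tokens) / n,
            sum(t in neg for t in tokens) / n)

print(relative_sentiment("The candidate showed remarkable efficacy and no "
                         "serious adverse events were observed."))
```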

Keywords: data analysis, drug development, sentiment analysis, text-mining

Procedia PDF Downloads 151
6466 A Study of Adaptive Fault Detection Method for GNSS Applications

Authors: Je Young Lee, Hee Sung Kim, Kwang Ho Choi, Joonhoo Lim, Sebum Chun, Hyung Keun Lee

Abstract:

The purpose of this study is to develop an efficient fault detection method for Global Navigation Satellite System (GNSS) applications based on adaptive estimation. Owing to their dependence on radio frequency signals, GNSS measurements are dominated by systematic errors in the receiver's operating environment. Thus, to utilize GNSS for aerospace or ground vehicles requiring a high level of safety, unhealthy measurements should be taken seriously. For this reason, this paper proposes an adaptive fault detection method to deal with unhealthy measurements in various harsh environments. In the proposed method, the test statistic for fault detection is generated from the estimated measurement noise. Pseudorange and carrier-phase measurement noise are obtained at time propagations and measurement updates, respectively, in the process of Carrier-Smoothed Code (CSC) filtering. The performance of the proposed method was evaluated with field-collected GNSS measurements. To evaluate the fault detection capability, intentional faults were added to the measurements. The experimental results show that the proposed detection method is efficient in detecting unhealthy measurements and improves the accuracy of GNSS positioning under fault occurrence.
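
The thresholding idea can be illustrated with the hedged sketch below: a residual test statistic normalized by a noise level estimated adaptively from a sliding window, with an injected fault as in the experiment described above. The window length, threshold, and noise model are illustrative assumptions, and the CSC filtering that would produce the residuals is not shown.

```python
import numpy as np

def detect_faults(residuals, window=20, threshold=3.0):
    """Flag epochs whose residual exceeds threshold times an adaptive sigma."""
    flags = [False] * len(residuals)
    for k in range(window, len(residuals)):
        sigma = np.std(residuals[k - window:k]) + 1e-9  # adaptive noise estimate
        flags[k] = abs(residuals[k]) > threshold * sigma  # test statistic check
    return flags

rng = np.random.default_rng(3)
res = list(rng.normal(0.0, 0.5, 200))   # healthy measurement residuals
res[120] += 8.0                         # intentional fault, as in the experiment
print([i for i, f in enumerate(detect_faults(res)) if f])
```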

Keywords: adaptive estimation, fault detection, GNSS, residual

Procedia PDF Downloads 565
6465 CAD Tool for Parametric Design Modification of Yacht Hull Surface Models

Authors: Shahroz Khan, Erkan Gunpinar, Kemal Mart

Abstract:

Recently, parametric design techniques have become a vital concept in the field of Computer-Aided Design (CAD), helping to provide a sophisticated platform that enables the designer to automate the design process efficiently. In these techniques, the design process starts by parameterizing the important features of design models (typically the key dimensions) and implementing design constraints. The design constraints help retain the overall shape of the model while its parameters are modified. However, initializing an appropriate number of design parameters and constraints is the crucial part of parametric design techniques, especially for complex surface models such as a yacht hull. This paper introduces a method to create complex surface models suited to parametric design techniques, a method to define the right number of parameters and the respective design constraints, and a system to implement design parameters subject to the design constraint schema. In our proposed approach, the design process starts by dividing the yacht hull into three sections. Each section consists of different shape lines, which form the overall shape of the yacht hull. The shape lines are created using cubic Bezier curves, which allow greater design flexibility. Design parameters and constraints are defined on the shape lines in 3D design space to give designers better, individual handling of the parameters. Afterwards, shape modifiers are developed, which allow the modification of each parameter while satisfying the respective set of criteria and design constraints: geometric continuities should be maintained between the shape lines of the three sections, fairness of the hull surfaces should be preserved after modification, and during design modification the effect of a single parameter on the other parameters should be negligible. The constraints are defined individually on the shape lines of each section and mutually between the shape lines of two connecting sections. In order to validate and visualize the design results of our shape modifiers, a real-time graphical interface was created.
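
A minimal sketch of the cubic Bezier building block named above is given below: the four control points act as the design parameters of one shape line, and the closing comment notes how a G1 continuity constraint between adjacent sections constrains them. The control-point values are hypothetical.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=50):
    """Evaluate a cubic Bezier curve at n parameter values t in [0, 1]."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# hypothetical 3D control points for one hull-section shape line
p0, p1, p2, p3 = map(np.array, ([0, 0, 0], [2, 1.5, 0.4],
                                [5, 2.0, 0.9], [8, 2.2, 1.2]))
curve = cubic_bezier(p0, p1, p2, p3)   # (50, 3) array of surface-line points

# G1 continuity with the next section's shape line: its first leg must be
# collinear with (p3 - p2), i.e. next_p1 = p3 + alpha * (p3 - p2), alpha > 0
```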

Keywords: design parameters, design constraints, shape modifiers, yacht hull

Procedia PDF Downloads 295
6464 Proposal for a Generic Context Meta-Model

Authors: Jaouadi Imen, Ben Djemaa Raoudha, Ben Abdallah Hanene

Abstract:

Access to relevant information that is adapted to users' needs, preferences, and environment is a challenge in many running applications, and this has led to the emergence of context-aware systems. To facilitate the development of this class of applications, these applications should share a common context meta-model. In this article, we present our context meta-model, defined using the OMG Meta Object Facility (MOF). This meta-model is based on the analysis and synthesis of the context concepts proposed in the literature.

Keywords: context, meta-model, MOF, awareness system

Procedia PDF Downloads 552
6463 Effect of Substrate Temperature on Structure and Properties of Sputtered Transparent Conducting Film of La-Doped BaSnO₃

Authors: Alok Tiwari, Ming Show Wong

Abstract:

Lanthanum (La)-doped barium tin oxide (BaSnO₃) film is an excellent alternative to expensive transparent conducting oxide (TCO) films such as indium tin oxide (ITO). Single-crystal films of La-doped BaSnO₃ have been reported with good conductivity and transparency, but to improve their accessibility it is important to grow doped BaSnO₃ films on an inexpensive substrate. La-doped BaSnO₃ thin films have been grown on quartz substrates by radio frequency (RF) sputtering at different substrate temperatures (from 200°C to 750°C). The thickness of the films varied from 360 nm to 380 nm with substrate temperature. The structure and the optical and electrical properties have been studied. The carrier concentration decreases as the substrate temperature is raised, while the mobility increases up to 9.3 cm²/V·s. At low substrate temperature the resistivity was low (< 3×10⁻³ ohm-cm), while a sharp rise was observed as the substrate temperature increased, and the trend continued at higher temperatures. The optical transmittance improves with higher substrate temperature, from 70% at 200°C to over 80% at 750°C. Overall, the changes in the microstructure and the electrical and optical properties of the thin films with varying substrate temperature have been reported successfully.

Keywords: conductivity, perovskite, mobility, TCO film

Procedia PDF Downloads 157