Search results for: forward collision probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2610

1980 Implementation of Model Reference Adaptive Control in Tuning of Controller Gains for Following-Vehicle System with Fixed Time Headway

Authors: Fatemeh Behbahani, Rubiyah Yusof

Abstract:

To avoid collisions between a following vehicle and the vehicle ahead, it is vital to maintain an appropriate, safe spacing between the two vehicles at all speeds. The following vehicle therefore needs accurate information on the speed of, and spacing to, the vehicle in front. This project simulates the tuning of the controller gain for a vehicle-following system using a selected control strategy, a spacing control policy and a fixed-time-headway policy. In addition, the paper designs and simulates an adaptive gain controller for a road-vehicle-following system that uses the spacing, velocity and acceleration of the preceding vehicle in the proposed one-vehicle look-ahead strategy. The mathematical model is derived from Kirchhoff’s and Newton’s laws, and its stability is simulated. A trial-and-error method was initially used to obtain a suitable controller gain, whereas the adaptive controller optimizes the gain value automatically. Model Reference Adaptive Control (MRAC) is designed and applied using, first, the gradient approach and, second, the Lyapunov approach, which explicitly accounts for stability. The gradient approach was found to yield the best controller gain for the system with fixed time headway.
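As a rough illustration of the gradient form of MRAC described in this abstract, the sketch below adapts a single feedforward gain for a hypothetical first-order plant. The plant and reference-model constants, the adaptation gain, and the square-wave command are all invented for illustration and are not the paper's vehicle-following model.

```python
def simulate_mrac(gamma=2.0, dt=0.001, t_end=30.0):
    """Gradient (MIT-rule) MRAC sketch: adapt a feedforward gain theta so a
    first-order plant tracks a reference model. All constants are hypothetical."""
    a, b = 1.0, 0.5          # plant: dy/dt = -a*y + b*u (b unknown in practice)
    am, bm = 1.0, 1.0        # reference model: dym/dt = -am*ym + bm*r
    y = ym = theta = 0.0
    errors = []
    t = 0.0
    while t < t_end:
        r = 1.0 if int(t) % 4 < 2 else -1.0    # square-wave reference command
        u = theta * r                          # adjustable feedforward gain
        e = y - ym
        # MIT rule: d(theta)/dt = -gamma * e * ym (gradient descent on e^2/2)
        theta += -gamma * e * ym * dt
        y += (-a * y + b * u) * dt             # Euler integration of the plant
        ym += (-am * ym + bm * r) * dt         # ... and of the reference model
        errors.append(abs(e))
        t += dt
    return theta, errors
```

With matched pole (a = am), the ideal gain is bm/b = 2.0, and the adaptation law converges toward it automatically, which is the behavior contrasted with trial-and-error tuning in the abstract.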

Keywords: one-vehicle look-ahead, model reference adaptive, stability, tuning gain controller, MRAC

Procedia PDF Downloads 238
1979 Consequences of Inadequate Funding in Nigerian Educational System

Authors: Sylvia Nkiru Ogbuoji

Abstract:

This paper discusses the consequences of inadequate funding in the Nigerian education system. It briefly explains the meaning of education in this context and identifies the various ways education in Nigeria can be funded. It highlights some of the consequences of an inadequately funded education system, including inadequate facilities for teaching and learning, brain drain to Western countries, unemployment, poverty, and low staff morale. Finally, some recommendations are put forward: the government should increase the annual budget allocation to education in order to achieve educational objectives, and it should monitor the utilization of allocated funds to minimize embezzlement.

Keywords: consequences, corruption, education, funding

Procedia PDF Downloads 454
1978 Optical Fiber Data Throughput in a Quantum Communication System

Authors: Arash Kosari, Ali Araghi

Abstract:

A mathematical model of an optical-fiber communication channel is developed, yielding an expression for the throughput and loss of the corresponding link. The data are assumed to be transmitted using separate photons with different polarizations. The derived model also shows the dependence of data throughput on the length of the channel and on the depolarization factor. It is observed that absorption of photons affects the throughput more strongly than depolarization does. In addition, the probabilities of depolarization and of absorption of the radiated photons are obtained.
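The length dependence described above can be sketched with a toy exponential-survival model: a photon is either absorbed, or survives but depolarizes, or arrives usable. The absorption and depolarization coefficients below are assumed values for illustration, not the paper's derived expression.

```python
import math

def photon_throughput(length_km, alpha=0.05, delta=0.01):
    """Toy single-photon link budget. alpha: absorption coefficient per km;
    delta: depolarization coefficient per km (both illustrative).
    Returns (P[usable arrival], P[absorbed], P[arrived but depolarized])."""
    p_absorbed = 1.0 - math.exp(-alpha * length_km)
    p_depolarized = math.exp(-alpha * length_km) * (1.0 - math.exp(-delta * length_km))
    p_useful = math.exp(-(alpha + delta) * length_km)   # survives both processes
    return p_useful, p_absorbed, p_depolarized
```

Because alpha > delta here, the absorbed fraction dominates the depolarized fraction at any length, mirroring the abstract's observation that absorption affects throughput more strongly.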

Keywords: absorption, data throughput, depolarization, optical fiber

Procedia PDF Downloads 287
1977 Reducing Accidents Using Text Stops

Authors: Benish Chaudhry

Abstract:

Many accidents today occur because of texting while driving. Given the large distances between destinations in UAE cities, it is difficult for drivers to avoid using, or at least checking, their cellphones, and the road structure offers almost nowhere to stop and text. With the introduction of TEXT STOPs, drivers would be able to stop at designated points for a maximum of one and a half minutes to read or write a message. The stops could be placed about ten minutes of driving apart, at the average speed of the road, so drivers can anticipate the next stop and wait to reply to a text. A user survey indicates that drivers are willing not to text and drive if such a facility is available.

Keywords: transport, accidents, urban planning, road planning

Procedia PDF Downloads 395
1976 Self-Calibration of Fish-Eye Camera for Advanced Driver Assistance Systems

Authors: Atef Alaaeddine Sarraj, Brendan Jackman, Frank Walsh

Abstract:

Tomorrow’s car will be more automated and increasingly connected, and innovative, intuitive interfaces are essential to accompany this functional enrichment. Automotive companies are therefore competing to offer advanced driver assistance systems (ADAS) that provide enhanced navigation, collision avoidance, intersection support and lane keeping. These vision-based functions require an accurately calibrated camera, and achieving such differentiation in ADAS requires sophisticated sensors and efficient algorithms. This paper explores the different calibration methods applicable to vehicle-mounted fish-eye cameras with arbitrary fields of view and defines the first steps towards a self-calibration method that adequately addresses ADAS requirements. In particular, after comparing different camera calibration algorithms against ADAS requirements, we present a self-calibration method that gathers data from unknown scenes while the car is moving, estimates the camera’s intrinsic and extrinsic parameters and corrects the wide-angle distortion. Our solution enables continuous, real-time detection of objects, pedestrians, road markings and other cars. In contrast to other camera calibration algorithms for ADAS, which require pre-calibration, the presented method calibrates the camera in real time without prior knowledge of the scene.
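A minimal sketch of why wide-angle distortion correction matters: under the common equidistant ("f-theta") fisheye model, image radius grows linearly with the incidence angle, so mapping back to a pinhole projection needs a nonlinear correction. The equidistant model and focal length below are assumptions for illustration; the paper does not specify its lens model.

```python
import math

def equidistant_to_pinhole(r_fisheye, f):
    """Map a radial image distance from an assumed equidistant fisheye
    projection (r = f * theta) to the radius an undistorted pinhole
    camera would produce (r = f * tan(theta))."""
    theta = r_fisheye / f          # incidence angle recovered from r = f*theta
    return f * math.tan(theta)     # pinhole radius for the same ray
```

Near the optical axis the two projections almost agree, but towards the image edge the pinhole-equivalent radius grows much faster, which is the distortion a self-calibration pipeline must estimate and undo.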

Keywords: advanced driver assistance system (ADAS), fish-eye, real-time, self-calibration

Procedia PDF Downloads 252
1975 Low Energy Technology for Leachate Valorisation

Authors: Jesús M. Martín, Francisco Corona, Dolores Hidalgo

Abstract:

Landfills present long-term threats to soil, air, groundwater and surface water due to the formation of greenhouse gases (methane and carbon dioxide) and of leachate from decomposing garbage. The composition of leachate differs from site to site and within a landfill, and it changes over time (from weeks to years) because the landfilled waste remains biologically highly active. Mainly, the composition of the leachate depends on factors such as the characteristics of the waste, the moisture content, climatic conditions, the degree of compaction and the age of the landfill. The leachate composition therefore cannot be generalized, and traditional treatment models must be adapted in each case. Although leachate composition is highly variable, what different leachates have in common is hazardous constituents and their potential eco-toxicological effects on human health and on terrestrial ecosystems. Since each leachate has a distinct composition, each landfill or dumping site represents a different type of risk to its environment. Nevertheless, leachates are always characterized by high organic load, high conductivity, heavy metals and ammonia nitrogen. Leachate can affect the current and future quality of water bodies through uncontrolled infiltration. Control and treatment of leachate is therefore one of the biggest issues in the design and management of urban solid waste treatment plants and landfills. This work presents a treatment model carried out "in situ" using a cost-effective novel technology that combines solar evaporation/condensation with forward osmosis. The plant is powered by renewable energies (solar energy, biomass and residual heat), which minimizes the carbon footprint of the process. The final effluent quality is very high, allowing reuse (preferred) or discharge into watercourses. In the particular case of this work, the final effluents will be reused for cleaning and gardening purposes.
A minor semi-solid residual stream is also generated in the process. Owing to its special composition (rich in metals and inorganic elements), this stream will be valorized in ceramic industries to improve the characteristics of the final products.

Keywords: forward osmosis, landfills, leachate valorization, solar evaporation

Procedia PDF Downloads 204
1974 Investigation of the Role of Friction in Reducing Pedestrian Injuries in Accidents at Intersections

Authors: Seyed Abbas Tabatabaei, Afshin Ghanbarzadeh, Mehdi Abidizadeh

Abstract:

Road traffic accidents and their high social and economic costs are among the most fundamental problems challenging transport and traffic experts and providers. One of the most effective countermeasures is to enhance the skid resistance of the road surface. This research studies a case intersection in Ahwaz and the effect of increasing skid resistance on reducing pedestrian injuries in accidents at intersections. A device was developed to measure the coefficient of friction, with rules and procedures closely resembling those of the locked-wheel trailer. The device comprises a steel frame, wheels, a hydration system and a force gauge, whose readings are the device's output. By processing these data and applying the relevant relationships, the relative surface coefficient of friction is obtained. Friction coefficient data were obtained for the current pavement and for the new pavement and plotted; from the graphs, the two situations, and the speed at the moment of collision in each, can be compared. The results show the extent to which increasing the coefficient of friction can affect the severity and number of accidents.
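The mechanism linking skid resistance to collision speed can be illustrated with the textbook locked-wheel braking relation d = v^2 / (2*mu*g); the speeds and friction coefficients below are illustrative, not the study's measurements.

```python
def braking_distance(speed_kmh, mu, g=9.81):
    """Idealized locked-wheel braking distance in meters for a given
    initial speed (km/h) and pavement friction coefficient mu.
    Illustrates why raising skid resistance shortens stopping distance."""
    v = speed_kmh / 3.6            # convert km/h to m/s
    return v * v / (2.0 * mu * g)  # d = v^2 / (2*mu*g)
```

At 50 km/h, raising mu from 0.4 to 0.7 cuts the stopping distance by roughly 40%, which is the kind of effect the study quantifies for its case intersection.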

Keywords: intersection, coefficient of friction, skid resistance, locked wheels, accident, pedestrian

Procedia PDF Downloads 328
1973 Aggregation Scheduling Algorithms in Wireless Sensor Networks

Authors: Min Kyung An

Abstract:

In Wireless Sensor Networks which consist of tiny wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation which collects nearby environmental conditions and aggregates the data to a designated destination, called a sink node. Important issues concerning the data aggregation are time efficiency and energy consumption due to its limited energy, and therefore, the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute the minimum latency schedule, that is, to compute a schedule with the minimum number of timeslots, such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For the problem, the two interference models, the graph model and the more realistic physical interference model known as Signal-to-Interference-Noise-Ratio (SINR), have been adopted with different power models, uniform-power and non-uniform power (with power control or without power control), and different antenna models, omni-directional antenna and directional antenna models. In this survey article, as the problem has proven to be NP-hard, we present and compare several state-of-the-art approximation algorithms in various models on the basis of latency as its performance measure.
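For intuition about what an MLAS schedule computes, here is a minimal greedy latency bound under the graph model on a tree: a parent can receive from only one child per timeslot, and a child transmits only after its own subtree has been aggregated. This is a textbook sketch, not one of the surveyed SINR-model approximation algorithms.

```python
def aggregation_latency(tree, root):
    """Timeslots needed for collision-free aggregation on a rooted tree
    (tree: dict mapping node -> list of children). Children with the
    largest subtree latency are scheduled first, a standard greedy rule."""
    children = tree.get(root, [])
    if not children:
        return 0  # a leaf holds its own reading; no reception needed
    sub = sorted((aggregation_latency(tree, c) for c in children), reverse=True)
    # the i-th scheduled child (1-indexed) finishes no earlier than lat_i + i
    return max(lat + i for i, lat in enumerate(sub, start=1))
```

On a chain of three nodes the sink needs 2 slots, while a star with three leaves needs 3, since the sink can receive only one transmission per slot.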

Keywords: data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional

Procedia PDF Downloads 231
1972 Reliability-Based Life-Cycle Cost Model for Engineering Systems

Authors: Reza Lotfalian, Sudarshan Martins, Peter Radziszewski

Abstract:

The effect of reliability on life-cycle cost, including initial and maintenance cost of a system is studied. The failure probability of a component is used to calculate the average maintenance cost during the operation cycle of the component. The standard deviation of the life-cycle cost is also calculated as an error measure for the average life-cycle cost. As a numerical example, the model is used to study the average life cycle cost of an electric motor.
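The averaging step described above can be sketched with a small Monte Carlo: life-cycle cost is the initial cost plus a repair cost for each failure over the operating cycles, and the standard deviation of the simulated costs serves as the error measure. All parameter values are invented, not the paper's electric-motor data.

```python
import random
import statistics

def lifecycle_cost_mc(p_fail=0.05, n_cycles=100, initial=1000.0,
                      repair=150.0, trials=20000, seed=1):
    """Monte Carlo estimate of the mean and standard deviation of
    life-cycle cost when each operating cycle fails independently
    with probability p_fail. All figures are illustrative."""
    rng = random.Random(seed)
    costs = []
    for _ in range(trials):
        failures = sum(1 for _ in range(n_cycles) if rng.random() < p_fail)
        costs.append(initial + failures * repair)
    return statistics.mean(costs), statistics.stdev(costs)
```

With these numbers, the expected cost is 1000 + 100 * 0.05 * 150 = 1750, and the binomial spread of failures gives a standard deviation near 327.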

Keywords: initial cost, life-cycle cost, maintenance cost, reliability

Procedia PDF Downloads 607
1971 Modern Agriculture and Employment Generation in Nigeria: A Recursive Model Approach

Authors: Ese Urhie, Olabisi Popoola, Obindah Gershon

Abstract:

Several policies and programs initiated to address the challenge of unemployment in Nigeria seem to be inadequate. The desired structural transformation which is expected to absorb the excess labour in the economy is yet to be achieved. The agricultural sector accounts for almost half of the labour force with very low productivity. This could partly explain why the much anticipated structural transformation has not been achieved. A major reason for the low productivity is the fact that the production process is predominantly based on the use of traditional tools. In view of the underdeveloped nature of the agricultural sector, Nigeria still has huge potentials for productivity enhancement through modern technology. Aside from productivity enhancement, modern agriculture also stimulates both backward and forward linkages that promote investment and thus generate employment. Contrary to the apprehension usually expressed by many stake-holders about the adoption of modern technology by labour-abundant less-developed countries, this study showed that though there will be job loss initially, the reverse will be the case in the long-run. The outcome of this study will enhance the understanding of all stakeholders in the sector and also encourage them to adopt modern techniques of farming. It will also aid policy formulation at both sectoral and national levels. The recursive model and analysis adopted in the study is useful because it exhibits a unilateral cause-and-effect relationship which most simultaneous equation models do not. It enables the structural equations to be ordered in such a way that the first equation includes only predetermined variables on the right-hand side, while the solution for the final endogenous variable is completely determined by all equations of the system. The study examines the transmission channels and effect of modern agriculture on agricultural productivity and employment growth in Nigeria, via its forward and backward linkages. 
Using time series data spanning 1980 to 2014, the analyses show: (i) a significant positive relationship between agricultural productivity growth and modern agriculture; (ii) a significant negative relationship between the export price index and agricultural productivity growth; (iii) a significant positive relationship between exports and investment; and (iv) a significant positive relationship between investment and employment growth. The unbalanced growth theory would therefore be a good strategy for developing countries such as Nigeria to adopt.
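The equation-by-equation estimation that a recursive (triangular) system permits can be sketched on synthetic data: because each equation's right-hand side contains only predetermined variables, ordinary least squares can be applied to each equation in order. The two-equation chain and all coefficients below are hypothetical, not the study's estimates.

```python
import random

def ols(x, y):
    """Single-regressor OLS: returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Synthetic data for a toy two-equation recursive system:
# productivity depends on modernization; employment depends on productivity.
rng = random.Random(0)
modern = [rng.uniform(0, 10) for _ in range(200)]
prod = [1.0 + 0.8 * m + rng.gauss(0, 0.3) for m in modern]
emp = [2.0 + 0.5 * p + rng.gauss(0, 0.3) for p in prod]

# Triangular structure: estimate equation 1, then equation 2, by OLS.
a1, b1 = ols(modern, prod)
a2, b2 = ols(prod, emp)
```

The recovered slopes track the true values (0.8 and 0.5), illustrating the unilateral cause-and-effect ordering the abstract emphasizes.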

Keywords: employment, modern agriculture, productivity, recursive model

Procedia PDF Downloads 269
1970 A Theoretical Hypothesis on Ferris Wheel Model of University Social Responsibility

Authors: Le Kang

Abstract:

Given the nature of the university as a free and responsible academic community, USR rests on a different foundation, academic responsibility, so the Pyramid and IC models of CSR cannot fully explain the most distinctive feature of USR. This paper puts forward a new model, the Ferris Wheel Model, to illustrate the nature of USR and the process of achieving it. The Ferris Wheel Model of USR shows that the university creates a balanced, fair and neutral systemic structure for bearing social responsibilities, which allows the organization to obtain a synergistic effect and thereby serve the broader interests of stakeholders and wider social responsibilities.

Keywords: USR, achievement model, ferris wheel model, social responsibilities

Procedia PDF Downloads 725
1969 The Consumer's Behavior of Bakery Products in Bangkok

Authors: Jiraporn Weenuttranon

Abstract:

The objectives of this study of consumer behavior regarding bakery products in Bangkok are to examine that behavior, to study the essential factors that could affect it, and to develop recommendations for the improvement of bakery products. This is survey research whose population consists of buyers of bakery products in Bangkok; a probability sample of 400 was drawn. The research uses a questionnaire administered with the aid of information technology, and the instrument achieved a reliability coefficient of 0.71. The data are analyzed using percentages, means and standard deviations, and the hypotheses are tested using the chi-square test.
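The hypothesis-testing step can be sketched with a hand-computed Pearson chi-square statistic; the contingency table below (a hypothetical cross-tabulation of a consumer trait against purchase behavior) is made up for illustration and is not the survey's data.

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table,
    comparing observed counts against expected counts under independence."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    return sum((table[i][j] - row[i] * col[j] / total) ** 2
               / (row[i] * col[j] / total)
               for i in range(len(row)) for j in range(len(col)))

# Hypothetical 2x2 table: two consumer groups vs. daily bakery purchase (yes/no)
stat = chi_square_stat([[60, 40], [30, 70]])
# df = (2-1)*(2-1) = 1; reject independence at the 5% level if stat > 3.841
```

Here the statistic is about 18.2, well past the 3.841 critical value, so in this toy example the association would be judged significant.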

Keywords: consumer, behavior, bakery, standard deviation

Procedia PDF Downloads 483
1968 The Work System Method for Designing Knowledge Mobilization Projects

Authors: Chihab Benmoussa

Abstract:

Could the Work System Approach (WSA) function as a framework for designing high-impact knowledge mobilization systems? This paper puts forward arguments in favor of the applicability of WSA to knowledge mobilization design, based on evidence from practical research. Normative approaches for practitioners are greatly needed, especially in the field of knowledge management (KM), given the abysmal rate of disappointment and failure of KM projects. The paper contrasts knowledge management with knowledge mobilization, presents the WSA, and shows how the WSA’s concepts and ideas fit the approach adopted by a multinational company in designing a successful knowledge mobilization initiative.

Keywords: knowledge management, knowledge mobilizations, work system method

Procedia PDF Downloads 524
1967 Research on the Environmental Assessment Index of Brownfield Redevelopment in Taiwan: A Case Study on Formosa Chemicals and Fibre Corporation, Changhua Branch

Authors: Min-Chih Yang, Shih-Jen Feng, Bo-Tsang Li

Abstract:

The concept of “brownfield” has been developed for nearly 35 years, since it was put forward in the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) of the USA in 1980 to address the soil contamination of old industrial land, and many countries have since put forward related policies and research. In Taiwan, however, a country that has been industrializing for 60 years, the concept is still in its infancy, which has slowed the development of brownfield research and policy there; to build a foundation for brownfield development, Taiwan must draw on the experience and research of other countries. Brownfields fall into four categories: 1. contaminated land; 2. derelict land; 3. vacant land; 4. previously developed land. This study focuses on, and investigates in depth, vacant and contaminated land. The subject of this study is the Changhua branch of Formosa Chemicals & Fibre Corporation in Taiwan, which operated for nearly 50 years and contributed greatly to the local economy. However, the toxic waste and sewage drained regularly or occasionally from the factory have seriously damaged the environment. There are three pollution factors: 1. the environmental toxicant carbon disulfide, released during production, together with volatile gases that are hard to monitor; 2. waste and exhaust gas leakage caused by outdated equipment; 3. wastewater discharge that has seriously damaged the ecology of the Dadu River estuary. Because of these impacts, the factory has now been closed and relocated, creating an opportunity for the contaminated land to be redeveloped.
We therefore collect information on brownfield management experience and policies in different countries as background for investigating current Taiwanese brownfield redevelopment issues and for building an environmental assessment framework. Based on this framework, we aim to set environmental assessment indexes for the Changhua branch of Formosa Chemicals & Fibre Corporation. By examining the theory and the environmental pollution factors, we will carry out in-depth analysis and an expert questionnaire to define those indexes and provide an exemplar for future brownfield redevelopment and remediation in Taiwan.

Keywords: brownfield, industrial land, redevelopment, assessment index

Procedia PDF Downloads 402
1966 Reliability-Simulation of Composite Tubular Structure under Pressure by Finite Elements Methods

Authors: Abdelkader Hocine, Abdelhakim Maizia

Abstract:

The exponential growth in the use of fiber-reinforced composite materials has prompted researchers to step up their work on predicting their reliability. Owing to differences in the properties of the constituent materials, the manufacturing processes, the load combinations and the types of environment, predicting the reliability of composite materials has become a primary task. Using the Tsai-Wu and maximum-stress failure criteria, this paper addresses the reliability of multilayer tubular structures under pressure, estimating the failure probability by the Monte Carlo method.
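The Monte Carlo estimation of failure probability can be sketched with a simplified scalar limit state (random load vs. random strength), standing in for the Tsai-Wu and maximum-stress criteria applied to the laminate; all distribution parameters below are illustrative.

```python
import random

def failure_probability_mc(pressure_mean=20.0, pressure_sd=2.0,
                           strength_mean=30.0, strength_sd=3.0,
                           trials=100000, seed=42):
    """Monte Carlo failure probability for a toy limit state: the structure
    fails when a sampled load stress reaches a sampled strength. A stand-in
    for evaluating a composite failure criterion per sampled realization."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(trials):
        load = rng.gauss(pressure_mean, pressure_sd)
        strength = rng.gauss(strength_mean, strength_sd)
        if load >= strength:
            fails += 1
    return fails / trials
```

For normal load and strength, the exact answer here is Phi(-10 / sqrt(2^2 + 3^2)), about 0.3%, so the simulation should land in that neighborhood.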

Keywords: composite, design, monte carlo, tubular structure, reliability

Procedia PDF Downloads 465
1965 Overview of Risk Management in Electricity Markets Using Financial Derivatives

Authors: Aparna Viswanath

Abstract:

Electricity spot prices are highly volatile under optimal generation capacity scenarios due to factors such as non-storability of electricity, peak demand at certain periods, generator outages, fuel uncertainty for renewable energy generators, huge investments and time needed for generation capacity expansion etc. As a result market participants are exposed to price and volume risk, which has led to the development of risk management practices. This paper provides an overview of risk management practices by market participants in electricity markets using financial derivatives.
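The basic price-risk hedge with a forward contract can be sketched as follows; the prices, volumes and hedge ratio are illustrative, and real contracts add complications (margining, load shaping, volume risk) that this ignores.

```python
def hedged_cost(spot_price, forward_price, demand_mwh, hedge_fraction):
    """Total procurement cost when hedge_fraction of the demand was locked
    in at forward_price and the remainder is bought at the spot price.
    A stylized one-period example of forward hedging."""
    hedged = hedge_fraction * demand_mwh * forward_price
    unhedged = (1.0 - hedge_fraction) * demand_mwh * spot_price
    return hedged + unhedged
```

A fully hedged buyer's cost is independent of the spot price, while a partial hedge leaves proportional exposure to spot volatility, which is the trade-off market participants manage with derivatives.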

Keywords: financial derivatives, forward, futures, options, risk management

Procedia PDF Downloads 480
1964 Reliability Analysis of a Fuel Supply System in Automobile Engine

Authors: Chitaranjan Sharma

Abstract:

The present paper analyzes the fuel supply system of a four-wheeler automobile engine that can run on either of two fuels, petrol or CNG. Since CNG is cheaper than petrol, priority is given to consuming CNG. An automatic switch starts the petrol supply upon failure of the CNG supply. Using the regenerative point technique with a Markov renewal process, reliability characteristics useful to system designers are obtained.
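A much-simplified stand-in for the paper's regenerative-point analysis: the steady-state availability of a primary supply (CNG) with one cold standby (petrol) behind an automatic switch, modeled as a three-state birth-death Markov chain. The failure rate lam and repair rate mu are assumed values, not the paper's data.

```python
def standby_availability(lam, mu):
    """Steady-state availability of a primary unit with one cold standby
    and a single repair facility. States count failed units (0, 1, 2);
    balance equations give p1 = rho*p0 and p2 = rho^2*p0 with rho = lam/mu."""
    rho = lam / mu
    p0 = 1.0 / (1.0 + rho + rho ** 2)   # both supplies available
    p1 = rho * p0                       # CNG failed, running on petrol
    p2 = rho ** 2 * p0                  # both failed: engine starved
    return 1.0 - p2
```

With lam = 0.1 and mu = 1.0 the standby arrangement is available about 99.1% of the time, versus about 90.9% (mu / (lam + mu)) for a single unrepaired-backup supply, quantifying the benefit of the automatic switch.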

Keywords: reliability, redundancy, repair time, transition probability, regenerative points, Markov renewal process

Procedia PDF Downloads 551
1963 Recombination Rate Coefficients for NIII and OIV Ions

Authors: Shahin A. Abdel-Naby, Asad T. Hassan

Abstract:

Electron-ion recombination data are needed for plasma modeling. The recombination processes include radiative recombination (RR), dielectronic recombination (DR), and trielectronic recombination (TR). When a free electron is captured by an ion with simultaneous excitation of its core, a doubly-excited intermediate state may be formed. The doubly excited state relaxes either by electron emission (autoionization) or by radiative decay (photon emission). The DR process takes place when the relaxation occurs to a bound state by photon emission. Reliable laboratory astrophysics data (theory and experiment) for DR rate coefficients are needed to determine the charge state distribution in photoionized sources such as X-ray binaries and active galactic nuclei. DR rate coefficients for NIII and OIV ions are calculated using the state-of-the-art multi-configuration Breit-Pauli atomic structure AUTOSTRUCTURE collisional package within the generalized collisional-radiative framework. Level-resolved calculations for RR and DR rate coefficients from the ground and metastable initial states are produced in an intermediate coupling scheme associated with Δn = 0 (2→2) and Δn = 1 (2→3) core-excitations. DR cross sections for these ions are convoluted with the experimental electron-cooler temperatures to produce DR rate coefficients. Good agreement is found between these rate coefficients and the experimental measurements performed at the CRYRING heavy-ion storage ring for both ions.

Keywords: atomic data, atomic process, electron-ion collision, plasmas

Procedia PDF Downloads 153
1962 VaR or TCE: Explaining the Preferences of Regulators

Authors: Silvia Faroni, Olivier Le Courtois, Krzysztof Ostaszewski

Abstract:

While a lot of research concentrates on the merits of VaR and TCE, which are the two most classic risk indicators used by financial institutions, little has been written on explaining why regulators favor the choice of VaR or TCE in their set of rules. In this paper, we investigate the preferences of regulators with the aim of understanding why, for instance, a VaR with a given confidence level is ultimately retained. Further, this paper provides equivalence rules that explain how a given choice of VaR can be equivalent to a given choice of TCE. Then, we introduce a new risk indicator that extends TCE by providing a more versatile weighting of the constituents of probability distribution tails. All of our results are illustrated using the generalized Pareto distribution.
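The closed-form link between VaR and TCE for the generalized Pareto distribution can be sketched directly. For a GPD loss with threshold 0, scale sigma and shape 0 < xi < 1, the mean-excess function gives TCE_p = (VaR_p + sigma) / (1 - xi); the parameter values in the example are illustrative.

```python
def gpd_var(p, sigma, xi):
    """Value-at-Risk at confidence level p for a generalized Pareto loss
    (threshold 0, scale sigma, shape xi != 0): the p-quantile of the GPD."""
    return (sigma / xi) * ((1.0 - p) ** (-xi) - 1.0)

def gpd_tce(p, sigma, xi):
    """Tail conditional expectation (expected loss beyond VaR_p), finite
    only for xi < 1, via the GPD mean-excess function:
    TCE_p = VaR_p + (sigma + xi*VaR_p)/(1 - xi) = (VaR_p + sigma)/(1 - xi)."""
    return (gpd_var(p, sigma, xi) + sigma) / (1.0 - xi)
```

Such closed forms are what make equivalence rules possible: for a given sigma and xi, one can solve for the VaR confidence level whose value matches a prescribed TCE, so a regulator's choice of one indicator implicitly fixes a calibration of the other.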

Keywords: generalized pareto distribution, generalized tail conditional expectation, regulator preferences, risk measure

Procedia PDF Downloads 172
1961 Numerical Simulation on Airflow Structure in the Human Upper Respiratory Tract Model

Authors: Xiuguo Zhao, Xudong Ren, Chen Su, Xinxi Xu, Fu Niu, Lingshuai Meng

Abstract:

Respiratory diseases such as asthma, emphysema and bronchitis are connected with air pollution, and their incidence tends to increase, which may be attributable to toxic aerosol deposition in the human upper respiratory tract or in the bifurcations of the human lung. Therapy for these diseases mostly uses pharmaceuticals in aerosol form delivered into the upper respiratory tract or the lung. Understanding the airflow structures in the human upper respiratory tract plays a very important role in analyzing the “filtering” effect in the pharynx/larynx and in obtaining correct air-particle inlet conditions for the lung, and numerical simulation based on CFD (Computational Fluid Dynamics) technology is well suited to studying these structures. In this paper, a representative human upper respiratory tract is built and CFD is used to investigate the air movement characteristics within it. The airflow movement, its effect on the shear stress distribution, and the probability of wall injury caused by the shear stress are discussed. Experimentally validated computational fluid-aerosol dynamics results showed the following: airflow separation appears near the outer walls of the pharynx and the trachea; a high-velocity zone is created near the inner wall of the trachea; the airflow splits at the divider, and a new boundary layer is generated at the inner wall downstream of the bifurcation, with high velocity near the inner wall of the trachea; and the maximum velocity appears at the exterior of the boundary layer. The secondary swirls and the axial velocity distribution produce high shear stress on the inner walls of the trachea and the bifurcation, finally leading to inner wall injury.
Increased breathing intensity increases the shear stress acting on the inner walls of the trachea and the bifurcation. If a person maintains high breathing intensity for a long time, not only does the capacity for gas transport and regulation through the trachea and the bifurcation fall, but the probability of wall strain and tissue injury also increases.

Keywords: airflow structure, computational fluid dynamics, human upper respiratory tract, wall shear stress, numerical simulation

Procedia PDF Downloads 248
1960 Integrating Technology in Teaching and Learning Mathematics

Authors: Larry Wang

Abstract:

The aim of this paper is to demonstrate how an online homework system is integrated into teaching and learning mathematics and how it improves student success rates in some gateway mathematics courses. WeBWorK, provided by the Mathematical Association of America, is adopted as the online homework system. During the period 2010-2015, the system was implemented in classes of precalculus, calculus, probability and statistics, discrete mathematics, linear algebra, and differential equations. As a result, the passing rates of the sections with WeBWorK were well above those of sections without WeBWorK (about 7-10% higher). The paper also shows how the WeBWorK system was used.

Keywords: gateway mathematics, online grading, pass rate, WeBWorK

Procedia PDF Downloads 299
1959 Alphabet Recognition Using Pixel Probability Distribution

Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay

Abstract:

Our project topic is “Alphabet Recognition Using Pixel Probability Distribution”. The project uses techniques from image processing and machine learning in computer vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR is sometimes used for signature recognition in banks and other high-security buildings; popular mobile applications include reading a visiting card and storing it directly to the contacts; and OCR is used in radar systems for reading speeding vehicles' license plates, among many other applications. Our project was implemented using Visual Studio and OpenCV (Open Source Computer Vision), with an algorithm based on neural networks (machine learning), in three modules. (1) Training: this module performs database generation, by two methods. (a) Run-time generation creates the database at compilation time using the built-in fonts of the OpenCV library; no human intervention is necessary. (b) Contour detection converts a JPEG template containing different fonts of an alphabet to a weighted matrix using specialized OpenCV functions (contour detection and blob detection); the main advantages of this type of database generation are that the algorithm becomes self-learning and that the final database requires little memory (119 kB precisely). (2) Preprocessing: the input image is pre-processed using image processing techniques such as adaptive thresholding, binarization and dilation, and is made ready for segmentation, which extracts lines, words and letters from the processed text image.
(3) Testing and prediction: the extracted letters are classified and predicted using the neural network algorithm, which recognizes an alphabet based on mathematical parameters calculated from the database and the weight matrix of the segmented image.

Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix

Procedia PDF Downloads 390
1958 Classification of Digital Chest Radiographs Using Image Processing Techniques to Aid in Diagnosis of Pulmonary Tuberculosis

Authors: A. J. S. P. Nileema, S. Kulatunga, S. H. Palihawadana

Abstract:

A computer aided detection (CAD) system was developed for the diagnosis of pulmonary tuberculosis from digital chest X-rays, using MATLAB image processing techniques and a statistical approach. The study comprised 200 digital chest radiographs collected from the National Hospital for Respiratory Diseases, Welisara, Sri Lanka. Pre-processing removed identification details. Lung fields were segmented and then divided into four quadrants (right upper, left upper, right lower, and left lower) using MATLAB image processing techniques. Contrast, correlation, homogeneity, energy, entropy, and maximum probability texture features were extracted using the gray level co-occurrence matrix (GLCM) method. Descriptive statistics and normal distribution analysis were performed using SPSS. Based on the radiologists’ interpretation, chest radiographs were classified manually into PTB-positive (PTBP) and PTB-negative (PTBN) classes. Features with a standard normal distribution were analyzed using an independent-sample t-test between PTBP and PTBN chest radiographs. Among the six features tested, contrast, correlation, energy, entropy, and maximum probability showed a statistically significant difference between the two classes at the 95% confidence level and could therefore be used in classifying chest radiographs for PTB diagnosis.
Using the resulting value ranges of the five normally distributed texture features, a classification algorithm was then defined to recognize and classify the quadrant images. If the texture-feature values of a quadrant fall within the defined region, it is identified as a PTBP (abnormal) quadrant and labeled ‘Abnormal’ in red, with its border highlighted in red; if the values fall outside the defined range, it is identified as PTBN (normal) and labeled ‘Normal’ in blue, with no change to the image outline. The developed classification algorithm showed a high sensitivity of 92%, which makes it an efficient CAD system, with a modest specificity of 70%.
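The six GLCM texture features named above have standard closed forms. As an illustration only (a from-scratch NumPy sketch, not the MATLAB code used in the study), the co-occurrence matrix for one pixel offset and the features derived from it can be computed as:

```python
import numpy as np

def glcm(image, levels=8, dx=1, dy=0):
    # normalized gray-level co-occurrence matrix for the offset (dx, dy)
    P = np.zeros((levels, levels))
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[image[y, x], image[y + dy, x + dx]] += 1
    return P / P.sum()

def texture_features(P):
    levels = P.shape[0]
    i, j = np.indices((levels, levels))
    eps = 1e-12                                   # guards against log(0) and /0
    mu_i, mu_j = (i * P).sum(), (j * P).sum()
    si = np.sqrt(((i - mu_i) ** 2 * P).sum())
    sj = np.sqrt(((j - mu_j) ** 2 * P).sum())
    return {
        "contrast": ((i - j) ** 2 * P).sum(),
        "correlation": ((i - mu_i) * (j - mu_j) * P).sum() / (si * sj + eps),
        "homogeneity": (P / (1.0 + np.abs(i - j))).sum(),
        "energy": (P ** 2).sum(),
        "entropy": -(P * np.log2(P + eps)).sum(),
        "max_probability": P.max(),
    }
```

In practice, features from several offsets/angles are averaged before feeding the statistical tests.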

Keywords: chest radiographs, computer aided detection, image processing, pulmonary tuberculosis

Procedia PDF Downloads 127
1957 Pay Per Click Attribution: Effects on Direct Search Traffic and Purchases

Authors: Toni Raurich-Marcet, Joan Llonch-Andreu

Abstract:

This research focuses on the relationship between Search Engine Marketing (SEM) and traditional advertising. The dominant assumption is that SEM does not build brand awareness and acts only within the session, as if it were merely a cost of selling the product. Methodologically, the study uses an experiment designed to measure the billboard effect. The research allowed theoretical and empirical knowledge on digital marketing to be cross-linked. By measuring brand awareness and its improvement, the paper validates that SEM generates retention much as traditional advertising would. This changes the way performance and brand campaigns are split within marketing departments, effectively rebalancing budgets going forward.

Keywords: attribution, performance marketing, SEM, marketplaces

Procedia PDF Downloads 131
1956 Efficient Single Relay Selection Scheme for Cooperative Communication

Authors: Sung-Bok Choi, Hyun-Jun Shin, Hyoung-Kyu Song

Abstract:

This paper proposes a single relay selection scheme for cooperative communication. A decode-and-forward scheme is considered, in which a source node cooperates with a single relay for data transmission. To use the proposed scheme, the source node generates a slightly different, low-complexity pattern signal and broadcasts it. The proposed scheme does not require channel state information between the source node and the relay candidates during relay selection, and can therefore be applied in many settings.

Keywords: relay selection, cooperative communication, df, channel codes

Procedia PDF Downloads 671
1955 Optimum Design of Helical Gear System on Basis of Maximum Power Transmission Capability

Authors: Yasaman Esfandiari

Abstract:

Mechanical engineering has always dealt with transmitting and transforming input power in power trains. One way to achieve this is to use gears to change the magnitude and direction of the torque and the speed. However, the gears must be optimally designed to best achieve these objectives. In this study, helical gear systems are optimized for maximum power transmission capability. Material selection, space restrictions, available manufacturing facilities, the probability of tooth breakage, and tooth wear are taken into account, and the governing equations are derived. Finally, MATLAB code was written to solve the optimization problem, and the results are verified.
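To illustrate the kind of search such a code performs, here is a heavily simplified sketch of ours (not the paper’s model): a spur-gear approximation under an assumed Lewis bending criterion, with the rough form-factor approximation Y ≈ 0.484 − 2.87/z for 20° full-depth teeth. A brute-force search over candidate modules and tooth counts picks the pair that transmits the most power within a space restriction.

```python
import math

def max_power_gear(sigma_allow, b, rpm, modules, teeth_counts, max_pitch_dia):
    """Brute-force search for the (module, teeth) pair transmitting the most power.

    Simplified Lewis bending criterion (illustrative assumption):
      F_t = sigma_allow * b * m * Y   (max tangential force before tooth bending)
      P   = F_t * v                   (power = force x pitch-line velocity)
    sigma_allow in MPa, b and module in mm, max_pitch_dia in mm.
    """
    best = None
    for m in modules:
        for z in teeth_counts:
            if m * z > max_pitch_dia:
                continue                       # space restriction violated
            Y = 0.484 - 2.87 / z               # approximate Lewis form factor
            if Y <= 0:
                continue
            Ft = sigma_allow * b * m * Y       # tangential force, N
            d = m * z / 1000.0                 # pitch diameter, metres
            v = math.pi * d * rpm / 60.0       # pitch-line velocity, m/s
            P = Ft * v                         # transmissible power, W
            if best is None or P > best[0]:
                best = (P, m, z)
    return best                                # (power_W, module_mm, teeth)
```

The actual study optimizes helical gears with wear and breakage constraints, so its objective and constraint set are richer than this sketch.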

Keywords: design, gears, Matlab, optimization

Procedia PDF Downloads 240
1954 Clinical Feature Analysis and Prediction on Recurrence in Cervical Cancer

Authors: Ravinder Bahl, Jamini Sharma

Abstract:

The paper demonstrates an analysis of cervical cancer based on a probabilistic model. It involves techniques for classification and prediction that recognize the typical and diagnostically most important test features relating to cervical cancer. The main contribution of the research is predicting the probability of recurrence in no-recurrence (first-time detection) cases. A combination of conventional statistical and machine learning tools is applied for the analysis. An experimental study with real data demonstrates the feasibility and potential of the proposed approach.
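As a sketch of one such probabilistic classifier (a from-scratch Gaussian naive Bayes chosen by us for illustration; the paper does not specify this model, and the feature values below are made up), a recurrence probability could be estimated from clinical feature vectors like this:

```python
import numpy as np

def fit_gnb(X, y):
    # per-class mean, variance, and prior, under a conditional-independence assumption
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return params

def predict_proba(params, x):
    # log prior + Gaussian log-likelihood per class, then softmax-normalize
    scores = {}
    for c, (mu, var, prior) in params.items():
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        scores[c] = np.log(prior) + ll
    m = max(scores.values())
    exps = {c: np.exp(s - m) for c, s in scores.items()}
    z = sum(exps.values())
    return {c: v / z for c, v in exps.items()}   # class -> probability
```

The returned dictionary gives a calibrated-looking probability per class, matching the paper’s goal of reporting a recurrence probability rather than a hard label.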

Keywords: cervical cancer, recurrence, no recurrence, probabilistic, classification, prediction, machine learning

Procedia PDF Downloads 360
1953 Meitu and the Case of the AI Art Movement

Authors: Taliah Foudah, Sana Masri, Jana Al Ghamdi, Rimaz Alzaaqi

Abstract:

This research project explores the app Meitu, which allows consumers to edit their photos and use a new and popular AI feature that turns any photo into a cartoon-like animated image with beautifying enhancements. Studying this AI app demonstrates how closely AI can develop intricate designs that replicate the output of the human mind. Our goal was to investigate the Meitu app by asking our audience questions about its functionality, their personal feelings about its credibility, and their beliefs about how this app will contribute to the future of AI, both positively and negatively. Their responses were analyzed thoroughly and the results presented as pie charts. Overall, it was concluded that the Meitu app is a powerful step forward for AI, replicating human intelligence and creativity in ways that can either benefit society or do the opposite.

Keywords: AI Art, Meitu, application, photo editing

Procedia PDF Downloads 69
1952 Mood Recognition Using Indian Music

Authors: Vishwa Joshi

Abstract:

The study of mood recognition in music has gained considerable momentum in recent years, with machine learning and data mining techniques, along with many audio features, contributing to the analysis of the relation between mood and music. In this paper, we carry this idea forward and build a system for automatic recognition of the mood underlying audio song clips by mining their audio features. We evaluated several data classification algorithms to learn, train, and test a model describing the moods of these songs, and developed an open-source framework. Before classification, preprocessing and feature-extraction phases are necessary to remove noise and gather features, respectively.
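One of the simplest classifiers such an evaluation could include is k-nearest neighbours over extracted feature vectors. The sketch below is our own illustration, not the paper’s code: the tempo/energy features and mood labels are invented to show the train/predict shape of the approach.

```python
import numpy as np

def knn_predict(train_X, train_y, x, k=3):
    # Euclidean distance from the query clip to every training feature vector
    dists = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(dists)[:k]          # indices of the k closest clips
    votes = [train_y[i] for i in nearest]
    return max(set(votes), key=votes.count)  # majority mood label
```

Real systems replace the toy features with descriptors such as MFCCs, spectral centroid, and tempo, extracted in the preprocessing phase.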

Keywords: music, mood, features, classification

Procedia PDF Downloads 500
1951 Discovering Event Outliers for Drug as Commercial Products

Authors: Arunas Burinskas, Aurelija Burinskiene

Abstract:

On average, ten percent of drugs (commercial products) are not available in pharmacies due to shortage. A shortage event unbalances sales and requires a recovery period that is too long. A critical issue is that pharmacies do not record potential sales transactions during shortage and recovery periods, so the authors suggest estimating outliers during these periods. To shorten the recovery period, the authors suggest predicting average sales per sales day, which helps protect the data from being biased downwards or upwards. The authors use an outlier visualization method across different drugs and apply the Grubbs test for significance evaluation. The researched sample is 100 drugs in a one-month time frame. The authors detected that products with high demand variability had outliers. Among the analyzed drugs (commercial products): (i) high-demand-variability drugs have a one-week shortage period, and the probability of facing a shortage is 69.23%; (ii) mid-demand-variability drugs have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, real data must be set up. Even though some outlier detection methods exist for drug data cleaning, they have not been used to minimize the recovery period once a shortage has occurred. The authors use Grubbs’ test, a real-life data-cleaning method, for outlier adjustment, applied in this paper with a confidence level of 99%. In practice, Grubbs’ test has previously been used to detect outliers in cancer drug data, with positive results reported. The test detects outliers that exceed the boundaries of a normal distribution; the result is a probability that indicates the core data of actual sales.
The outlier test measures the difference between the sample mean and the most extreme data point, relative to the standard deviation. It detects one outlier at a time, with different probabilities, from a data set assumed to be normally distributed. Based on approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with the Grubbs test. The suggested framework is applicable during the shortage event and recovery periods, has practical value, and could be used to minimize the recovery period required after a shortage event occurs.
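The Grubbs procedure the authors apply has a standard form. A minimal sketch follows (assuming SciPy for the Student-t quantile; the sales figures in the usage example are invented, not the study’s drug data): the statistic compares the most extreme observation’s deviation from the mean against a critical value built from the t distribution.

```python
import numpy as np
from scipy import stats

def grubbs_test(data, alpha=0.01):
    """Two-sided Grubbs test for a single outlier.

    Returns (is_outlier, index_of_most_extreme_value).
    alpha=0.01 matches the paper's 99% confidence level.
    """
    x = np.asarray(data, dtype=float)
    n = len(x)
    mean, sd = x.mean(), x.std(ddof=1)         # sample standard deviation
    idx = int(np.argmax(np.abs(x - mean)))     # most extreme observation
    G = abs(x[idx] - mean) / sd                # Grubbs statistic
    # critical value from the Student t distribution
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    G_crit = (n - 1) / np.sqrt(n) * np.sqrt(t ** 2 / (n - 2 + t ** 2))
    return G > G_crit, idx
```

Applied repeatedly (removing each detected outlier and re-testing), this yields the one-outlier-at-a-time cleaning the abstract describes.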

Keywords: drugs, Grubbs' test, outlier, shortage event

Procedia PDF Downloads 135