Search results for: random match probability

2695 Diversity in Finance Literature Revealed through the Lens of Machine Learning: A Topic Modeling Approach on Academic Papers

Authors: Oumaima Lahmar

Abstract:

This paper aims to define a structured topography for finance researchers seeking to navigate the body of knowledge in their study of finance phenomena. To make sense of this body of knowledge, a probabilistic topic modeling approach is applied to 6,000 abstracts of academic articles published in three top finance journals between 1976 and 2020. This approach combines machine learning techniques and natural language processing to statistically identify the connections between research articles and their shared topics, each described by relevant keywords. The topic modeling analysis reveals 35 coherent topics that depict the finance literature well and provide a comprehensive structure for the ongoing research themes. Comparing the extracted topics to the Journal of Economic Literature (JEL) classification system, we find significant similarity between the characterizing keywords. On the other hand, we identify topics that do not match the JEL classification despite being relevant in the finance literature.
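
To make the approach concrete, here is a minimal sketch of the probabilistic topic-modeling step (LDA) in Python with scikit-learn; the toy corpus and hyperparameters are placeholders rather than the authors' pipeline, though the topic count matches the 35 reported, and the perplexity named in the keywords is printed as a fit diagnostic.

```python
# Hedged sketch of LDA topic modeling; the corpus below is a stand-in
# for the study's 6,000 journal abstracts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = ["asset pricing and the risk premium puzzle",
             "bank liquidity, credit supply and monetary policy",
             "corporate governance and firm value"]  # placeholder corpus

vectorizer = CountVectorizer(stop_words="english")   # bag-of-words features
X = vectorizer.fit_transform(abstracts)

lda = LatentDirichletAllocation(n_components=35, random_state=0).fit(X)

# Top keywords characterizing each topic
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_[:5]):
    top = [terms[i] for i in weights.argsort()[-10:][::-1]]
    print(f"Topic {k}: {', '.join(top)}")
print("Perplexity:", lda.perplexity(X))  # model-fit diagnostic from the keywords
```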

Keywords: finance literature, textual analysis, topic modeling, perplexity

Procedia PDF Downloads 154
2694 Map Matching Performance under Various Similarity Metrics for Heterogeneous Robot Teams

Authors: M. C. Akay, A. Aybakan, H. Temeltas

Abstract:

Aerial and ground robots offer different advantages for different missions. Aerial robots can move quickly and obtain a broad view of an area, but they cannot carry heavy payloads. Unmanned ground vehicles (UGVs), on the other hand, are slow-moving vehicles but can carry heavier payloads than unmanned aerial vehicles (UAVs). In this context, we investigate the performance of various similarity metrics for providing a common map for a Heterogeneous Robot Team (HRT) in complex environments. Using lidar odometry and octree mapping techniques, local 3D maps of the environment are gathered. To obtain a common map for the HRT, information-theoretic similarity metrics are exploited. All of these similarity metrics gave accurate results within allowable simulation time and can be used in different types of applications. For a heterogeneous multi-robot team, these methods can be used to match different types of maps.
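
As a sketch of what an information-theoretic map comparison can look like, the snippet below scores the similarity of two local occupancy maps with the Jensen-Shannon distance; the maps, binning, and metric choice are illustrative assumptions, not the paper's exact pipeline.

```python
# Hedged sketch: information-theoretic similarity of two robots' local maps.
import numpy as np
from scipy.spatial.distance import jensenshannon

def occupancy_histogram(grid, bins=32):
    """Reduce a 3D occupancy grid (e.g., octree leaves) to a probability histogram."""
    hist, _ = np.histogram(grid.ravel(), bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()

uav_map = np.random.rand(64, 64, 16)   # placeholder local map from the aerial robot
ugv_map = np.random.rand(64, 64, 16)   # placeholder local map from the ground robot

p, q = occupancy_histogram(uav_map), occupancy_histogram(ugv_map)
similarity = 1.0 - jensenshannon(p, q, base=2)  # 1 = identical distributions
print(f"Map similarity: {similarity:.3f}")
```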

Keywords: common maps, heterogeneous robot team, map matching, information-theoretic similarity metrics

Procedia PDF Downloads 152
2693 Principles of Teaching for Successful Intelligence

Authors: Shabnam

Abstract:

The purpose of this study was to examine the importance of successful intelligence in education and its capacity to enhance achievement. A number of studies have tried to apply psychological theories to education, and many emphasize the role of thinking and intelligence. A review of these studies found that many students could learn more effectively than they do if they were taught in a way that better matched their patterns of abilities. Attempts to apply psychological theories to education can falter on the translation of the theory into educational practice. Often, this translation is not clear. Therefore, when a program does not succeed, it is not clear whether the lack of success was due to the inadequacy of the theory or the inadequacy of its implementation. A set of basic principles for translating a theory into practice can help clarify just what an educational implementation should (and should not) look like. Sternberg's theory of successful intelligence (analytical, creative, and practical intelligence) provides a way to create such a match. The results suggest that the theory of successful intelligence supports successful interventions in classrooms and provides a proven model for gifted education. This article presents principles for translating the triarchic theory of successful intelligence into educational practice.

Keywords: successful intelligence, analytical, creative and practical intelligence, achievement, success, resilience

Procedia PDF Downloads 576
2692 Consumer Behavior toward Bakery Products in Bangkok

Authors: Jiraporn Weenuttranon

Abstract:

The objectives of this study of consumer behavior toward bakery products in Bangkok are to examine consumer behavior toward bakery products, to identify the essential factors that could affect that behavior, and to develop recommendations for improving bakery products. This is a survey research study. The population consists of buyers of bakery products in Bangkok, from which a probability sample of 400 was drawn. The research uses a self-administered questionnaire delivered through information technology. The questionnaire achieved a reliability value of 0.71. The data are analyzed using percentages, means, and standard deviations, and the hypotheses are tested using the chi-square test.
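
For illustration, the chi-square hypothesis test named above can be run in Python roughly as follows; the cross-tabulated counts are hypothetical, not the survey's actual data.

```python
# Hedged sketch of a chi-square test of independence on survey data (n = 400).
from scipy.stats import chi2_contingency

# Rows: male, female; columns: buys weekly, monthly, rarely (illustrative counts)
observed = [[60, 90, 50],
            [80, 70, 50]]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: consumer behavior is associated with the factor.")
```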

Keywords: consumer, behavior, bakery, standard deviation

Procedia PDF Downloads 464
2691 Numerical Simulation of Flexural Strength of Steel Fiber Reinforced High Volume Fly Ash Concrete by Finite Element Analysis

Authors: Mahzabin Afroz, Indubhushan Patnaikuni, Srikanth Venkatesan

Abstract:

It is well known that fly ash can be used in high volume as a partial replacement for cement, with beneficial effects on concrete. High volume fly ash (HVFA) concrete is currently emerging as a popular option for strengthening with fibers. Although studies have supported the use of fibers with fly ash, a unified model, incorporated into a finite element software package, for estimating maximum flexural loads has yet to be developed. In this study, a nonlinear finite element analysis of steel fiber reinforced high-strength HVFA concrete beams under static loading was conducted to investigate their failure modes in terms of ultimate load. First, the mechanical properties of high-strength HVFA concrete were investigated experimentally and used to validate the numerical model, developed in ANSYS 16.2 with appropriate element size and mesh. To model the fibers within the concrete, a three-dimensional random fiber distribution was simulated using a spherical coordinate system. Three types of high-strength HVFA concrete beams, reinforced with 0.5, 1, and 1.5% volume fractions of steel fibers with specific mechanical and physical properties, were analyzed. The results reveal that the nonlinear finite element analysis with three-dimensional random fiber orientation shows fairly good agreement with the experimental results for flexural strength, load deflection, and crack propagation mechanism. With this improved model, it is possible to determine the flexural behavior of different types and proportions of steel fiber reinforced HVFA concrete beams under static load. The originality of this paper thus lies in predicting the flexural properties of steel fiber reinforced high-strength HVFA concrete by numerical simulation.
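
The three-dimensional random fiber distribution via a spherical coordinate system can be sketched as below; the sampling scheme (area-uniform directions on the sphere) is a plausible reading of the abstract, and the fiber count is illustrative.

```python
# Hedged sketch: random fiber axes drawn uniformly over the sphere
# via spherical coordinates.
import numpy as np

rng = np.random.default_rng(0)

def random_fiber_directions(n):
    """Uniformly distributed unit vectors (fiber axes)."""
    phi = rng.uniform(0.0, 2.0 * np.pi, n)        # azimuth
    theta = np.arccos(rng.uniform(-1.0, 1.0, n))  # polar angle, area-uniform
    return np.column_stack((np.sin(theta) * np.cos(phi),
                            np.sin(theta) * np.sin(phi),
                            np.cos(theta)))

# e.g., orientations for fibers in a 1% volume-fraction beam model
directions = random_fiber_directions(1000)
print(directions[:3])
```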

Keywords: finite element analysis, high volume fly ash, steel fibers, spherical coordinate system

Procedia PDF Downloads 125
2690 Reliability Simulation of Composite Tubular Structures under Pressure by the Finite Element Method

Authors: Abdelkader Hocine, Abdelhakim Maizia

Abstract:

The exponential growth in the use of fiber-reinforced composite materials has prompted researchers to step up their work on predicting their reliability. Owing to differences in the properties of the materials used in a composite, the manufacturing processes, the load combinations, and the types of environment, predicting the reliability of composite materials has become a primary task. Using the Tsai-Wu and maximum stress failure criteria, this paper studies the reliability of multilayer tubular structures under pressure, where the failure probability is estimated by the Monte Carlo method.
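
A minimal Monte Carlo sketch of the failure-probability estimate follows, reduced to a maximum-stress check with assumed normal distributions for strength and applied stress; the full study uses the Tsai-Wu criterion and layered tubular mechanics, which are omitted here.

```python
# Hedged Monte Carlo sketch of a failure-probability estimate;
# distributions and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Random ply strength and applied hoop stress (MPa), assumed normal
strength = rng.normal(loc=600.0, scale=40.0, size=n)   # material variability
stress = rng.normal(loc=450.0, scale=60.0, size=n)     # load/pressure variability

failures = stress >= strength          # simplified maximum-stress criterion
p_f = failures.mean()                  # Monte Carlo failure probability
print(f"Estimated failure probability: {p_f:.4f}")
```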

Keywords: composite, design, Monte Carlo, tubular structure, reliability

Procedia PDF Downloads 447
2689 Describing Cognitive Decline in Alzheimer's Disease via a Picture Description Writing Task

Authors: Marielle Leijten, Catherine Meulemans, Sven De Maeyer, Luuk Van Waes

Abstract:

For the diagnosis of Alzheimer's disease (AD), a large variety of neuropsychological tests are available. In some of these tests, linguistic processing, both oral and written, is an important factor. Language disturbances might serve as a strong indicator of an underlying neurodegenerative disorder like AD. However, current diagnostic instruments for language assessment mainly focus on product measures, such as text length or number of errors, ignoring the importance of the process that leads to written or spoken language production. In this study, our aim is to describe and test differences between cognitively healthy and cognitively impaired elderly on the basis of a selection of writing process variables (inter- and intrapersonal characteristics). These process variables are mainly related to pause times, because the number, length, and location of pauses have proven to be an important indicator of the cognitive complexity of a process. Method: Participants enrolled in our research were chosen on the basis of a number of basic criteria necessary to collect reliable writing process data. Furthermore, we opted to match the thirteen cognitively impaired patients (8 MCI and 5 AD) with thirteen cognitively healthy elderly. At the start of the experiment, participants were each given a number of tests, such as the Mini-Mental State Examination (MMSE), the Geriatric Depression Scale (GDS), the forward and backward digit span, and the Edinburgh Handedness Inventory (EHI). A questionnaire was also used to collect socio-demographic information (age, gender, education) about the subjects as well as details on their level of computer literacy. The tests and questionnaire were followed by two typing tasks and two picture description tasks. For the typing tasks, participants had to copy (type) characters, words, and sentences from a screen, whereas each picture description task consisted of an image they had to describe in a few sentences. Both the typing and the picture description tasks were logged with Inputlog, a keystroke logging tool that allows us to log and time-stamp keystroke activity in order to reconstruct and describe text production processes. The main rationale behind keystroke logging is that writing fluency and flow reveal traces of the underlying cognitive processes. This explains the analytical focus on pause (length, number, distribution, location, etc.) and revision (number, type, operation, embeddedness, location, etc.) characteristics. As in speech, pause times are seen as indexical of cognitive effort. Results: Preliminary analysis already showed some promising results concerning pause times before, within, and after words. For all variables, mixed effects models were used that included participants as a random effect and MMSE scores, GDS scores, and word categories (such as determiners and nouns) as fixed effects. For pause times before and after words, cognitively impaired patients paused longer than healthy elderly. These variables did not show an interaction effect between group (cognitively impaired or healthy elderly) and word category. However, pause times within words did show an interaction effect, indicating that pause times within certain word categories differ significantly between patients and healthy elderly.
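
A sketch of the mixed-effects analysis described above is given below using statsmodels, with participants as a random effect and group and word category as fixed effects; the data frame is synthetic, and the formula is an assumption about the exact model specification.

```python
# Synthetic sketch of the mixed-effects pause-time analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 520
df = pd.DataFrame({
    "participant": rng.integers(0, 26, n),          # 26 matched participants
    "group": rng.choice(["impaired", "healthy"], n),
    "word_cat": rng.choice(["determiner", "noun"], n),
})
# Impaired participants pause longer (ms), as in the preliminary results
df["pause"] = 250 + 80 * (df["group"] == "impaired") + rng.normal(0, 40, n)

# Participants as random effect; group, word category (and interaction) as fixed
model = smf.mixedlm("pause ~ group * word_cat", df, groups=df["participant"])
print(model.fit().summary())
```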

Keywords: Alzheimer's disease, keystroke logging, matching, writing process

Procedia PDF Downloads 357
2688 Good Supply Chain Management: A Factor for Business Performance

Authors: Irina Canco, Amela Malaj

Abstract:

It is evident that a relationship exists between supply chain management and business performance. Surveys have shown that, in many cases, managers' beliefs and expectations about supply chain management do not match the reality of the business. In this context, the study of supply chain issues is of particular importance and interest, especially in the current period. The economic problems of this period are present in Albania as well. The complexity of the supply chain focuses on order fulfilment. Therefore, this paper pays attention to the impact of supply chain management on business performance. The objective of the paper is to find a relationship between good supply chain management and business performance. This research is based on the results of surveys covering the experience of successful businesses with sustainable supply chain management and its synchronization with the provision of the products and services required by final customers. The study clearly evidenced the impact of the speed of meeting customer requirements on AMAZONA's performance. This was also confirmed mathematically through one of the decision criteria under uncertainty, the Laplace criterion.
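
The Laplace criterion invoked above treats every state of nature as equally likely and picks the alternative with the best average payoff; a small sketch with an illustrative payoff matrix follows.

```python
# Hedged sketch of the Laplace decision criterion under uncertainty.
import numpy as np

# Rows: supply-chain alternatives; columns: demand states (illustrative payoffs)
payoffs = np.array([[120,  80,  40],
                    [100,  95,  70],
                    [ 60,  90, 110]])

expected = payoffs.mean(axis=1)     # equal weights over states of nature
best = int(expected.argmax())
print(f"Laplace expected payoffs: {expected}, choose alternative {best}")
```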

Keywords: supply chain management, AMAZONA, business performance, Laplace criterion

Procedia PDF Downloads 156
2687 Reliability Analysis of a Fuel Supply System in Automobile Engine

Authors: Chitaranjan Sharma

Abstract:

The present paper deals with the analysis of the fuel supply system in a four-wheeler automobile engine that can run on either of two fuels, petrol or CNG. Since CNG is cheaper than petrol, priority is given to consuming CNG rather than petrol. An automatic switch is used to start the petrol supply upon failure of the CNG supply. Using the regenerative point technique with a Markov renewal process, reliability characteristics useful to system designers are obtained.

Keywords: reliability, redundancy, repair time, transition probability, regenerative points, Markov renewal process

Procedia PDF Downloads 539
2686 VaR or TCE: Explaining the Preferences of Regulators

Authors: Silvia Faroni, Olivier Le Courtois, Krzysztof Ostaszewski

Abstract:

While a lot of research concentrates on the merits of VaR and TCE, the two most classic risk indicators used by financial institutions, little has been written explaining why regulators favor the choice of VaR or TCE in their sets of rules. In this paper, we investigate the preferences of regulators with the aim of understanding why, for instance, a VaR with a given confidence level is ultimately retained. Further, this paper provides equivalence rules that explain how a given choice of VaR can be equivalent to a given choice of TCE. Then, we introduce a new risk indicator that extends TCE by providing a more versatile weighting of the constituents of probability distribution tails. All of our results are illustrated using the generalized Pareto distribution.
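
For the generalized Pareto case used in the illustrations, VaR and TCE can be computed as sketched below; the parameters and confidence level are illustrative, and the closed-form tail expectation assumes a shape parameter below one.

```python
# Hedged sketch: VaR and TCE for losses following a generalized Pareto distribution.
import numpy as np
from scipy.stats import genpareto

xi, sigma = 0.25, 1.0        # shape and scale (xi < 1 so the tail mean exists)
alpha = 0.99                 # confidence level

var = genpareto.ppf(alpha, c=xi, scale=sigma)   # Value-at-Risk
tce_closed = (var + sigma) / (1.0 - xi)         # closed-form tail conditional expectation

# Monte Carlo check of the tail conditional expectation
losses = genpareto.rvs(c=xi, scale=sigma, size=1_000_000, random_state=0)
tce_mc = losses[losses > var].mean()

print(f"VaR_{alpha} = {var:.3f}, TCE = {tce_closed:.3f} (MC: {tce_mc:.3f})")
```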

Keywords: generalized Pareto distribution, generalized tail conditional expectation, regulator preferences, risk measure

Procedia PDF Downloads 160
2685 Numerical Simulation on Airflow Structure in the Human Upper Respiratory Tract Model

Authors: Xiuguo Zhao, Xudong Ren, Chen Su, Xinxi Xu, Fu Niu, Lingshuai Meng

Abstract:

Respiratory diseases such as asthma, emphysema, and bronchitis are connected with air pollution, and the number of these diseases tends to increase, which may be attributed to toxic aerosol deposition in the human upper respiratory tract or in the bifurcations of the human lung. The therapy of these diseases mostly uses pharmaceuticals in the form of aerosols delivered into the human upper respiratory tract or the lung. Understanding airflow structures in the human upper respiratory tract plays a very important role in analyzing the "filtering" effect in the pharynx/larynx and in obtaining correct air-particle inlet conditions for the lung. Numerical simulation based on CFD (Computational Fluid Dynamics) technology has particular advantages for studying airflow structure in the human upper respiratory tract. In this paper, a representative model of the human upper respiratory tract is built, and CFD technology is used to investigate the air movement characteristics within it. The airflow movement characteristics, the effect of the airflow movement on the shear stress distribution, and the probability of wall injury caused by the shear stress are discussed. Experimentally validated computational fluid-aerosol dynamics results showed the following: airflow separation appears near the outer wall of the pharynx and the trachea. A high velocity zone is created near the inner wall of the trachea. The airflow splits at the divider, and a new boundary layer is generated at the inner wall downstream from the bifurcation, with high velocity near the inner wall of the trachea. The maximum velocity appears at the exterior of the boundary layer. The secondary swirls and axial velocity distribution result in high shear stress acting on the inner wall of the trachea and the bifurcation, finally leading to inner wall injury. Increased breathing intensity enhances the shear stress acting on the inner wall of the trachea and the bifurcation. If a person maintains high breathing intensity for a long time, not only does the capacity for gas transport and regulation through the trachea and the bifurcation fall, but the probability of wall strain and tissue injury also increases.

Keywords: airflow structure, computational fluid dynamics, human upper respiratory tract, wall shear stress, numerical simulation

Procedia PDF Downloads 230
2684 Security of Database Using Chaotic Systems

Authors: Eman W. Boghdady, A. R. Shehata, M. A. Azem

Abstract:

Database (DB) security demands permitting the actions of authorized users and prohibiting those of unauthorized users and intruders on the DB and the objects inside it. Organizations that run successfully demand the confidentiality of their DBs. They do not allow unauthorized access to their data/information, and they demand assurance that their data is protected against any malicious or accidental modification. DB protection and confidentiality are the security concerns. There are four types of controls for DB protection: access control, information flow control, inference control, and cryptographic control. Cryptographic control is considered the backbone of DB security; it secures the DB by encryption during storage and communication. Current cryptographic techniques fall into two types: traditional classical cryptography using standard algorithms (DES, AES, IDEA, etc.) and chaos cryptography using continuous (Chua, Rossler, Lorenz, etc.) or discrete (logistic, Henon, etc.) systems. The defining characteristic of chaos is its extreme sensitivity to the initial conditions of the system. In this paper, DB-security systems based on chaotic algorithms are described. Pseudo Random Number Generators (PRNGs) based on the different chaotic algorithms are implemented in Matlab, and their statistical properties are evaluated using NIST and other statistical test suites. These algorithms are then used to secure a conventional DB (plaintext), where the statistical properties of the ciphertext are also tested. To increase the complexity of the PRNGs and to pass all the NIST statistical tests, we propose two hybrid PRNGs: one based on two chaotic logistic maps and another based on two chaotic Henon maps, where each chaotic algorithm runs side by side, starting from random, independent initial conditions and parameters (the encryption keys). The resulting hybrid PRNGs passed the NIST statistical test suite.
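
A minimal sketch of the hybrid logistic-map PRNG idea follows (in Python rather than the paper's Matlab): two logistic maps run side by side from independent keys, and their outputs are combined into a keystream; the keying and byte-mixing details are illustrative assumptions.

```python
# Hedged sketch of a hybrid chaotic keystream from two logistic maps.
def logistic_stream(x, r=3.99):
    while True:
        x = r * x * (1.0 - x)   # chaotic logistic map iteration
        yield x

def hybrid_bytes(key1=0.612001, key2=0.337844, n=32):
    """Combine two independently keyed chaotic streams into one byte stream."""
    g1, g2 = logistic_stream(key1), logistic_stream(key2)
    out = []
    for _ in range(n):
        b1 = int(next(g1) * 256) & 0xFF
        b2 = int(next(g2) * 256) & 0xFF
        out.append(b1 ^ b2)      # mix the two chaotic streams
    return bytes(out)

keystream = hybrid_bytes()
plaintext = b"DB record: salary=100"
ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
print(ciphertext.hex())
```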

Keywords: algorithms and data structure, DB security, encryption, chaotic algorithms, Matlab, NIST

Procedia PDF Downloads 259
2683 Flutter Control Analysis of an Aircraft Wing Using Carbon Nanotube-Reinforced Polymer

Authors: Timothee Gidenne, Xia Pinqi

Abstract:

In this paper, an investigation of the use of carbon nanotube (CNT)-reinforced polymer as an actuator for active flutter suppression is conducted. The goal of this analysis is to establish a link between the behavior of the control surface and the actuators, to demonstrate the feasibility of using such a suppression system in the aeronautical field. A preliminary binary flutter model using simplified unsteady aerodynamics is developed to study the behavior of the wing as it reaches the flutter speed and as the control system suppresses the flutter phenomenon. The Timoshenko beam theory for bilayer materials is used to match the response of the control surface with the CNT-reinforced polymer (CNRP) actuators. According to Timoshenko theory, the results show a good and realistic response for this purpose. Even though the results are still preliminary, they show evidence of the potential use of CNRP for control surface actuation in small-scale and lightweight systems.

Keywords: actuators, aeroelastic, aeroservoelasticity, carbon nanotubes, flutter, flutter suppression

Procedia PDF Downloads 114
2682 Improving Search Engine Performance by Removing Indexes to Malicious URLs

Authors: Durga Toshniwal, Lokesh Agrawal

Abstract:

As the web continues to play an increasing role in information exchange and daily activities, computer users have become the target of miscreants who infect hosts with malware or adware for financial gain. Unfortunately, even a single visit to a compromised web site enables the attacker to detect vulnerabilities in the user's applications and force the download of a multitude of malware binaries. We provide an approach to effectively scan the Internet for so-called drive-by downloads. Drive-by downloads result from URLs that attempt to exploit their visitors and cause malware to be installed and run automatically. To scan the web for malicious pages, the first step is to use a crawler to collect URLs that live on the Internet, and then to apply fast prefiltering techniques to reduce the number of pages that need to be examined by precise, but slower, analysis tools (such as honeyclients or antivirus programs). Although this technique is effective, it requires a substantial amount of resources. A main reason is that the crawler encounters many legitimate pages on the web that need to be filtered out. In this paper, to characterize the nature of this rising threat, we present an implementation of a web crawler in Python and an approach to search the web more efficiently for pages that are likely to be malicious, filtering out benign pages and passing the remaining pages to an antivirus program for malware detection. Our approach starts from an initial seed of known malicious web pages. Using these seeds, our system generates search engine queries to identify other malicious pages that are similar to the ones in the initial seed. By doing so, it leverages the crawling infrastructure of search engines to retrieve URLs that are much more likely to be malicious than a random page on the web. The results show that this guided approach identifies malicious web pages more efficiently than random crawling-based approaches.
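
A simplified sketch of the crawl-then-prefilter stage is shown below; the seed URL, the heuristic patterns, and the size limits are illustrative stand-ins for the paper's actual filter.

```python
# Hedged sketch: fast heuristic prefilter that decides which crawled pages
# are worth sending to slow analysis (honeyclients / antivirus).
import re
import urllib.request

SUSPICIOUS = [
    re.compile(rb"eval\s*\(\s*unescape", re.I),            # packed/obfuscated JS
    re.compile(rb"<iframe[^>]+(width|height)=.?0", re.I),  # hidden iframes
    re.compile(rb"document\.write\s*\(\s*unescape", re.I),
]

def prefilter(url):
    """Return True only for pages matching drive-by-download heuristics."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            html = resp.read(200_000)   # cap the download size
    except OSError:
        return False
    return any(p.search(html) for p in SUSPICIOUS)

seeds = ["http://example.com/"]  # placeholder seed of known-malicious pages
for url in seeds:
    if prefilter(url):
        print(url, "-> forward to antivirus / honeyclient")
```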

Keywords: web crawler, malwares, seeds, drive-by-downloads, security

Procedia PDF Downloads 219
2681 Conjugal Relationship and Reproductive Decision-Making among Couples in Southwest Nigeria

Authors: Peter Olasupo Ogunjuyigbe, Sarafa Shittu

Abstract:

This paper emphasizes the relevance of conjugal relationships and spousal communication in enhancing men's involvement in contraceptive use among the Yoruba of South Western Nigeria. An understanding of men's influence and the role they play in reproductive decision making can throw better light on the mechanisms through which the egalitarianism of husband-wife decision making influences contraceptive use. The objective of this study was to investigate how close conjugal relationships can be a good indicator of joint decision making among couples, using data derived from a survey conducted in three states of South Western Nigeria. The study sample consisted of five hundred and twenty-one (521) male respondents aged 15-59 years and five hundred and forty-seven (547) female respondents aged 15-49 years. The study used both quantitative and qualitative approaches to elicit information from the respondents. So that the study would be truly representative of the towns, each of the study locations in the capital cities was divided into four strata: the traditional area, the migrant area, the mixed area (i.e., traditional and migrant), and the elite area. In the rural areas, respondents were selected by simple random sampling. However, the random selection was made in such a way that all the different parts of the locations were represented. The data collected were analysed at the univariate, bivariate, and multivariate levels. Logistic regression models were employed to examine the interrelationships between male reproductive behaviour, conjugal relationship, and contraceptive use. The study indicates that current contraceptive use is high among this major ethnic group in Nigeria because of the improved level of communication among couples. The problem, however, is that men still have a lower exposure rate when it comes to family planning information, education, and counseling. This has serious implications for fertility regulation in Nigeria.

Keywords: behavior, conjugal, communication, counseling, spouse

Procedia PDF Downloads 127
2680 A Framework for Automated Nuclear Waste Classification

Authors: Seonaid Hume, Gordon Dobie, Graeme West

Abstract:

Detecting and localizing radioactive sources is a necessity for the safe and secure decommissioning of nuclear facilities. An important aspect of managing the sort-and-segregation process is establishing the spatial distributions and quantities of the waste radionuclides, their type, corresponding activity, and ultimately their classification for disposal. The data received from surveys directly informs decommissioning plans, on-site incident management strategies, and the approach needed for a new cell, as well as protecting the workforce and the public. Manual classification of nuclear waste from a nuclear cell is time-consuming, expensive, and requires significant expertise to make the classification judgment call. Also, in-cell decommissioning is still in its relative infancy, and few techniques are well developed. As with any repetitive and routine task, there is an opportunity to improve the classification of nuclear waste using autonomous systems. Hence, this paper proposes a new framework for the automatic classification of nuclear waste. This framework consists of five main stages: 3D spatial mapping and object detection, object classification, radiological mapping, source localisation based on gathered evidence, and finally, waste classification. The first stage of the framework, 3D visual mapping, involves object detection from point cloud data. A review of related applications in other industries is provided, and recommendations for waste classification approaches are made. Object detection focuses initially on cylindrical objects, since pipework is significant in nuclear cells and indeed on any industrial site. The approach can be extended to other commonly occurring primitives such as spheres and cubes. This is in preparation for stage two: characterizing the point cloud data and estimating the dimensions, material, degradation, and mass of the detected objects in order to feature-match them against an inventory of items possibly found in that nuclear cell. Many items in nuclear cells are one-offs, have limited or poor drawings available, or have been modified since installation and have complex interiors, which often and inadvertently poses difficulties when accessing certain zones and identifying waste remotely; hence, feature matching may require expert input. The third stage, radiological mapping, similarly characterizes the nuclear cell in terms of radiation fields, including the type of radiation, its activity, and its location within the nuclear cell. The fourth stage of the framework takes the visual map from stage 1, the object characterization from stage 2, and the radiation map from stage 3 and fuses them together, providing a more detailed scene of the nuclear cell by identifying the location of radioactive materials in three dimensions. The last stage combines the evidence from the fused data sets to derive the classification of the waste in Bq/kg, thus enabling better decision making and monitoring for in-cell decommissioning. The presentation of the framework is supported by representative case study data drawn from a decommissioning application at a UK nuclear facility. This framework utilises recent advancements in the detection and mapping of complex radiation fields in three dimensions to make the process of classifying nuclear waste faster, more reliable, more cost-effective, and safer.

Keywords: nuclear decommissioning, radiation detection, object detection, waste classification

Procedia PDF Downloads 190
2679 Integrating Technology in Teaching and Learning Mathematics

Authors: Larry Wang

Abstract:

The aim of this paper is to demonstrate how an online homework system was integrated into the teaching and learning of mathematics and how it improved student success rates in some gateway mathematics courses. WeBWorK, provided by the Mathematical Association of America, was adopted as the online homework system. During the period 2010-2015, the system was implemented in classes of precalculus, calculus, probability and statistics, discrete mathematics, linear algebra, and differential equations. As a result, the passing rates of the sections with WeBWorK were well above those of other sections without WeBWorK (about 7-10% higher). The paper also shows how the WeBWorK system was used.

Keywords: gateway mathematics, online grading, pass rate, WeBWorK

Procedia PDF Downloads 282
2678 Alphabet Recognition Using Pixel Probability Distribution

Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay

Abstract:

Our project topic is "Alphabet Recognition Using Pixel Probability Distribution". The project uses techniques of image processing and machine learning in computer vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten, or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR applications are sometimes used in signature recognition, which is employed in banks and other high-security buildings. One popular mobile application reads a visiting card and stores it directly to the contacts. OCRs are also known to be used in radar systems for reading speeding vehicles' license plates, among many other uses. The project was implemented using Visual Studio and OpenCV (Open Source Computer Vision). Our algorithm is based on neural networks (machine learning). The project was implemented in three modules: (1) Training: This module aims at database generation. The database was generated using two methods: (a) Run-time generation, in which the database is generated at compilation time using the built-in fonts of the OpenCV library; human intervention is not necessary for generating this database. (b) Contour detection, in which a 'jpeg' template containing different fonts of an alphabet is converted to a weighted matrix using specialized functions (contour detection and blob detection) of OpenCV. The main advantage of this type of database generation is that the algorithm becomes self-learning, and the final database requires little memory to store (119 KB precisely). (2) Preprocessing: The input image is pre-processed using image processing techniques such as adaptive thresholding, binarization, and dilation, and is made ready for segmentation. Segmentation includes the extraction of lines, words, and letters from the processed text image. (3) Testing and prediction: The extracted letters are classified and predicted using the neural network algorithm. The algorithm recognizes an alphabet based on certain mathematical parameters calculated using the database and the weight matrix of the segmented image.
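
Module (2) can be sketched in Python with OpenCV roughly as follows; the file name, threshold parameters, and area filter are placeholders rather than the project's exact settings.

```python
# Hedged sketch of the preprocessing and segmentation module with OpenCV.
import cv2

img = cv2.imread("scanned_text.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder input

# Binarize with adaptive thresholding (robust to uneven lighting), then dilate
binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY_INV, 31, 10)
binary = cv2.dilate(binary, cv2.getStructuringElement(cv2.MORPH_RECT, (2, 2)))

# Segment candidate letters as contour bounding boxes
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
letters = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 20]
print(f"{len(letters)} candidate letter regions")  # each is fed to the classifier
```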

Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix

Procedia PDF Downloads 373
2677 Classification of Digital Chest Radiographs Using Image Processing Techniques to Aid in Diagnosis of Pulmonary Tuberculosis

Authors: A. J. S. P. Nileema, S. Kulatunga, S. H. Palihawadana

Abstract:

A computer aided detection (CAD) system was developed for the diagnosis of pulmonary tuberculosis from digital chest X-rays with MATLAB image processing techniques, using a statistical approach. The study comprised 200 digital chest radiographs collected from the National Hospital for Respiratory Diseases, Welisara, Sri Lanka. Pre-processing was done to remove identification details. Lung fields were segmented and then divided into four quadrants (right upper, left upper, right lower, and left lower) using image processing techniques in MATLAB. Contrast, correlation, homogeneity, energy, entropy, and maximum probability texture features were extracted using the gray level co-occurrence matrix (GLCM) method. Descriptive statistics and normal distribution analysis were performed using SPSS. Based on the radiologists' interpretation, chest radiographs were classified manually into PTB-positive (PTBP) and PTB-negative (PTBN) classes. Features with a standard normal distribution were analyzed using an independent-sample t-test between PTBP and PTBN chest radiographs. Among the six features tested, the contrast, correlation, energy, entropy, and maximum probability features showed a statistically significant difference between the two classes at the 95% confidence interval and therefore could be used in the classification of chest radiographs for PTB diagnosis. With the resulting value ranges of the five normally distributed texture features, a classification algorithm was then defined to recognize and classify the quadrant images: if the texture feature values of a quadrant image fall within the defined region, it is identified as a PTBP (abnormal) quadrant and labeled 'Abnormal' in red with its border highlighted in red, whereas if the texture feature values fall outside the defined value range, the quadrant is identified as PTBN (normal) and labeled 'Normal' in blue, with no changes to the image outline. The developed classification algorithm showed a high sensitivity of 92%, which makes it an efficient CAD system, with a modest specificity of 70%.
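
For one lung quadrant, the six GLCM texture features can be extracted as sketched below (in Python with scikit-image rather than the study's MATLAB); the image path and GLCM parameters are illustrative.

```python
# Hedged sketch: GLCM texture features for one lung-quadrant image.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from skimage.io import imread

quadrant = imread("right_upper_quadrant.png", as_gray=True)  # placeholder path
quadrant = (quadrant * 255).astype(np.uint8) // 4            # quantize to 64 levels

glcm = graycomatrix(quadrant, distances=[1], angles=[0], levels=64,
                    symmetric=True, normed=True)

features = {p: graycoprops(glcm, p)[0, 0]
            for p in ("contrast", "correlation", "homogeneity", "energy")}
probs = glcm[:, :, 0, 0]
features["entropy"] = -np.sum(probs * np.log2(probs + 1e-12))
features["max_probability"] = probs.max()
print(features)
```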

Keywords: chest radiographs, computer aided detection, image processing, pulmonary tuberculosis

Procedia PDF Downloads 107
2676 Chemical Fingerprinting of the Ephedrine Pathway to Methamphetamine

Authors: Luke Andrighetto, Paul G. Stevenson, Luke C. Henderson, Jim Pearson, Xavier A. Conlan

Abstract:

As pseudoephedrine, a common ingredient in cold and flu medications, is closely monitored and restricted in Australia, alternative methods of accessing it are of interest. The impurities and by-products of every reaction step of pseudoephedrine/ephedrine and methamphetamine synthesis have been mapped in order to develop a chemical fingerprint based on the synthetic route. Likewise, seized methamphetamine contains a combination of different cutting agents and starting materials. Therefore, in-silico optimised two-dimensional HPLC with DryLab® and OpenMS® software has been used to efficiently separate complex seizure samples. An excellent match between simulated and real separations was observed. Targeted separation of model compounds was completed with significantly reduced method development time. This study produced a two-dimensional separation regime that offers unprecedented separation power (separation space) while maintaining an analysis time faster than those previously reported for gas chromatography, single-dimension high performance liquid chromatography, or capillary electrophoresis.

Keywords: chemical fingerprint, ephedrine, methamphetamine, two-dimensional HPLC

Procedia PDF Downloads 452
2675 Machine Learning Assisted Selective Emitter Design for Solar Thermophotovoltaic Systems

Authors: Ambali Alade Odebowale, Andargachew Mekonnen Berhe, Haroldo T. Hattori, Andrey E. Miroshnichenko

Abstract:

Solar thermophotovoltaic systems (STPV) have emerged as a promising solution to overcome the Shockley-Queisser limit, a significant impediment in the direct conversion of solar radiation into electricity using conventional solar cells. The STPV system comprises essential components such as an optical concentrator, selective emitter, and a thermophotovoltaic (TPV) cell. The pivotal element in achieving high efficiency in an STPV system lies in the design of a spectrally selective emitter or absorber. Traditional methods for designing and optimizing selective emitters are often time-consuming and may not yield highly selective emitters, posing a challenge to the overall system performance. In recent years, the application of machine learning techniques in various scientific disciplines has demonstrated significant advantages. This paper proposes a novel nanostructure composed of four-layered materials (SiC/W/SiO2/W) to function as a selective emitter in the energy conversion process of an STPV system. Unlike conventional approaches widely adopted by researchers, this study employs a machine learning-based approach for the design and optimization of the selective emitter. Specifically, a random forest algorithm (RFA) is employed for the design of the selective emitter, while the optimization process is executed using genetic algorithms. This innovative methodology holds promise in addressing the challenges posed by traditional methods, offering a more efficient and streamlined approach to selective emitter design. The utilization of a machine learning approach brings several advantages to the design and optimization of a selective emitter within the STPV system. Machine learning algorithms, such as the random forest algorithm, have the capability to analyze complex datasets and identify intricate patterns that may not be apparent through traditional methods. This allows for a more comprehensive exploration of the design space, potentially leading to highly efficient emitter configurations. Moreover, the application of genetic algorithms in the optimization process enhances the adaptability and efficiency of the overall system. Genetic algorithms mimic the principles of natural selection, enabling the exploration of a diverse range of emitter configurations and facilitating the identification of optimal solutions. This not only accelerates the design and optimization process but also increases the likelihood of discovering configurations that exhibit superior performance compared to traditional methods. In conclusion, the integration of machine learning techniques in the design and optimization of a selective emitter for solar thermophotovoltaic systems represents a groundbreaking approach. This innovative methodology not only addresses the limitations of traditional methods but also holds the potential to significantly improve the overall performance of STPV systems, paving the way for enhanced solar energy conversion efficiency.
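
One plausible reading of this workflow, sketched below, is a random forest surrogate that predicts emitter performance from the SiC/W/SiO2/W layer thicknesses, with a simple genetic algorithm searching the design space; the objective function here is a stand-in for the authors' electromagnetic model, and all parameters are assumptions.

```python
# Hedged sketch: random forest surrogate + genetic algorithm for emitter design.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def simulate(t):
    """Placeholder figure of merit for a 4-layer stack (thicknesses in nm);
    stands in for a full electromagnetic simulation."""
    return -np.sum((t - 150.0) ** 2)

# Train the random forest surrogate on sampled designs
X = rng.uniform(50, 300, size=(500, 4))
y = np.array([simulate(t) for t in X])
surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Simple genetic algorithm over the surrogate
pop = rng.uniform(50, 300, size=(40, 4))
for _ in range(30):
    fitness = surrogate.predict(pop)
    parents = pop[np.argsort(fitness)[-20:]]           # selection: keep best half
    mates = parents[rng.integers(0, 20, size=(2, 40))]
    pop = mates.mean(axis=0)                           # blend crossover
    pop += rng.normal(0.0, 5.0, pop.shape)             # mutation
    pop = np.clip(pop, 50, 300)

best = pop[np.argmax(surrogate.predict(pop))]
print("Best layer thicknesses (nm):", best.round(1))
```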

Keywords: emitter, genetic algorithm, radiation, random forest, thermophotovoltaic

Procedia PDF Downloads 52
2674 Dislocation Density-Based Modeling of the Grain Refinement in Surface Mechanical Attrition Treatment

Authors: Reza Miresmaeili, Asghar Heydari Astaraee, Fereshteh Dolati

Abstract:

In the present study, an analytical model based on a dislocation density model was developed to simulate grain refinement in surface mechanical attrition treatment (SMAT). The correlation between SMAT time and the development of plastic strain, on the one hand, and dislocation density evolution, on the other, was established to simulate grain refinement in SMAT. A dislocation density-based constitutive material law was implemented in a VUHARD subroutine. A random sequence of shots was generated for the multiple-impact model in Python using a random function. The simulation technique was to model each impact in a separate run and then transfer the results of each run as initial conditions for the next run (impact). The developed finite element (FE) model of multiple impacts describes the coverage evolution in SMAT. Simulations were run to coverage levels as high as 4500%. It is shown that the coverage implemented in the FE model is equal to the experimental coverage, and that the numerical SMAT coverage parameter conforms adequately to the well-known Avrami model. Comparison between numerical results and experimental measurements of residual stresses and the depth of the deformation layers confirms the performance of the established FE model for surface engineering evaluations in SMA treatment. X-ray diffraction (XRD) studies of grain refinement, including the resultant grain size and dislocation density, were conducted to validate the established model; the full width at half maximum of the XRD profiles can be used to measure the grain size. The numerical results and experimental measurements of grain refinement show good agreement and demonstrate the capability of the established FE model to predict the gradient microstructure in SMA treatment.
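
The Avrami-type coverage law that the numerical coverage parameter is checked against can be sketched as follows: n random impacts of indent area a on a surface of area A give an actual coverage of C(n) = 1 - exp(-na/A), while the nominal coverage na/A can exceed 100% (e.g., the 4500% reported above). The numbers below are illustrative.

```python
# Hedged sketch of the Avrami coverage law for random shot impacts.
import numpy as np

A = 10.0 * 10.0            # treated surface area, mm^2 (assumed)
a = np.pi * 0.25 ** 2      # single indent area (0.25 mm radius), mm^2 (assumed)
n = np.arange(0, 20001, 2000)

nominal = 100.0 * n * a / A              # can grow past 100%
actual = 100.0 * (1.0 - np.exp(-n * a / A))  # saturates at 100%
for shots, nom, act in zip(n, nominal, actual):
    print(f"{shots:6d} impacts: nominal {nom:7.1f}%, actual {act:6.2f}%")
```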

Keywords: dislocation density, grain refinement, severe plastic deformation, simulation, surface mechanical attrition treatment

Procedia PDF Downloads 125
2673 Optimum Design of Helical Gear System on Basis of Maximum Power Transmission Capability

Authors: Yasaman Esfandiari

Abstract:

Mechanical engineering has always dealt with the transmission of input power in power trains. One way to achieve this goal is to use gears to change the magnitude and direction of the torque and the speed. However, the gears should be optimally designed to best achieve these objectives. In this study, helical gear systems are optimized for maximum power transmission capability. Material selection, space restrictions, available manufacturing facilities, the probability of tooth breakage, and tooth wear are taken into account, and the governing equations are derived. Finally, Matlab code was written to solve the optimization problem, and the results are verified.

Keywords: design, gears, Matlab, optimization

Procedia PDF Downloads 234
2672 Clinical Feature Analysis and Prediction on Recurrence in Cervical Cancer

Authors: Ravinder Bahl, Jamini Sharma

Abstract:

The paper demonstrates an analysis of cervical cancer based on a probabilistic model. It involves techniques for classification and prediction that recognize the typical and diagnostically most important test features relating to cervical cancer. The main contribution of the research is predicting the probability of recurrence in no-recurrence (first-time detection) cases. A combination of conventional statistical and machine learning tools is applied for the analysis. An experimental study with real data demonstrates the feasibility and potential of the proposed approach.

Keywords: cervical cancer, recurrence, no recurrence, probabilistic, classification, prediction, machine learning

Procedia PDF Downloads 350
2671 Personalized E-Learning System Based on Clustering and Sequence Pattern Mining Approach

Authors: H. S. Saini, K. Vijayalakshmi, Rishi Sayal

Abstract:

Network-based education has been growing rapidly in size and quality. Knowledge clustering is becoming more important in personalized information retrieval for web learning. A personalized learning service can be offered after the learners' knowledge has been classified through clustering. Through automatic analysis of learners' behaviors, groups of learners with similar knowledge levels and interests may be discovered, so as to provide learners with content that best matches their educational needs for collaborative learning. We present a specific mining tool and a recommender engine that we have integrated into the online learning environment in order to help the teacher carry out the whole e-learning process. We propose to use sequential pattern mining algorithms to discover the paths most used by students and, from this information, to recommend links to new students automatically while they browse the course. We have developed a specific authoring tool to help the teacher apply the whole data mining process. We report on several experiments with real data in order to demonstrate the quality of using both clustering and sequential pattern mining algorithms together for discovering personalized e-learning systems.

Keywords: e-learning, cluster, personalization, sequence, pattern

Procedia PDF Downloads 415
2670 Discovering Event Outliers for Drugs as Commercial Products

Authors: Arunas Burinskas, Aurelija Burinskiene

Abstract:

On average, ten percent of drugs (commercial products) are not available in pharmacies due to shortage. A shortage event unbalances sales and requires a recovery period that is often too long. A critical issue, therefore, is that pharmacies do not record potential sales transactions during shortage and recovery periods. The authors suggest estimating outliers during shortage and recovery periods. To shorten the recovery period, the authors suggest using a prediction of average sales per sales day, which helps protect the data from being biased downwards or upwards. The authors use an outlier visualization method across different drugs and apply the Grubbs test for significance evaluation. The researched sample is 100 drugs in a one-month time frame. The authors found that products with high demand variability had outliers. Among the analyzed drugs, which are commercial products: i) drugs with high demand variability have a one-week shortage period, and the probability of facing a shortage is 69.23%; ii) drugs with mid demand variability have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, real data must be set up. Even though some outlier detection methods exist for drug data cleaning, they have not been used to minimize the recovery period once a shortage has occurred. The authors use the Grubbs test, a real-life data cleaning method, for outlier adjustment. In this paper, the outlier adjustment method is applied with a confidence level of 99%. In practice, the Grubbs test has been used to detect outliers in cancer drug data, with positive results reported. The Grubbs test detects outliers that exceed the boundaries of a normal distribution; the result is a probability that indicates the core data of actual sales. The test statistic represents the difference between the sample mean and the most extreme data point, scaled by the standard deviation, and the test detects one outlier at a time, with different probabilities, from a data set with an assumed normal distribution. Based on approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with the Grubbs test method. The suggested framework is applicable during shortage events and recovery periods. The proposed framework has practical value and could be used to minimize the recovery period required after a shortage event occurs.
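
A sketch of the two-sided Grubbs test as applied above to daily sales follows; the critical value uses the standard t-based formula, the default alpha matches the 99% confidence level stated above, and the sales series is illustrative.

```python
# Hedged sketch of Grubbs' test, flagging one most-extreme observation at a time.
import numpy as np
from scipy import stats

def grubbs_outlier(x, alpha=0.01):
    """Return the index of an outlier at significance alpha, or None."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    i = np.argmax(np.abs(x - x.mean()))
    g = abs(x[i] - x.mean()) / x.std(ddof=1)       # Grubbs statistic
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)    # critical t value
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return i if g > g_crit else None

daily_sales = [12, 14, 11, 13, 15, 12, 13, 55, 14, 12]   # recovery-period spike
idx = grubbs_outlier(daily_sales)
print(f"Outlier at day {idx}: {daily_sales[idx]}" if idx is not None else "No outlier")
```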

Keywords: drugs, Grubbs' test, outlier, shortage event

Procedia PDF Downloads 124
2669 Optimizing the Capacity of a Convolutional Neural Network for Image Segmentation and Pattern Recognition

Authors: Yalong Jiang, Zheru Chi

Abstract:

In this paper, we study the factors that determine the capacity of a Convolutional Neural Network (CNN) model and propose ways to evaluate and adjust the capacity of a CNN model to best match a specific pattern recognition task. Firstly, a scheme is proposed to adjust the number of independent functional units within a CNN model to make it better fitted to a task. Secondly, the number of independent functional units in the capsule network is adjusted to fit it to the training dataset. Thirdly, a method based on Bayesian GAN is proposed to enrich the variances in the current dataset to increase its complexity. Experimental results on the PASCAL VOC 2010 Person Part dataset and the MNIST dataset show that, in both conventional CNN models and capsule networks, the number of independent functional units is an important factor in determining the capacity of a network model. By adjusting the number of functional units, the capacity of a model can better match the complexity of a dataset.

Keywords: CNN, convolutional neural network, capsule network, capacity optimization, character recognition, data augmentation, semantic segmentation

Procedia PDF Downloads 139
2668 Analysis of Vapor-Phase Diffusion of Benzene from Contaminated Soil

Authors: Asma A. Parlin, K. Nakamura, N. Watanabe, T. Komai

Abstract:

Understanding the effective diffusion of benzene vapor at the soil-atmosphere interface is important, as the intrusion of benzene from the soil into the atmosphere is largely driven by diffusion. To analyze the vertical, one-dimensional effective diffusion of benzene vapor in porous media with high water content, diffusion experiments were conducted in soil columns using Andosol soil and Toyoura silica sand with different water contents: from 0 to 30 wt.% for soil and from 0.06 to 10 wt.% for sand. In soil, a linear relation was found between water content and the effective diffusion coefficient, while in sand the effective diffusion coefficient did not change with increasing water content. A numerical transport model following an unsteady-state approach based on Fick's second law was used to match the time required to reach a steady-state gas-phase concentration profile of benzene to the gas-phase concentration profile measured experimentally in the column. The results highlighted that both the water content and the porosity might increase the vertical diffusion of benzene vapor in soil.
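
The unsteady-state model can be sketched as an explicit finite-difference solution of Fick's second law, dC/dt = D_eff d²C/dx², for a one-dimensional column; the geometry, boundary conditions, and D_eff below are assumptions for illustration, not the measured values.

```python
# Hedged sketch: explicit finite-difference solution of Fick's second law
# for benzene vapor diffusing through a 1D soil column.
import numpy as np

D_eff = 1.0e-6        # effective diffusion coefficient, m^2/s (assumed)
L, nx = 0.30, 61      # column length (m) and number of grid points (assumed)
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D_eff          # satisfies the explicit stability limit

C = np.zeros(nx)
C[0] = 1.0            # constant benzene source at the bottom boundary

t, t_end = 0.0, 24 * 3600.0       # march one day of diffusion
while t < t_end:
    C[1:-1] += D_eff * dt / dx**2 * (C[2:] - 2 * C[1:-1] + C[:-2])
    C[-1] = C[-2]                 # simplified zero-flux condition at the open end
    t += dt

print("Relative concentration profile:", C[::10].round(3))
```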

Keywords: benzene vapor-phase, effective diffusion, subsurface soil medium, unsteady state

Procedia PDF Downloads 131
2667 Storage Assignment Strategies to Reduce Manual Picking Errors with an Emphasis on an Ageing Workforce

Authors: Heiko Diefenbach, Christoph H. Glock

Abstract:

Order picking, i.e., the order-based retrieval of items in a warehouse, is an important time- and cost-intensive process in many logistic systems. Despite the ongoing trend of automation, most order picking systems are still manual picker-to-parts systems, where human pickers walk through the warehouse to collect ordered items. Human work in warehouses is not free from errors, and order pickers may at times pick the wrong item or the incorrect number of items. Errors can cause additional costs and significant correction efforts. Moreover, age might increase a person's likelihood of making mistakes. Hence, the negative impact of picking errors might grow for the aging workforce currently witnessed in many regions globally. A significant amount of research has focused on making order picking systems more efficient. Among other factors, storage assignment, i.e., the assignment of items to storage locations (e.g., shelves) within the warehouse, has been subject to optimization, usually with the objective of assigning items to storage locations such that order picking times are minimized. Surprisingly, there is a lack of research concerned with picking errors and approaches to prevent them. This paper hypothesizes that the storage assignment of items can affect the probability of picking errors. For example, storing similar-looking items apart from one another might reduce confusion. Moreover, storing items that are hard to count, or that require a lot of counting, at easy-to-access and easy-to-comprehend shelf heights might reduce the probability of picking the wrong number of items. Based on this hypothesis, the paper discusses how to incorporate error-prevention measures into mathematical models for storage assignment optimization. Various approaches, with their respective benefits and shortcomings, are presented and mathematically modeled. To investigate the newly developed models further, they are compared to conventional storage assignment strategies in a computational study. The study specifically investigates how the importance of error prevention increases as pickers become more prone to errors, for example, due to age. The results suggest that considering error-prevention measures in storage assignment can reduce error probabilities with only minor decreases in picking efficiency. The results might be especially relevant for an aging workforce.

Keywords: aging workforce, error prevention, order picking, storage assignment

Procedia PDF Downloads 191
2666 A Generalized Model for Performance Analysis of Airborne Radar in Clutter Scenario

Authors: Vinod Kumar Jaysaval, Prateek Agarwal

Abstract:

Performance prediction of airborne radar is a challenging and cumbersome task in clutter scenarios for different types of targets. A generalized model is required to predict the performance of radar for air targets as well as ground moving targets. In this paper, we propose a generalized model to bring out the performance of airborne radar for different Pulse Repetition Frequencies (PRFs) as well as different types of targets. The model provides a platform to derive subsystem parameters for different applications and performance requirements under different types of clutter terrain.

Keywords: airborne radar, blind zone, clutter, probability of detection

Procedia PDF Downloads 459