Search results for: ML techniques
5866 Ultrasonic Evaluation of Periodic Rough Inaccessible Surfaces from Back Side
Authors: Chanh Nghia Nguyen, Yu Kurokawa, Hirotsugu Inoue
Abstract:
The surface roughness is an important parameter for evaluating the quality of material surfaces, since it affects the functions and performance of industrial components. Although stylus and optical techniques are commonly used for measuring surface roughness, they are applicable only to accessible surfaces. In practice, surface roughness measurement from the back side is sometimes demanded, for example, in the inspection of safety-critical parts such as the inner surface of pipes. However, little attention has been paid to the measurement of back surface roughness so far. Since a back surface is usually inaccessible by stylus or optical techniques, the ultrasonic technique is one of the most effective options. In this research, an ultrasonic pulse-echo technique is considered for evaluating the pitch and the height of a back surface having a periodic triangular profile, as a very first step. The pitch of the surface profile is measured by applying the diffraction grating theory for oblique incidence; the height is then evaluated by numerical analysis based on the Kirchhoff theory for normal incidence. The validity of the proposed method was verified by both numerical simulation and experiment. It was confirmed that the pitch is accurately measured in most cases. The height was also evaluated with good accuracy when it is smaller than half of the pitch, owing to the approximation in the Kirchhoff theory.
Keywords: back side, inaccessible surface, periodic roughness, pulse-echo technique, ultrasonic NDE
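For the pitch-measurement step, the diffraction-grating relation the abstract relies on can be sketched numerically. The frequency, sound speed and angles below are illustrative assumptions, not values from the paper:

```python
import math

def grating_pitch(wavelength, theta_i_deg, theta_m_deg, order=1):
    """Estimate the pitch d of a periodic surface from the diffraction
    grating equation for oblique incidence:
        d * (sin(theta_i) + sin(theta_m)) = m * wavelength
    where theta_i is the incidence angle and theta_m the angle of the
    m-th diffraction lobe."""
    si = math.sin(math.radians(theta_i_deg))
    sm = math.sin(math.radians(theta_m_deg))
    return order * wavelength / (si + sm)

# Hypothetical numbers: a 5 MHz pulse in water (c ~ 1480 m/s)
wavelength = 1480.0 / 5e6            # ~0.296 mm, in metres
d = grating_pitch(wavelength, 30.0, 45.0)
```

Measuring the first-order lobe angle at a known incidence angle is then enough to recover the pitch.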
Procedia PDF Downloads 274
5865 Signal Processing Techniques for Adaptive Beamforming with Robustness
Authors: Ju-Hong Lee, Ching-Wei Liao
Abstract:
Adaptive beamforming using an antenna array of sensors is useful in the process of adaptively detecting and preserving the presence of the desired signal while suppressing the interference and the background noise. Conventional adaptive array beamforming requires prior information about either the impinging direction or the waveform of the desired signal in order to adapt the weights. The adaptive weights of an antenna array beamformer under a steered-beam constraint are calculated by minimizing the output power of the beamformer subject to the constraint that forces the beamformer to maintain a constant response in the steering direction. Hence, the performance of the beamformer is very sensitive to the accuracy of the steering operation. In the literature, it is well known that the performance of an adaptive beamformer deteriorates under any steering angle error encountered in many practical applications, e.g., wireless communication systems with massive antennas deployed at the base station and user equipment. Hence, developing effective signal processing techniques to deal with the problem of steering angle error in array beamforming systems has become an important research topic. In this paper, we present an effective signal processing technique for constructing an adaptive beamformer that is robust against steering angle error. The proposed array beamformer adaptively estimates the actual direction of the desired signal by using the presumed steering vector and the received array data snapshots. Based on the presumed steering vector and a preset angle range for steering mismatch tolerance, we first create a matrix related to the direction vector of the signal sources. Two projection matrices are generated from this matrix. The projection matrix associated with the desired signal information and the received array data are utilized to iteratively estimate the actual direction vector of the desired signal.
The estimated direction vector of the desired signal is then used to appropriately determine the quiescent weight vector. The other projection matrix is set to be the signal blocking matrix required for performing adaptive beamforming. Accordingly, the proposed beamformer consists of adaptive quiescent weights and partially adaptive weights. Several computer simulation examples are provided for evaluating and comparing the proposed technique with existing robust techniques.
Keywords: adaptive beamforming, robustness, signal blocking, steering angle error
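A minimal sketch of the steering-mismatch idea, not the authors' iterative projection algorithm: scan a preset mismatch-tolerance range around the presumed steering angle and keep the direction that maximizes the conventional beamformer output power. Array size, angles and noise level are assumptions:

```python
import cmath, math, random

def steering_vector(theta_deg, n=8, spacing=0.5):
    """Uniform linear array steering vector; spacing in wavelengths."""
    s = math.sin(math.radians(theta_deg))
    return [cmath.exp(-2j * math.pi * spacing * k * s) for k in range(n)]

def estimate_direction(snapshots, presumed_deg, tol_deg=8.0, step=0.5):
    """Scan a preset tolerance range around the presumed steering angle
    and return the angle with the largest conventional beamformer output
    power -- a simple stand-in for the paper's projection-based estimate."""
    best_deg, best_p = presumed_deg, -1.0
    a = presumed_deg - tol_deg
    while a <= presumed_deg + tol_deg:
        v = steering_vector(a, n=len(snapshots[0]))
        p = 0.0
        for x in snapshots:
            y = sum(vi.conjugate() * xi for vi, xi in zip(v, x))
            p += abs(y) ** 2
        if p > best_p:
            best_p, best_deg = p, a
        a += step
    return best_deg

# Simulate snapshots of a source actually at 12 deg while presuming 8 deg
random.seed(0)
true_deg, n = 12.0, 8
snaps = []
for _ in range(50):
    phase = cmath.exp(2j * math.pi * random.random())
    sv = steering_vector(true_deg, n)
    snaps.append([phase * s + 0.05 * complex(random.gauss(0, 1), random.gauss(0, 1))
                  for s in sv])
est = estimate_direction(snaps, presumed_deg=8.0)
```

The estimated angle would then replace the presumed one when forming the quiescent weights.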
Procedia PDF Downloads 122
5864 Energy Efficient Lighting in Educational Buildings through the Example of a High School in Istanbul
Authors: Nihan Gurel Ulusan
Abstract:
It is obvious that electrical energy, which is an inseparable part of modern life and the most important power source of our age, should be generated at a level that suffices the nation's requirements. The electrical energy used in a sustainable architectural design should be reduced as much as possible. Designing buildings as energy-efficient systems that aim at reducing artificial illumination loads has become a current subject of our times, as concepts such as conscious consumption of energy sources, environment-friendly design and sustainability gain importance. Reducing the consumption of electrical energy for artificial lighting carries great significance, especially in volumes that are used all day long, such as educational buildings. Starting out with such an aim, this paper explores educational buildings in terms of energy-efficient lighting. Firstly, illumination techniques, illumination systems, light sources, luminaires, illumination controls and 'efficient energy' usage in lighting are discussed. In addition, the natural and artificial lighting systems used in educational buildings, and the spaces that make up these kinds of buildings, are examined in terms of energy-efficient lighting. Lastly, the illumination properties of the school chosen for this study, Kağıthane Anadolu Lisesi, a typical high school in Istanbul, are observed. Suggestions are made to improve the system by evaluating the illumination properties of the classrooms together with a survey carried out among the users.
Keywords: educational buildings, energy efficient, illumination techniques, lighting
Procedia PDF Downloads 279
5863 Resilient Machine Learning in the Nuclear Industry: Crack Detection as a Case Study
Authors: Anita Khadka, Gregory Epiphaniou, Carsten Maple
Abstract:
There is a dramatic surge in the adoption of machine learning (ML) techniques in many areas, including the nuclear industry (such as fault diagnosis and fuel management in nuclear power plants), autonomous systems (including self-driving vehicles), space systems (space debris recovery, for example), medical surgery, network intrusion detection, malware detection, to name a few. With the application of learning methods in such diverse domains, artificial intelligence (AI) has become a part of everyday modern human life. To date, the predominant focus has been on developing underpinning ML algorithms that can improve accuracy, while factors such as resiliency and robustness of algorithms have been largely overlooked. If an adversarial attack is able to compromise the learning method or data, the consequences can be fatal, especially but not exclusively in safety-critical applications. In this paper, we present an in-depth analysis of five adversarial attacks and three defence methods on a crack detection ML model. Our analysis shows that it can be dangerous to adopt machine learning techniques in security-critical areas such as the nuclear industry without rigorous testing since they may be vulnerable to adversarial attacks. While common defence methods can effectively defend against different attacks, none of the three considered can provide protection against all five adversarial attacks analysed.
Keywords: adversarial machine learning, attacks, defences, nuclear industry, crack detection
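As a toy illustration of the kind of white-box attack analysed, the Fast Gradient Sign Method can be sketched on a logistic-regression "crack detector"; the weights, inputs and epsilon are hypothetical, and the paper's actual model and attacks may differ:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fgsm(x, w, b, y, eps):
    """Fast Gradient Sign Method: perturb each input feature by eps in the
    direction that increases the cross-entropy loss of a logistic model."""
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    grad = [(p - y) * wi for wi in w]      # dLoss/dx for cross-entropy
    return [xi + eps * (1 if g > 0 else -1 if g < 0 else 0)
            for xi, g in zip(x, grad)]

w, b = [2.0, -1.0, 0.5], 0.1
x, y = [0.8, 0.2, 0.4], 1.0              # a "crack" example, label 1
x_adv = fgsm(x, w, b, y, eps=0.3)
p_clean = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
p_adv = sigmoid(sum(wi * xi for wi, xi in zip(w, x_adv)) + b)
```

Even this tiny perturbation pushes the detector's confidence in the correct "crack" label down, which is the failure mode the paper probes at scale.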
Procedia PDF Downloads 157
5862 Teaching and Learning Jazz Improvisation Using Bloom's Taxonomy of Learning Domains
Authors: Graham Wood
Abstract:
The 20th Century saw the introduction of many new approaches to music making, including the structured and academic study of jazz improvisation. The rise of many school and tertiary jazz programs was rapid and quickly spread around the globe in a matter of decades. The curriculum taught in these new programs was often developed in an ad-hoc manner, due to the lack of written literature in this new and rapidly expanding area and to pedagogical principles vastly different from those of the classical music education prevalent in school and tertiary programs. There is widespread information regarding the theory and techniques used by jazz improvisers, but methods to practice these concepts in order to achieve the best outcomes for students and teachers are much harder to find. This research project explores the author's experiences as a studio jazz piano teacher, ensemble teacher and classroom improvisation lecturer over fifteen years and suggests an alignment with Bloom's taxonomy of learning domains. This alignment categorizes the different tasks that need to be taught and practiced, in order for the teacher and the student to devise a well-balanced and effective practice routine, and for the teacher to develop an effective teaching program. These techniques have been very useful in ensuring that a good balance of cognitive, psychomotor and affective skills is taught to students in a range of learning contexts.
Keywords: bloom, education, jazz, learning, music, teaching
Procedia PDF Downloads 255
5861 Short Answer Grading Using Multi-Context Features
Authors: S. Sharan Sundar, Nithish B. Moudhgalya, Nidhi Bhandari, Vineeth Vijayaraghavan
Abstract:
Automatic Short Answer Grading is one of the prime applications of artificial intelligence in education. Several approaches involving the utilization of selective handcrafted features, graphical matching techniques, concept identification and mapping, complex deep frameworks, sentence embeddings, etc. have been explored over the years. However, keeping in mind the real-world application of the task, these solutions present a slight overhead in terms of computations and resources in achieving high performance. In this work, a simple and effective solution is proposed, making use of elemental features based on statistical and linguistic properties and word-based similarity measures, in conjunction with tree-based classifiers and regressors. The results for classification tasks show improvements ranging from 1%-30%, while the regression task shows a stark improvement of 35%. The authors attribute these improvements to the addition of multiple similarity scores, which provide an ensemble of scoring criteria to the models. The authors also believe the work reaffirms that classical natural language processing techniques and simple machine learning models can be used to achieve high results for short answer grading.
Keywords: artificial intelligence, intelligent systems, natural language processing, text mining
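The elemental, word-based features the abstract describes can be sketched as follows; the specific feature set and example answers are illustrative assumptions, not the paper's exact features:

```python
def answer_features(student, reference):
    """Elemental features of the kind combined in the paper: a statistical
    feature (length ratio) plus word-based similarity measures (Jaccard
    and containment). A tree-based classifier or regressor would then be
    trained on vectors like this."""
    s, r = student.lower().split(), reference.lower().split()
    ss, rs = set(s), set(r)
    jaccard = len(ss & rs) / len(ss | rs) if ss | rs else 0.0
    containment = len(ss & rs) / len(rs) if rs else 0.0
    length_ratio = len(s) / len(r) if r else 0.0
    return {"jaccard": jaccard, "containment": containment,
            "length_ratio": length_ratio}

f = answer_features("the heart pumps blood through the body",
                    "the heart pumps blood around the body")
```

Stacking several such similarity scores is what gives the models an ensemble of scoring criteria.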
Procedia PDF Downloads 131
5860 Histochemical Localization of Hepatitis B Surface Antigen in Hepatocellular Carcinoma: An Evaluation of Two Staining Techniques in a Tertiary Hospital in Calabar, Nigeria
Authors: Imeobong Joseph Inyang, Aniekan-Augusta Okon Eyo, Abel William Essien
Abstract:
Hepatitis B virus (HBV) is one of the known human carcinogens. The presence of HBsAg in liver tissues indicates active viral replication. More than 85% of hepatocellular carcinoma (HCC) cases occur in countries with increased rates of chronic HBV infection. An evaluation study was carried out to determine the relationship between positivity for HBsAg and the development of HCC, and its distribution across the age and gender of subjects. Shikata Orcein and Haematoxylin and Eosin (H&E) staining techniques were performed on liver sections. A total of 50 liver tissue specimens, comprising 38 biopsy and 12 post-mortem specimens, were processed. Thirty-five of the 50 specimens were positive for HBsAg with the Orcein stain, whereas only 16 were positive with the H&E stain (and these were also positive with the Orcein stain), giving an HBsAg prevalence of 70.0% (35/50). The prevalence of HCC in the study was 56.0% (28/50), of which 21 (75.0%) cases were positive for HBsAg; 18 (64.3%) were males and 10 (35.7%) were females, distributed within the age range of 20-70 years. The highest number of HBsAg-positive HCC cases, 7/21 (33.3%), occurred in the age group 40-49 years. There was no relationship in the pattern of distribution of HCC between age and gender using the Pearson correlation coefficient (r = 0.0474; P < 0.05). HBV infection predisposes to HCC. The Orcein technique was more specific and is therefore recommended for screening of liver tissues where facilities for immunohistochemistry are unavailable.
Keywords: hepatitis B surface antigen, hepatocellular carcinoma, orcein, pathology
Procedia PDF Downloads 312
5859 ADA Tool for Satellite InSAR-Based Ground Displacement Analysis: The Granada Region
Authors: M. Cuevas-González, O. Monserrat, A. Barra, C. Reyes-Carmona, R.M. Mateos, J. P. Galve, R. Sarro, M. Cantalejo, E. Peña, M. Martínez-Corbella, J. A. Luque, J. M. Azañón, A. Millares, M. Béjar, J. A. Navarro, L. Solari
Abstract:
Geohazard-prone areas require continuous monitoring to detect risks, understand the phenomena occurring in those regions and prevent disasters. Satellite interferometry (InSAR) has become a trustworthy technique for ground movement detection and monitoring in the last few years. InSAR-based techniques allow large areas to be processed, providing a high number of displacement measurements at low cost. However, the results provided by such techniques are usually not easy for non-experienced users to interpret, hampering their use by decision makers. This work presents a set of tools developed in the framework of different projects (Momit, Safety, U-Geohaz, Riskcoast), and an example of their use in the Granada coastal area (Spain) is shown. The ADA (Active Displacement Areas) tool has been developed with the aim of easing the management, use and interpretation of InSAR-based results. It provides a semi-automatic extraction of the most significant ADAs through the ADAFinder application. This tool aims to support the exploitation of the European Ground Motion Service (EU-GMS), which will provide consistent, regular and reliable information regarding natural and anthropogenic ground motion phenomena all over Europe.
Keywords: ground displacements, InSAR, natural hazards, satellite imagery
Procedia PDF Downloads 217
5858 Cognitive Approach at the Epicenter of Creative Accounting in Cameroonian Companies: The Relevance of the Psycho-Sociological Approach and the Theory of Cognitive Dissonance
Authors: Romuald Temomo Wamba, Robert Wanda
Abstract:
The issue of creative accounting in its psychological and sociological framework has been a contested subject for over 60 years. The objective of this article is, on the one hand, to establish the existence of creative accounting in Cameroonian entities and, on the other, to understand the strategies used by audit agents to detect errors, omissions, irregularities or inadequacies in the financial statements, and the optimization techniques used by account preparers to strategically bypass the texts. To achieve this, we conducted an exploratory study using a cognitive approach, and the data analysis was performed with the software 'Decision Explorer'. The results obtained challenge the authors' cognition (manifest, latent and deceptive behavior). The tax inspectors stress that entities in Cameroon do not derogate from the rules of piloting the financial statements. Likewise, they report changes in current income and net income through depreciation, provisions, inventories and the spreading of charges over long periods. This suggests the suspicion or intention of manipulating the financial statements. As for the techniques, the account preparers manage the accruals at the end of the year as the basis of the practice of creative accounting. Likewise, management accounts are more favorable to results management.
Keywords: creative accounting, sociocognitive approach, psychological and sociological approach, cognitive dissonance theory, cognitive mapping
Procedia PDF Downloads 190
5857 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity
Authors: Osayande Pascal Omondiagbe, Sherlock a Licorish
Abstract:
Stack Overflow is a popular community question-and-answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official programming language documentation. While tools have been developed to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through the many possible answers to their questions, and this extends the development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers by using various modeling techniques. However, less interest has been dedicated to examining the performance and quality of typically used modeling methods, especially in relation to the complexity of the models and features. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, drawn from 2014, 2015 and 2016. Our findings reveal significant differences in model performance and quality given the type of features and the complexity of the models used. Researchers examining classifier performance and quality and feature complexity may leverage these findings in selecting suitable techniques when developing prediction models.
Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow
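A depth-1 decision tree (stump) sits at the simple end of the complexity spectrum such a study examines. A minimal sketch, in which the answer features (score, answer length) and labels are entirely hypothetical:

```python
def gini(labels):
    """Gini impurity of a set of binary labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_stump(X, y):
    """Fit a depth-1 decision tree over scalar answer features by
    exhaustively choosing the (feature, threshold) split that minimizes
    weighted Gini impurity."""
    best = (None, None, float("inf"))
    for j in range(len(X[0])):
        for t in sorted({row[j] for row in X}):
            left = [yi for row, yi in zip(X, y) if row[j] <= t]
            right = [yi for row, yi in zip(X, y) if row[j] > t]
            w = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if w < best[2]:
                best = (j, t, w)
    return best[0], best[1]

# Toy data: rows are [answer_score, answer_length]; label 1 = accepted
X = [[5, 120], [0, 30], [7, 200], [1, 45], [6, 90], [0, 20]]
y = [1, 0, 1, 0, 1, 0]
feat, thresh = best_stump(X, y)
```

Comparing such a stump against a random forest or a neural network on the same features is the kind of complexity trade-off the paper evaluates.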
Procedia PDF Downloads 131
5856 Sustainability and Clustering: A Bibliometric Assessment
Authors: Fernanda M. Assef, Maria Teresinha A. Steiner, David Gabriel F. Barros
Abstract:
Review studies are useful for the analysis of research problems. Among the types of review documents, we commonly find bibliometric studies. This type of study often provides a global visualization of a research problem and helps academics worldwide better understand the context of a research area. In this document, a bibliometric view of clustering techniques applied to sustainability problems is presented. The authors investigated which issues most often employ clustering techniques and which sustainability issue currently has the greatest research impact. During the bibliometric analysis, we found ten different groups of research in clustering applications for sustainability issues: Energy; Environmental; Non-urban Planning; Sustainable Development; Sustainable Supply Chain; Transport; Urban Planning; Water; Waste Disposal; and Others. By analyzing the citations of each group, we discovered that the Environmental group can be classified as the most impactful research cluster in the area. After a content analysis of each paper classified in the Environmental group, we found that the k-means technique is preferred for solving sustainability problems with clustering methods, since it appeared most often among the documents. The authors conclude that a bibliometric assessment can help indicate a research gap on waste disposal, which was the group with the least amount of publications, and the most impactful research on environmental problems.
Keywords: bibliometric assessment, clustering, sustainability, territorial partitioning
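A plain k-means sketch of the kind the review found dominant; the two-feature "regional sustainability" data points are invented purely for illustration:

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain Lloyd-style k-means on tuples of floats."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean)
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # Recompute centers as cluster means; keep old center if empty
        new = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centers[j]
               for j, cl in enumerate(clusters)]
        if new == centers:
            break
        centers = new
    return centers, clusters

# Toy example: regions described by (energy use, waste output)
pts = [(1.0, 1.1), (0.9, 1.0), (1.1, 0.9),
       (5.0, 5.2), (4.9, 5.0), (5.1, 4.8)]
centers, clusters = kmeans(pts, 2)
```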
Procedia PDF Downloads 107
5855 Backward-Facing Step Measurements at Different Reynolds Numbers Using Acoustic Doppler Velocimetry
Authors: Maria Amelia V. C. Araujo, Billy J. Araujo, Brian Greenwood
Abstract:
The flow over a backward-facing step is characterized by the presence of flow separation, recirculation and reattachment, for a simple geometry. This type of fluid behaviour takes place in many practical engineering applications, hence the reason for being investigated. Historically, fluid flows over a backward-facing step have been examined in many experiments using a variety of measuring techniques such as laser Doppler velocimetry (LDV), hot-wire anemometry, particle image velocimetry or hot-film sensors. However, some of these techniques cannot conveniently be used in separated flows or are too complicated and expensive. In this work, the applicability of the acoustic Doppler velocimetry (ADV) technique is investigated for this type of flow, at various Reynolds numbers corresponding to different flow regimes. The use of this measuring technique in separated flows is rarely reported in the literature. Besides, most of the situations where the Reynolds number effect is evaluated in separated flows are in numerical modelling. The ADV technique has the advantage of providing nearly non-invasive measurements, which is important in resolving turbulence. The ADV Nortek Vectrino+ was used to characterize the flow, in a recirculating laboratory flume, at various Reynolds numbers (Reh = 3738, 5452, 7908 and 17388) based on the step height (h), in order to capture different flow regimes, and the results were compared with those obtained using other measuring techniques. To compare results with other researchers, the step height, expansion ratio and the positions upstream and downstream of the step were reproduced. The post-processing of the ADV records was performed using a customized numerical code, which implements several filtering techniques. Subsequently, the Vectrino noise level was evaluated by computing the power spectral density for the stream-wise horizontal velocity component.
The normalized mean stream-wise velocity profiles, skin-friction coefficients and reattachment lengths were obtained for each Reh. Turbulent kinetic energy, Reynolds shear stresses and normal Reynolds stresses were determined for Reh = 7908. An uncertainty analysis was carried out for the measured variables using the moving block bootstrap technique. Low noise levels were obtained after implementing the post-processing techniques, showing their effectiveness. Besides, the errors obtained in the uncertainty analysis were, in general, relatively low. For Reh = 7908, the normalized mean stream-wise velocity and turbulence profiles were compared directly with those acquired by other researchers using the LDV technique, and a good agreement was found. The ADV technique proved able to characterize the flow properly over a backward-facing step, although additional caution should be taken for measurements very close to the bottom. The ADV measurements showed reliable results regarding: a) the stream-wise velocity profiles; b) the turbulent shear stress; c) the reattachment length; d) the identification of the transition from transitional to turbulent flows. Despite being a relatively inexpensive technique, acoustic Doppler velocimetry can be used with confidence in separated flows and is thus very useful for numerical model validation. However, it is very important to perform adequate post-processing of the acquired data to obtain low noise levels, thus decreasing the uncertainty.
Keywords: ADV, experimental data, multiple Reynolds number, post-processing
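The noise-level check via the power spectral density can be sketched with a direct periodogram, a simplified stand-in for the authors' customized post-processing code; the sampling rate and the synthetic velocity record are assumptions:

```python
import cmath, math, random

def periodogram(x, fs):
    """One-sided power spectral density via a direct DFT (DC and Nyquist
    bins omitted). A flat high-frequency tail of the PSD indicates the
    white noise floor of the instrument."""
    n = len(x)
    mean = sum(x) / n
    xd = [v - mean for v in x]
    freqs, psd = [], []
    for k in range(1, n // 2):
        X = sum(xd[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        freqs.append(k * fs / n)
        psd.append(2.0 * abs(X) ** 2 / (fs * n))
    return freqs, psd

random.seed(1)
fs, n = 200.0, 128
# Synthetic record: 5 Hz coherent motion plus white instrument noise
x = [0.5 * math.sin(2 * math.pi * 5.0 * t / fs) + 0.05 * random.gauss(0, 1)
     for t in range(n)]
freqs, psd = periodogram(x, fs)
peak_f = freqs[psd.index(max(psd))]
```

In practice a Welch-type averaged estimate over many segments would be used to smooth the spectrum before reading off the noise floor.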
Procedia PDF Downloads 146
5854 Design of Cylindrical Crawler Robot Inspired by Amoeba Locomotion
Authors: Jun-ya Nagase
Abstract:
Recently, the need for colonoscopy has been increasing because of the rise in colonic disorders, including cancer of the colon. However, current colonoscopy depends strongly on the doctor's skill. Therefore, a highly safe large intestine endoscope that does not depend on the techniques of a particular doctor is required. In this research, we aim to develop a novel large intestine endoscope that can realize safe insertion without specific techniques. Wheel-movement robots, snake-like robots and earthworm-like robots are all described in the relevant literature as endoscope robots that are currently studied. Among them, the tracked crawler robot can travel by traversing uneven ground flexibly with a crawler belt attached firmly to the ground surface. Although conventional crawler robots have high efficiency and/or high ground-covering ability, they require a comparatively large space to move. In this study, a small cylindrical crawler robot inspired by amoeba locomotion, which does not need a large space to move and which has high ground-covering ability, is proposed. In addition, we developed a prototype of the large intestine endoscope using the proposed crawler mechanism. Experiments have demonstrated smooth operation and forward movement of the robot upon application of voltage to the motor. This paper reports the structure, drive mechanism, prototype and experimental evaluation.
Keywords: tracked crawler, endoscopic robot, narrow path, amoeba locomotion
Procedia PDF Downloads 382
5853 Incorporation of Copper for Performance Enhancement in Metal-Oxides Resistive Switching Device and Its Potential Electronic Application
Authors: B. Pavan Kumar Reddy, P. Michael Preetam Raj, Souri Banerjee, Souvik Kundu
Abstract:
In this work, the fabrication and characterization of copper-doped zinc oxide (Cu:ZnO) based memristor devices with aluminum (Al) and indium tin oxide (ITO) metal electrodes are reported. Thin films of Cu:ZnO were synthesized using a low-cost, low-temperature chemical process. The Cu:ZnO was then deposited onto ITO bottom electrodes using the spin-coating technique, whereas the Al top electrode was deposited using a physical vapor evaporation technique. An ellipsometer was employed to measure the Cu:ZnO thickness, which was found to be 50 nm. Several surface and materials characterization techniques were used to study the thin-film properties of Cu:ZnO. To ascertain the efficacy of Cu:ZnO for memristor applications, electrical characterizations such as current-voltage (I-V), data retention and endurance were obtained, all critical parameters for next-generation memory. The I-V characteristic exhibits switching behavior with asymmetrical hysteresis loops. This work attributes the resistive switching to the positional drift of oxygen vacancies at the Al/Cu:ZnO junction. Further, non-linear curve-fitting regression techniques were utilized to determine the equivalent circuit of the fabricated Cu:ZnO memristors. Efforts were also devoted to establishing the device's potential for different electronic applications.
Keywords: copper doped, metal-oxides, oxygen vacancies, resistive switching
Procedia PDF Downloads 161
5852 Evaluation and Assessment of Bioinformatics Methods and Their Applications
Authors: Fatemeh Nokhodchi Bonab
Abstract:
Bioinformatics, in its broad sense, involves the application of computer processes to solve biological problems. A wide range of computational tools is needed to effectively and efficiently process the large amounts of data being generated as a result of recent technological innovations in biology and medicine. A number of computational tools have been developed or adapted to deal with the experimental riches of complex, multivariate data and the transition from data collection to information or knowledge. These bioinformatics tools are being evaluated and applied in various medical areas, including early detection, risk assessment, classification and prognosis of cancer. The goal of these efforts is to develop and identify bioinformatics methods with optimal sensitivity, specificity and predictive capability. The recent flood of data from genome sequences and functional genomics has given rise to a new field, bioinformatics, which combines elements of biology and computer science. Bioinformatics conceptualizes biology in terms of macromolecules (in the sense of physical chemistry) and then applies 'informatics' techniques (derived from disciplines such as applied mathematics, computer science and statistics) to understand and organize the information associated with these molecules on a large scale. Here we propose a definition for this field and review some of the research being pursued, particularly in relation to transcriptional regulatory systems.
Keywords: methods, applications, transcriptional regulatory systems, techniques
Procedia PDF Downloads 125
5851 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference
Authors: Hussein Alahmer, Amr Ahmed
Abstract:
Liver cancer is one of the common diseases that cause death. Early detection is important for diagnosis and for reducing mortality. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using the novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue. The difference between the features of the two areas is then used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture between patients, demographics, and imaging devices and settings.
Keywords: CAD system, difference of feature, fuzzy c means, lesion detection, liver segmentation
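The contrasting feature-difference idea can be sketched with simple intensity statistics; the feature set and the intensity values are illustrative assumptions, not the paper's actual descriptors:

```python
import statistics

def region_features(pixels):
    """Simple intensity features of a region of interest."""
    return {"mean": statistics.mean(pixels),
            "std": statistics.pstdev(pixels),
            "rng": max(pixels) - min(pixels)}

def contrast_descriptor(lesion, surround):
    """Contrasting feature-difference: subtract the features of the
    surrounding normal tissue from those of the lesion. The descriptor is
    relative, so a global intensity shift between scanners or patients
    affects both regions equally and largely cancels out."""
    fl, fs = region_features(lesion), region_features(surround)
    return {k: fl[k] - fs[k] for k in fl}

# Illustrative intensities (arbitrary units) for the two regions
lesion = [40, 42, 38, 45, 41]
surround = [90, 92, 88, 91, 89]
d = contrast_descriptor(lesion, surround)
```

A classifier trained on `d` rather than on the raw lesion features is what gives the approach its robustness to scanner and patient variation.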
Procedia PDF Downloads 323
5850 Detecting Earnings Management via Statistical and Neural Networks Techniques
Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie
Abstract:
Predicting earnings management is vital for capital market participants, financial analysts and managers. This research attempts to answer the following question: Is there a significant difference between a regression model and neural network models in predicting earnings management, and which one leads to a superior prediction? In approaching this question, a Linear Regression (LR) model was compared with two neural networks: a Multi-Layer Perceptron (MLP) and a Generalized Regression Neural Network (GRNN). The population of this study includes 94 companies listed on the Tehran Stock Exchange (TSE) from 2003 to 2011. After the results of all models were acquired, ANOVA was applied to test the hypotheses. In general, the statistical results showed that the precision of the GRNN did not exhibit a significant difference in comparison with the MLP. In addition, the mean square errors of the MLP and GRNN showed a significant difference from the multi-variable LR model. These findings support the notion of nonlinear behavior in earnings management. Therefore, it is more appropriate for capital market participants to analyze earnings management using neural network techniques rather than linear regression models.
Keywords: earnings management, generalized linear regression, neural networks multi-layer perceptron, Tehran stock exchange
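The nonlinear-vs-linear comparison can be illustrated in one dimension: a GRNN prediction is essentially a Nadaraya-Watson kernel average, compared here against ordinary least squares on invented data with a bend in it (the variables and values are hypothetical, not from the paper's dataset):

```python
import math

def grnn_predict(x_train, y_train, x, sigma=0.5):
    """GRNN prediction: a Gaussian-kernel weighted average of the training
    targets (one-feature illustrative version)."""
    w = [math.exp(-((x - xi) ** 2) / (2 * sigma ** 2)) for xi in x_train]
    return sum(wi * yi for wi, yi in zip(w, y_train)) / sum(w)

def linreg_predict(x_train, y_train, x):
    """Ordinary least squares with one predictor, the linear baseline."""
    n = len(x_train)
    mx, my = sum(x_train) / n, sum(y_train) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x_train, y_train)) \
        / sum((xi - mx) ** 2 for xi in x_train)
    return my + b * (x - mx)

# Toy nonlinear relation: discretionary accruals vs. a performance proxy
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [0.0, 0.24, 0.84, 1.0, 0.91]   # bends over; hypothetical values
g = grnn_predict(xs, ys, 1.0)
l = linreg_predict(xs, ys, 1.0)
```

On this bent data, the kernel average tracks the local value at x = 1.0 more closely than the straight-line fit, which is the qualitative point the paper's MSE comparison makes.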
Procedia PDF Downloads 420
5849 Performance Enhancement of Autopart Manufacturing Industry Using Lean Manufacturing Strategies: A Case Study
Authors: Raman Kumar, Jasgurpreet Singh Chohan, Chander Shekhar Verma
Abstract:
Today, manufacturing industries respond rapidly to new demands and compete in a continuously changing environment, thus seeking out new methods that allow them to remain competitive and flexible simultaneously. The aim of manufacturing organizations is to reduce manufacturing costs and wastes through system simplification, organizational potential and proper infrastructural planning, using modern techniques like lean manufacturing. In India, a large number of medium- and large-scale manufacturing industries have successfully implemented lean manufacturing techniques. Keeping the above-mentioned facts in view, a number of tools are involved in the successful implementation of the lean approach. The present work is focused on an auto part manufacturing industry, with the goal of improving the performance of the recliner assembly line. A number of lean manufacturing tools are available, but experience and complete knowledge of the manufacturing processes are required to select an appropriate tool for a specific process. Fishbone diagrams (scrap, inventory and waiting) have been drawn to identify the root causes of the different wastes. The effect of cycle time reduction on scrap and inventory is analyzed thoroughly in the case company. Results have shown a 7 percent decrease in inventory cost after the successful implementation of the lean tool.
Keywords: lean tool, fish-bone diagram, cycle time reduction, case study
Procedia PDF Downloads 1265848 Lip Localization Technique for Myanmar Consonants Recognition Based on Lip Movements
Authors: Thein Thein, Kalyar Myo San
Abstract:
Lip reading systems are among the supportive technologies for hearing-impaired people, the elderly, and non-native speakers. For people with normal hearing in noisy environments, or in conditions where the audio signal is not available, lip reading techniques can be used to increase their understanding of spoken language. Hearing-impaired persons have long used lip reading as an important tool for finding out what other people said without hearing their voices. Visual speech information is therefore important and has become an active research area. Using visual information from lip movements can improve the accuracy and robustness of a speech recognition system, and the need for lip reading systems is ever increasing for every language. However, the recognition of lip movement is a difficult task because the region of interest (ROI) is nonlinear and noisy. Therefore, this paper proposes a method to detect the accurate lip shape and to localize lip movement for automatic lip tracking by using a combination of the Otsu global thresholding technique and the Moore Neighborhood Tracing Algorithm. The proposed method shows accurate lip localization and tracking, which is useful for speech recognition. In this work, experiments are carried out on automatic localization of the lip shape for Myanmar consonants using only visual information from lip movements, which is useful for visual speech recognition of the Myanmar language.Keywords: lip reading, lip localization, lip tracking, Moore neighborhood tracing algorithm
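The Otsu step named above can be sketched in a few lines: the threshold is the gray level that maximizes the between-class variance of the foreground/background split. The synthetic "lip vs. skin" image below is an assumption for demonstration; the paper applies this to real mouth-region frames:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's global threshold: pick the gray level that maximizes
    the between-class variance of the two-class split."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    cum_w = np.cumsum(prob)                        # class-0 weight up to t
    cum_mu = np.cumsum(prob * np.arange(256))      # class-0 weight * mean
    mu_total = cum_mu[-1]
    best_t, best_var = 0, -1.0
    for t in range(255):
        w0, w1 = cum_w[t], 1.0 - cum_w[t]
        if w0 == 0 or w1 == 0:
            continue
        mu0 = cum_mu[t] / w0
        mu1 = (mu_total - cum_mu[t]) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic bimodal image: a darker lip-like blob on brighter skin (assumed)
img = np.full((64, 64), 200, dtype=np.uint8)
img[20:40, 16:48] = 60
t = otsu_threshold(img)
mask = img <= t                                    # candidate lip pixels
print(t, int(mask.sum()))
```

The returned threshold separates the two intensity modes; Moore Neighborhood tracing would then follow the boundary of `mask` to extract the lip contour.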
Procedia PDF Downloads 3515847 Reinforcement Learning Optimization: Unraveling Trends and Advancements in Metaheuristic Algorithms
Authors: Rahul Paul, Kedar Nath Das
Abstract:
The field of machine learning (ML) is developing rapidly, producing a multitude of theoretical advancements and extensive practical implementations across various disciplines. The objective of ML is to enable machines to perform cognitive tasks by leveraging knowledge gained from prior experience and to address complex problems effectively, even in situations that deviate from previously encountered instances. Reinforcement Learning (RL) has emerged as a prominent subfield within ML and has recently attracted considerable attention from researchers. This surge in interest can be attributed to the practical applications of RL, the increasing availability of data, and the rapid advancement of computing power. At the same time, optimization algorithms play a pivotal role in ML and have likewise attracted considerable interest: a multitude of proposals have been put forth to address optimization problems or to improve optimization techniques within the domain of ML. A thorough examination and implementation of optimization algorithms in the context of ML is therefore essential to guide the advancement of research in both optimization and ML. This article provides a comprehensive overview of the application of metaheuristic evolutionary optimization algorithms in conjunction with RL to address a diverse range of scientific challenges. Furthermore, it delves into the various challenges and unresolved issues pertaining to the optimization of RL models.Keywords: machine learning, reinforcement learning, loss function, evolutionary optimization techniques
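To make the combination of evolutionary optimization and RL concrete, here is a minimal sketch, not taken from the article, of a (1+1) evolution strategy directly optimizing a softmax policy on a toy three-armed bandit; the arm values, mutation scale, and iteration count are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
arm_means = np.array([0.2, 0.5, 0.9])   # toy 3-armed bandit (assumed values)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def fitness(theta):
    """Expected reward of the stochastic policy softmax(theta)."""
    return float(np.dot(softmax(theta), arm_means))

# (1+1) evolution strategy: mutate, keep the child only if it is no worse
theta = np.zeros(3)
for _ in range(300):
    child = theta + rng.normal(0.0, 0.1, size=3)
    if fitness(child) >= fitness(theta):
        theta = child

print(round(fitness(theta), 3))
```

No gradients of the policy are needed, which is exactly why such metaheuristics pair well with RL objectives that are noisy or non-differentiable.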
Procedia PDF Downloads 725846 Crowdsourced Economic Valuation of the Recreational Benefits of Constructed Wetlands
Authors: Andrea Ghermandi
Abstract:
Constructed wetlands have long been recognized as sources of ancillary benefits such as support for recreational activities. To date, there is a lack of quantitative understanding of the extent and welfare impact of such benefits. Here, it is shown how geotagged, passively crowdsourced data from online social networks (e.g., Flickr and Panoramio) and Geographic Information Systems (GIS) techniques can: (1) be used to infer annual recreational visits to 273 engineered wetlands worldwide; and (2) be integrated with non-market economic valuation techniques (e.g., travel cost method) to infer the monetary value of recreation in these systems. Counts of social media photo-user-days are highly correlated with the number of observed visits in 62 engineered wetlands worldwide (Pearson’s r = 0.811; p-value < 0.001). The estimated, mean willingness to pay for access to 115 wetlands ranges between $5.3 and $374. In 50% of the investigated wetlands providing polishing treatment to advanced municipal wastewater, the present value of such benefits exceeds that of the capital, operation and maintenance costs (lifetime = 45 years; discount rate = 6%), indicating that such systems are sources of net societal benefits even before factoring in benefits derived from water quality improvement and storage. Based on the above results, it is argued that recreational benefits should be taken into account in the design and management of constructed wetlands, as well as when such green infrastructure systems are compared with conventional wastewater treatment solutions.Keywords: constructed wetlands, cultural ecosystem services, ecological engineering, social media
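The photo-user-day (PUD) metric underlying the correlation reported above counts, for each site, the number of unique user-day pairs with at least one geotagged photo. A minimal sketch with entirely hypothetical photo records and visit counts:

```python
import numpy as np
from collections import defaultdict

# Hypothetical geotagged photo records: (site_id, user_id, date)
photos = [
    ("wetland_A", "u1", "2020-05-01"), ("wetland_A", "u1", "2020-05-01"),
    ("wetland_A", "u2", "2020-05-01"), ("wetland_A", "u1", "2020-05-02"),
    ("wetland_B", "u3", "2020-05-01"), ("wetland_B", "u3", "2020-05-03"),
    ("wetland_C", "u1", "2020-06-10"),
]

pud = defaultdict(set)
for site, user, day in photos:
    pud[site].add((user, day))            # duplicates collapse to one PUD
pud_counts = {s: len(v) for s, v in pud.items()}

# Hypothetical observed annual visits at the same sites
observed = {"wetland_A": 30000, "wetland_B": 18000, "wetland_C": 9000}
x = np.array([pud_counts[s] for s in sorted(observed)])
y = np.array([observed[s] for s in sorted(observed)])
r = float(np.corrcoef(x, y)[0, 1])
print(pud_counts, round(r, 3))
```

Counts of this kind are what get correlated against observed visitation (Pearson's r = 0.811 in the study) before travel-cost valuation is applied.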
Procedia PDF Downloads 1285845 The Type II Immune Response in Acute and Chronic Pancreatitis Mediated by STAT6 in Murine
Authors: Hager Elsheikh
Abstract:
Context: Pancreatitis is a condition characterized by inflammation in the pancreas, which can lead to serious complications if untreated. Both acute and chronic pancreatitis are associated with immune reactions and fibrosis, which further damage the pancreas. The type 2 immune response, primarily driven by alternative activated macrophages (AAMs), plays a significant role in the development of fibrosis. The IL-4/STAT6 pathway is a crucial signaling pathway for the activation of M2 macrophages. Pancreatic fibrosis is induced by dysregulated inflammatory responses and can result in the autodigestion and necrosis of pancreatic acinar cells. Research Aim: The aim of this study is to investigate the impact of STAT6, a crucial molecule in the IL-4/STAT6 pathway, on the severity and development of fibrosis during acute and chronic pancreatitis. The research also aims to understand the influence of the JAK/STAT6 signaling pathway on the balance between fibrosis and regeneration in the presence of different macrophage populations. Methodology: The research utilizes murine models of acute and chronic pancreatitis induced by cerulean injection. Animal models will be employed to study the effect of STAT6 knockout on disease severity and fibrosis. Isolation of acinar cells and cell culture techniques will be used to assess the impact of different macrophage populations on wound healing and regeneration. Various techniques such as PCR, histology, immunofluorescence, and transcriptomics will be employed to analyze the tissues and cells. Findings: The research aims to provide insights into the mechanisms underlying tissue fibrosis and wound healing during acute and chronic pancreatitis. By investigating the influence of the JAK/STAT6 signaling pathway and different macrophage populations, the study aims to understand their impact on tissue fibrosis, disease severity, and pancreatic regeneration. 
Theoretical Importance: This research contributes to our understanding of the role of specific signaling pathways, macrophage polarization, and the type 2 immune response in pancreatitis. It provides insights into the molecular mechanisms underlying tissue fibrosis and the potential for targeted therapies. Data Collection and Analysis Procedures: Data will be collected through the use of murine models, isolation and culture of acinar cells, and various experimental techniques such as PCR, histology, immunofluorescence, and transcriptomics. Data will be analyzed using appropriate statistical methods and techniques, and the findings will be interpreted in the context of the research objectives. Conclusion: By investigating the mechanisms of tissue fibrosis and wound healing during acute and chronic pancreatitis, this research aims to enhance our understanding of the disease progression and potential therapeutic targets. The findings have theoretical importance in expanding our knowledge of pancreatic fibrosis and the role of macrophage polarization in the context of the type 2 immune response.Keywords: immunity in chronic diseases, pancreatitis, macrophages, immune response
Procedia PDF Downloads 335844 A Risk Management Framework for Selling a Mega Power Plant Project in a New Market
Authors: Negar Ganjouhaghighi, Amirali Dolatshahi
Abstract:
The origin of most risks in a mega project usually lies in the phases before the contract is closed. From a practical point of view, using project risk management techniques to prepare a proposal is not, by itself, a complete solution for managing the risks of a contract. The objective of this paper is to cover all activities associated with risk management of a mega project's sales process, from entering a new market through award activities and the review of contract performance. In this study, risk management proceeds in six consecutive steps, divided into three distinct but interdependent phases upstream of the contract award: pre-tendering, tendering, and closing. In the first step, the risks of the new market are identified by preparing a standard market risk report. The next step is the bid/no-bid decision, made on the basis of the previously gathered data. During the following three steps, in the tendering phase, project risk management techniques are applied to determine how much contingency reserve must be added to, or removed from, the estimated cost in order to bring the residual risk to an acceptable level. Finally, the last step, which takes place in the closing phase, is an overview of the project risks and a final clarification of the residual risks. Sales experience from more than 20,000 MW of turn-key power plant projects, together with this framework, has been used to develop software that assists the sales team in managing project risk.Keywords: project marketing, risk management, tendering, project management, turn-key projects
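The contingency-reserve step in the tendering phase can be illustrated with a simple expected-monetary-value calculation; the risk register entries and the base cost below are hypothetical, not from the case projects:

```python
# Hypothetical risk register for a tendering phase: each entry is
# (risk name, probability of occurrence, cost impact if it occurs).
risks = [
    ("late customs clearance", 0.30, 2_000_000),
    ("currency fluctuation",   0.50, 1_500_000),
    ("site access delay",      0.10, 4_000_000),
]

def contingency_reserve(risk_register):
    """Expected-monetary-value contingency: sum of probability * impact."""
    return sum(p * impact for _, p, impact in risk_register)

base_cost = 120_000_000        # assumed estimated cost of the offer
reserve = contingency_reserve(risks)
print(reserve, base_cost + reserve)
```

In the framework above, a figure of this kind would then be adjusted step by step during tendering until the residual risk reaches an acceptable level.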
Procedia PDF Downloads 3275843 Knowledge Management in the Interactive Portal for Decision Makers on InKOM Example
Authors: K. Marciniak, M. Owoc
Abstract:
Managers, as decision-makers in different sectors, should be supported in an efficient and increasingly sophisticated way. A huge number of software tools have been developed for such users, ranging from simple registration of data from the business area (typical for the operational level of management) up to intelligent techniques that deliver knowledge for the tactical and strategic levels of management. Creating intelligent management dashboards that support different decisions is a big challenge for software developers. More advanced solutions even offer a selection of intelligent techniques useful to managers in a particular decision-making phase in order to deliver a valid knowledge base. Such a tool (called Intelligent Dashboard for SME Managers–InKOM) has been prepared in the Business Intelligence framework of Teta products. The aim of the paper is to present the solutions assumed for InKOM concerning the management of stored knowledge bases offered to business managers. The paper is organized as follows. After a short introduction on the research context of supporting managers via information systems, the InKOM platform is presented. In the crucial part of the paper, a process of knowledge transformation and validation is demonstrated. We focus on potential and actual ways of acquiring, storing, and validating knowledge bases. This allows us to formulate conclusions of interest from a knowledge engineering point of view.Keywords: business intelligence, decision support systems, knowledge management, knowledge transformation, knowledge validation, managerial systems
Procedia PDF Downloads 5105842 Performance Evaluation of Wideband Code Division Multiplication Network
Authors: Osama Abdallah Mohammed Enan, Amin Babiker A/Nabi Mustafa
Abstract:
The aim of this study is to evaluate and analyze different parameters of WCDMA (Wideband Code Division Multiple Access). The study also incorporates a brief yet thorough analysis of WCDMA's components as well as its internal architecture, and examines the different power controls: open-loop power control, closed-loop (inner-loop) power control, and outer-loop power control. Different handover techniques of WCDMA are also illustrated, including hard handover, inter-system handover, and soft and softer handover, and different duplexing techniques are described. The study further presents the parameters of WCDMA that lead the system towards QoS issues, which may help the operator in designing and developing an adequate network configuration. In addition, the study investigates various parameters, including Bit Energy per Noise Spectral Density (Eb/No), noise rise, and Bit Error Rate (BER). After simulating these parameters in the MATLAB environment, it was found that, for a given Eb/No value, the system capacity increases with the reuse factor. It was also observed that the noise rise decreases for lower data rates and lower interference levels. Finally, it was found that the BER is higher for some modulation techniques than for others; that is, the BER depends on the modulation scheme employed.Keywords: duplexing, handover, loop power control, WCDMA
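The capacity relationships discussed above can be illustrated with the standard textbook uplink load equations (a sketch under assumed parameter values, not the authors' MATLAB simulation): pole capacity falls as the bit rate rises, and the noise rise grows as the cell load approaches the pole capacity.

```python
import math

def uplink_capacity(chip_rate, bit_rate, ebno_db, activity=0.67, other_cell=0.55):
    """Textbook WCDMA uplink pole capacity (users):
    N = 1 + (W/R) / (Eb/No * v * (1 + i)), with processing gain W/R,
    voice activity factor v, and other-cell interference ratio i (assumed)."""
    pg = chip_rate / bit_rate                 # processing gain W/R
    ebno = 10 ** (ebno_db / 10.0)             # dB -> linear
    return 1 + pg / (ebno * activity * (1 + other_cell))

def noise_rise_db(load):
    """Noise rise grows without bound as cell load approaches 1."""
    return -10.0 * math.log10(1.0 - load)

W = 3.84e6                                    # WCDMA chip rate, 3.84 Mcps
for R in (12_200, 64_000, 144_000):           # voice and data bit rates
    print(R, round(uplink_capacity(W, R, ebno_db=5.0), 1))
print(round(noise_rise_db(0.5), 2), round(noise_rise_db(0.9), 2))
```

The 3 dB noise rise at 50% load and the steep rise near full load are the usual dimensioning rules of thumb that follow from these formulas.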
Procedia PDF Downloads 2135841 Effect of Weed Control and Different Plant Densities the Yield and Quality of Safflower (Carthamus tinctorius L.)
Authors: Hasan Dalgic, Fikret Akinerdem
Abstract:
This trial was conducted to determine the effect of different plant densities and weed control treatments on the yield and quality of winter-sown safflower (Carthamus tinctorius L.) in the trial fields of the Agricultural Faculty, Selcuk University; the active substance Trifluran was used as the herbicide. The field trial was carried out during the 2009-2010 vegetation period with three replications according to a 'Split Plots in Randomized Blocks' design. Weed control treatments were assigned to the main plots and row distances to the sub-plots. The trial treatments consisted of three weed control methods, namely herbicide application (Trifluran), hoeing, and an untreated control, combined with row distances of 15 cm and 30 cm. The results ranged between 59.0-76.73 cm in plant height, 40.00-47.07 cm in first branch height, 5.00-7.20 in number of branches per plant, 6.00-14.73 in number of heads per plant, 19.57-21.87 mm in head diameter, 2125.0-3968.3 kg ha-1 in seed yield, 27.10-28.08% in crude oil content, and 531.7-1070.3 kg ha-1 in crude oil yield. According to the results, the Remzibey safflower cultivar showed the highest seed yield at a 30 cm row distance with herbicide application, through the direct effects of plant height, first branch height, number of branches per plant, number of heads per plant, head diameter, crude oil content, and crude oil yield.Keywords: safflower, herbicide, row spacing, seed yield, oil ratio, oil yield
Procedia PDF Downloads 3315840 Evolving Convolutional Filter Using Genetic Algorithm for Image Classification
Authors: Rujia Chen, Ajit Narayanan
Abstract:
Convolutional neural networks (CNNs), as typically applied in deep learning, use layer-wise backpropagation (BP) to construct the filters and kernels used for feature extraction. Such filters are 2D or 3D groups of weights for constructing feature maps at subsequent layers of the CNN and are shared across the entire input. BP, as a gradient descent algorithm, has well-known problems of getting stuck at local optima. The use of genetic algorithms (GAs) for evolving weights between layers of standard artificial neural networks (ANNs) is a well-established area of neuroevolution; in particular, crossover techniques can help to overcome problems of local optima when optimizing weights. However, the application of GAs to evolving the weights of filters and kernels in CNNs is not yet an established area of neuroevolution. In this paper, a GA-based filter development algorithm is proposed. The results of the proof-of-concept experiments described in this paper show that the proposed GA can find filter weights through evolutionary techniques rather than BP learning. For some simple classification tasks, such as geometric shape recognition, the proposed algorithm achieves 100% accuracy. The results for MNIST classification, while not as good as those achievable through standard filter learning by BP, show that filter and kernel evolution warrants further investigation as a new subarea of neuroevolution for deep architectures.Keywords: neuroevolution, convolutional neural network, genetic algorithm, filters, kernels
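A proof-of-concept in the spirit described, though not the authors' algorithm, is to evolve a 3x3 filter with truncation selection, uniform crossover, and Gaussian mutation so that its response on a random image matches that of a known target filter (a Sobel-x kernel here, purely as an assumed target):

```python
import numpy as np

rng = np.random.default_rng(1)
target = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # assumed target
img = rng.random((16, 16))

def conv2d_valid(image, kernel):
    h, w = image.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(image[i:i + 3, j:j + 3] * kernel)
    return out

y_target = conv2d_valid(img, target)

def fitness(flat):
    """Negative MSE between the candidate filter's response and the target's."""
    return -np.mean((conv2d_valid(img, flat.reshape(3, 3)) - y_target) ** 2)

pop = rng.normal(0, 1, size=(30, 9))       # population of flattened 3x3 filters
history = []
for gen in range(40):
    scores = np.array([fitness(p) for p in pop])
    history.append(float(scores.max()))
    parents = pop[np.argsort(scores)[-10:]]            # truncation selection
    children = []
    while len(children) < 20:
        a, b = parents[rng.integers(10, size=2)]
        mask = rng.random(9) < 0.5                     # uniform crossover
        children.append(np.where(mask, a, b) + rng.normal(0, 0.05, 9))
    pop = np.vstack([parents, children])               # elitism: parents survive

best = pop[np.argmax([fitness(p) for p in pop])].reshape(3, 3)
print(round(history[0], 3), round(history[-1], 3))
```

Because the parents survive each generation, the best fitness never decreases; filter weights are thus found entirely by selection, crossover, and mutation rather than by gradient descent.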
Procedia PDF Downloads 1855839 Development of in vitro Fertilization and Emerging Legal Issues
Authors: Malik Imtiaz Ahmad
Abstract:
The development of In Vitro Fertilization (IVF) has revolutionized the field of reproductive medicine, offering hope to the many individuals and couples facing infertility. IVF, a process in which eggs are fertilized with sperm outside the body, has evolved over decades from an experimental procedure into mainstream medical practice. The study sought to understand the evolution of IVF from its early stages to its present status as a groundbreaking fertility treatment. It also aimed to analyze the legal complexities surrounding IVF, including issues such as embryo ownership, surrogacy agreements, and custody disputes. This research takes a multidisciplinary approach spanning the medical and legal fields. It explores the historical evolution of IVF, its techniques, and the modern legal challenges it has given rise to concerning family law, health law, and privacy policies. The research aims to provide insights into the intersection of medical technology and the law, offering valuable knowledge for policymakers, legal experts, and individuals involved in IVF. The study utilized various methods, including a thorough literature review, a historical analysis of IVF's evolution, an examination of legal cases, and a review of emerging regulations. These approaches aimed to provide a comprehensive understanding of IVF and its modern legal issues, facilitating a holistic exploration of the subject matter.Keywords: in vitro fertilization development, IVF techniques evolution, legal issues in IVF, IVF legal frameworks, ethical dilemmas in IVF
Procedia PDF Downloads 345838 Application of GIS Techniques for Analysing Urban Built-Up Growth of Class-I Indian Cities: A Case Study of Surat
Authors: Purba Biswas, Priyanka Dey
Abstract:
Worldwide, rapid urbanisation has accelerated city expansion in both developed and developing nations. This unprecedented urbanisation trend, driven by increasing population and economic growth, has created challenges for decision-makers in city planning and urban management. Metropolitan cities, class-I towns, and major urban centres undergo a continuous process of evolution due to the interaction between socio-cultural and economic attributes, and this constant evolution leads to urban expansion in all directions. Understanding the patterns and dynamics of urban built-up growth is crucial for policymakers, urban planners, and researchers, as it aids resource management, decision-making, and the development of sustainable strategies to address the complexities associated with rapid urbanisation. Identifying spatio-temporal patterns of urban growth has emerged as a crucial challenge in monitoring and assessing present and future trends in urban development, and analysing growth patterns and tracking changes in land use is an important aspect of urban studies. This study analyses spatio-temporal urban transformations and land-use and land-cover changes using remote sensing and GIS techniques. A built-up growth analysis has been carried out for the city of Surat as a case example, using the NDBI tool and the GIS models of the Built-up Urban Density Index and the Shannon Entropy Index to identify trends and the geographical direction of transformation from 2005 to 2020. Surat is one of the fastest-growing urban centres in both the state and the nation, ranking as the 4th fastest-growing city globally. This study analyses the dynamics of urban built-up area transformations both zone-wise and by geographical direction, calculating their trend, rate, and magnitude over the 15-year period.
This study also highlights the need for analysing and monitoring the urban growth pattern of class-I cities in India using spatio-temporal and quantitative techniques like GIS for improved urban management.Keywords: urban expansion, built-up, geographic information system, remote sensing, Shannon’s entropy
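The two indices named above can be sketched directly: NDBI flags built-up pixels from SWIR and NIR reflectance, and the relative Shannon entropy of zonal built-up areas measures how dispersed growth is (values near 1 indicate sprawl, near 0 compact growth). All reflectance values and zonal areas below are hypothetical, not Surat data:

```python
import numpy as np

def ndbi(swir, nir):
    """Normalized Difference Built-up Index: built-up surfaces reflect
    more in SWIR than in NIR, so positive NDBI suggests built-up pixels."""
    return (swir - nir) / (swir + nir + 1e-9)

def shannon_entropy(built_up):
    """Relative Shannon entropy of built-up shares across n zones:
    H = -sum(p_i * ln p_i) / ln(n)."""
    p = np.asarray(built_up, float)
    p = p / p.sum()
    nz = p[p > 0]
    return float(-np.sum(nz * np.log(nz)) / np.log(len(p)))

# Hypothetical SWIR/NIR reflectances for a 2x2 patch
swir = np.array([[0.35, 0.10], [0.40, 0.12]])
nir = np.array([[0.25, 0.30], [0.28, 0.35]])
built_mask = ndbi(swir, nir) > 0

# Hypothetical built-up area (sq km) in four directional zones, two epochs
zones_2005 = [12.0, 3.0, 1.0, 0.5]
zones_2020 = [20.0, 14.0, 11.0, 9.0]
print(built_mask.astype(int).tolist(),
      round(shannon_entropy(zones_2005), 3),
      round(shannon_entropy(zones_2020), 3))
```

A rise in entropy between the two epochs would indicate growth spreading more evenly across directions, the kind of trend the study extracts between 2005 and 2020.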
Procedia PDF Downloads 725837 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences
Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng
Abstract:
Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during interaction, a Kalman filter was used to retain a complete trajectory for each human object. Finally, motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved a classification accuracy of 80% for interaction events and 95% for non-interaction events. In summary, we have explored the idea of a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated into an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatching, fighting, etc.).Keywords: motion detection, motion tracking, trajectory analysis, video surveillance
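The occlusion-bridging role of the Kalman filter described above can be sketched with a constant-velocity tracker that skips the measurement update when the object is hidden, so the predicted state keeps the trajectory complete. The noise covariances and the simulated walking path are assumptions for illustration:

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],     # constant-velocity state transition
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0],      # only position (x, y) is measured
              [0, 1, 0, 0]], float)
Q = np.eye(4) * 0.01             # process noise (assumed)
R = np.eye(2) * 4.0              # measurement noise (assumed)

def kalman_track(measurements):
    """Track one object; a None measurement (occlusion) skips the update,
    so the prediction keeps the trajectory complete."""
    x = np.array([measurements[0][0], measurements[0][1], 0.0, 0.0])
    P = np.eye(4) * 100.0
    track = []
    for z in measurements:
        x, P = F @ x, F @ P @ F.T + Q                  # predict
        if z is not None:                              # update if visible
            z = np.asarray(z, float)
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (z - H @ x)
            P = (np.eye(4) - K @ H) @ P
        track.append(x[:2].copy())
    return np.array(track)

# Person walking right at ~2 px/frame, occluded at frames 5 and 6
meas = [(2.0 * t, 10.0) for t in range(10)]
meas[5] = meas[6] = None
traj = kalman_track(meas)
speed = np.linalg.norm(np.diff(traj, axis=0), axis=1)  # per-frame speed
print(np.round(traj[-1], 1), round(float(speed.mean()), 2))
```

Per-frame speeds derived from such gap-free trajectories are exactly the features the trajectory analysis step thresholds to separate interaction from non-interaction events.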
Procedia PDF Downloads 545