Search results for: reliable
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1901

731 A Framework on Data and Remote Sensing for Humanitarian Logistics

Authors: Vishnu Nagendra, Marten Van Der Veen, Stefania Giodini

Abstract:

Effective humanitarian logistics operations are a cornerstone of successful disaster relief operations. However, to be effective, they need to be demand-driven and supported by adequate data for prioritization. Without this data, operations are carried out in an ad hoc manner and eventually become chaotic. The current availability of geospatial data helps in creating models for predictive damage and vulnerability assessment, which can be of great advantage to logisticians in gaining an understanding of the nature and extent of the disaster damage. This translates into actionable information on the demand for relief goods, the state of the transport infrastructure and, subsequently, the priority areas for relief delivery. However, due to the unpredictable nature of disasters, the accuracy of the models needs improvement, which can be achieved using remote sensing data from UAVs (Unmanned Aerial Vehicles) or satellite imagery, which again come with certain limitations. This research addresses the need for a framework to combine data from different sources to support humanitarian logistics operations and prediction models. The focus is on developing a workflow to combine data from satellites and UAVs after a disaster strikes. A three-step approach is followed: first, the data requirements for logistics activities are made explicit by carrying out semi-structured interviews with field logistics workers. Second, the limitations of current data collection tools are analyzed to develop workaround solutions by following a systems design approach. Third, the data requirements and the developed workaround solutions are fitted together into a coherent workflow. The outcome of this research will provide a new method for logisticians to have immediate access to accurate and reliable data to support data-driven decision making.

Keywords: unmanned aerial vehicles, damage prediction models, remote sensing, data driven decision making

Procedia PDF Downloads 363
730 Assessing Two Protocols for Positive Reinforcement Training in Captive Olive Baboons (Papio anubis)

Authors: H. Cano, P. Ferrer, N. Garcia, M. Popovic, J. Zapata

Abstract:

Positive Reinforcement Training is a well-known methodology which has frequently been reported to be used with captive non-human primates. As a matter of fact, it is an invaluable tool for different purposes related to animal welfare, such as primate husbandry and environmental enrichment. It is also essential for performing some cognitive experiments. The main purpose of this pilot study was to establish an efficient protocol to train captive olive baboons (Papio anubis). This protocol seems to be vital in the context of a larger research program in which it will be necessary to train a complete population of around 40 baboons. Baboons were studied at the Veterinary Research Farm of the University of Murcia. Temporarily isolated animals were trained to perform three basic tasks. Firstly, they were required to take food prizes directly from the researchers' hands. Then a clicker sound, or bridge stimulus, was added each time the animal accessed the reinforcement. Finally, they were trained to touch a target, consisting of a whip with a red ball at its end, with their hands or their nose. When the subject completed this task correctly, it was also exposed to the bridge stimulus and awarded a food prize, such as a portion of banana, orange, apple, peach, or a raisin. Two protocols were tested during this experiment. In both of them, there were 6 series of 2-min training periods each day. However, in the first protocol, each series consisted of 3 trials, whereas in the second one, each series comprised 5 trials. A reliable performance was obtained with only 6 days of training in the case of the 5-trials protocol. However, with the 3-trials one, 26 days of training were needed. As a result, the 5-trials protocol seems to be more effective than the 3-trials one for teaching these three basic tasks to olive baboons. In consequence, it will be used to train the rest of the colony.

Keywords: captive primates, olive baboon, positive reinforcement training, Papio anubis, training

Procedia PDF Downloads 105
729 Molecularly Imprinted Nanoparticles (MIP NPs) as Non-Animal Antibodies Substitutes for Detection of Viruses

Authors: Alessandro Poma, Kal Karim, Sergey Piletsky, Giuseppe Battaglia

Abstract:

The recent increasing threat to public health of infectious influenza diseases has prompted interest in the detection of avian influenza virus (AIV) H5N1 in humans as well as animals. A variety of technologies for diagnosing AIV infection have been developed. However, various disadvantages (cost, lengthy analyses, and the need for high-containment facilities) make these methods less than ideal in their practical application. Molecularly Imprinted Polymeric Nanoparticles (MIP NPs) are suitable to overcome these limitations, owing to their high affinity, selectivity, scalability and cost-effectiveness, with the versatility of post-modification (fluorescent, magnetic, or optical labeling) opening the way to the potential introduction of improved diagnostic tests capable of providing rapid differential diagnosis. Here we present our first results in the production and testing of MIP NPs for the detection of AIV H5N1. Recent developments in the solid-phase synthesis of MIP NPs mean that, for the first time, a reliable supply of ‘soluble’ synthetic antibodies can be made available for testing as potential biological or diagnostic active molecules. The MIP NPs have the potential to detect viruses that are widely circulating in farm animals and indeed humans. Early and accurate identification of the infectious agent will expedite appropriate control measures. Thus, diagnosis at an early stage of infection of a herd, flock or individual maximizes the efficiency with which containment, prevention and possibly treatment strategies can be implemented. More importantly, substantiating the practicability of these novel reagents should lead to an initial reduction and eventually to a potential total replacement of animals, both large and small, used to raise such specific serological materials.

Keywords: influenza virus, molecular imprinting, nanoparticles, polymers

Procedia PDF Downloads 334
728 Formation of the Investment Portfolio of Intangible Assets with a Wide Pairwise Comparison Matrix Application

Authors: Gulnara Galeeva

Abstract:

The Analytic Hierarchy Process is widely used in economic and financial studies, including the formation of investment portfolios. In this study, a generalized method of obtaining a priority vector is examined for the case where separate pairwise comparisons of expert opinion are presented as a set of several equal evaluations on a ratio scale. The author claims that this method allows solving an important and topical problem in decision-making theory: excluding the vagueness and ambiguity of expert opinion. The study describes the authentic wide pairwise comparison matrix. Its application in the formation of an efficient investment portfolio of intangible assets for a small business enterprise with limited funding is considered. The proposed method has been successfully tested on the practical example of a functioning dental clinic. The result of the study confirms that the wide pairwise comparison matrix can be used as a simple and reliable method for forming enterprise investment policy. Moreover, a comparison between the method based on the wide pairwise comparison matrix and the classical analytic hierarchy process was conducted, and the results of the comparative analysis confirm the correctness of the method based on the wide matrix. The application of a wide pairwise comparison matrix also allows wide use of statistical methods of experimental data processing for obtaining the priority vector. The new method is accessible to non-specialist users, and its application gives approximately the same accuracy as the classical analytic hierarchy process. Financial directors of small and medium business enterprises get an opportunity to solve the problem of company investments without resorting to the services of analytical agencies specializing in such studies.
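
For readers unfamiliar with the AHP step that the wide matrix generalizes, the sketch below shows the classical computation of a priority vector from a single pairwise comparison matrix, using the row geometric mean estimator (the matrix values and the estimator choice are illustrative assumptions, not the author's wide-matrix method):

    import numpy as np

    def ahp_priorities(A):
        # Priority vector from a pairwise comparison matrix A via the
        # row geometric mean method, a common AHP estimator.
        g = np.prod(A, axis=1) ** (1.0 / A.shape[0])   # row geometric means
        return g / g.sum()                             # normalize to sum to 1

    # Example: three intangible assets compared on a ratio scale
    A = np.array([[1.0, 3.0, 5.0],
                  [1 / 3.0, 1.0, 2.0],
                  [1 / 5.0, 1 / 2.0, 1.0]])
    print(ahp_priorities(A))  # -> approximately [0.648, 0.230, 0.122]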

Keywords: analytic hierarchy process, decision processes, investment portfolio, intangible assets

Procedia PDF Downloads 246
727 Detecting Critical Thinking Skills in Written Text Analysis: The Use of Artificial Intelligence in Text Analysis vs. ChatGPT

Authors: Lucilla Crosta, Anthony Edwards

Abstract:

Companies and the marketplace nowadays struggle to find employees with adequate skills in relation to the anticipated growth of their businesses. At least half of workers will need to undertake some form of up-skilling in the next five years in order to remain aligned with the demands of the market. In order to meet these challenges, there is a clear need to explore the potential uses of AI (Artificial Intelligence)-based tools in assessing the transversal skills (critical thinking, communication and, in general, soft skills of different types) of workers and adult students, while empowering them to develop those same skills in a reliable, trustworthy way. Companies seek workers with key transversal skills that can make a difference between workers now and in the future. However, critical thinking seems to be one of the most important skills, bringing unexplored ideas and company growth in business contexts. What employers have been reporting for years now is that this skill is lacking in the majority of workers and adult students, and this is particularly visible through their writing. This paper investigates how critical thinking and communication skills are currently developed in Higher Education environments through the use of AI tools at postgraduate levels. It analyses the use of a branch of AI, namely Machine Learning, together with Big Data and Neural Network Analysis. It also examines the acquisition of these skills through AI tools and the effects this has on employability. This paper draws information from researchers and studies at both national (Italy & UK) and international levels in Higher Education. The issues associated with the development and use of one specific AI tool, Edulai, are examined in detail. Finally, comparisons are also made between these tools and the more recent phenomenon of ChatGPT, and their benefits and drawbacks are analysed.

Keywords: critical thinking, artificial intelligence, higher education, soft skills, chat GPT

Procedia PDF Downloads 86
726 Heliport Remote Safeguard System Based on Real-Time Stereovision 3D Reconstruction Algorithm

Authors: Ł. Morawiński, C. Jasiński, M. Jurkiewicz, S. Bou Habib, M. Bondyra

Abstract:

With the development of optics, electronics, and computers, vision systems are increasingly used in various areas of life, science, and industry. Vision systems have a huge number of applications. They can be used in quality control, object detection, data reading (e.g., QR codes), etc. A large part of them is used for measurement purposes. Some of them make it possible to obtain a 3D reconstruction of the tested objects or measurement areas. 3D reconstruction algorithms are mostly based on creating depth maps from data that can be acquired by active or passive methods. Due to the specific application in airfield technology, only passive methods are applicable, because other existing systems working on the site could be blinded on most spectral levels. Furthermore, the reconstruction is required to work at long distances, ranging from hundreds of meters to tens of kilometers, with low loss of accuracy even in harsh conditions such as fog, rain, or snow. In response to those requirements, HRESS (Heliport REmote Safeguard System) was developed, whose main part is a rotational head with a two-camera stereovision rig gathering images around the head in 360 degrees, along with stereovision 3D reconstruction and point cloud combination. The sub-pixel analysis introduced in the HRESS system makes it possible to obtain increased distance measurement resolution and an accuracy of about 3% for distances over one kilometer. Ultimately, this leads to more accurate and reliable measurement data in the form of a point cloud. Moreover, the program algorithm introduces operations enabling the filtering of erroneously collected data in the point cloud. All activities on the programming, mechanical, and optical sides are aimed at obtaining the most accurate 3D reconstruction of the environment in the measurement area.
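
The core of any stereovision depth pipeline of this kind is a disparity map converted to depth via Z = f·B/d. The sketch below uses OpenCV's semi-global matcher, whose fixed-point output also carries 1/16-pixel sub-pixel precision; the focal length, baseline, and file names are illustrative assumptions, not HRESS parameters:

    import cv2
    import numpy as np

    left = cv2.imread('left.png', cv2.IMREAD_GRAYSCALE)
    right = cv2.imread('right.png', cv2.IMREAD_GRAYSCALE)

    # Semi-global block matching; compute() returns disparity in 1/16-pixel units
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    disp = matcher.compute(left, right).astype(np.float32) / 16.0

    f_px = 2800.0   # focal length in pixels (assumed)
    B_m = 0.50      # stereo baseline in meters (assumed)
    depth_m = np.where(disp > 0, f_px * B_m / disp, 0.0)  # 0 where no valid match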

Keywords: airfield monitoring, artificial intelligence, stereovision, 3D reconstruction

Procedia PDF Downloads 104
725 Design of a Surveillance Drone with Computer Aided Durability

Authors: Maram Shahad Dana Anfal

Abstract:

This research paper presents the design of a surveillance drone with computer-aided durability and model analyses, providing a cost-effective and efficient solution for various applications. The quadcopter's design is based on a lightweight and strong structure made of materials such as aluminum and titanium, which provide a durable frame for the quadcopter. The structure of this product and the computer-aided durability system are both designed to minimize the need for frequent repairs or replacements, which will save time and money in the long run. Moreover, the study discusses the drone's ability to track, investigate, and deliver objects more quickly than traditional methods, which makes it a highly efficient and cost-effective technology. In this paper, a comprehensive analysis of the quadcopter's operation dynamics and limitations is presented. In both simulation and experimental data, the computer-aided durability system and the drone's design demonstrate their effectiveness, highlighting the potential for a variety of applications, such as search and rescue missions, infrastructure monitoring, and agricultural operations. The findings also provide insights into possible areas for improvement in the design and operation of the drone. Ultimately, this paper presents a reliable and cost-effective solution for surveillance applications by designing a drone with computer-aided durability and modeling. With its potential to save time and money, increase reliability, and enhance safety, it is a promising technology for the future of surveillance drones. Operation dynamic equations have been evaluated successfully for different flight conditions of the quadcopter. CAE modeling techniques have also been applied for modal risk assessment at operating conditions, and stress analysis has been performed under the loadings of the worst-case combined-motion flight conditions.

Keywords: drone, material, solidwork, hypermesh

Procedia PDF Downloads 119
724 Investigating a Deterrence Function for Work Trips for Perth Metropolitan Area

Authors: Ali Raouli, Amin Chegenizadeh, Hamid Nikraz

Abstract:

The Perth metropolitan area and its surrounding regions have been expanding rapidly in recent decades, and it is expected that this growth will continue in the years to come. With this rapid growth and the resulting increase in population, consideration should be given to strategic planning and modelling for the future expansion of Perth. The accurate estimation of projected traffic volumes has always been a major concern for transport modellers and planners. The development of a reliable strategic transport model depends significantly on the input data and on the calibrated parameters of the model that reflect the existing situation. Trip distribution is the second step in four-step modelling (FSM) and is complex due to its behavioural nature. The gravity model is the most common method for trip distribution. The spatial separation between the origin and destination (OD) zones is reflected in the gravity model by applying deterrence functions, which provide an opportunity to include people's behaviour in choosing their destinations based on the distance, time and cost of their journeys. Deterrence functions play an important role in the distribution of trips within a study area and simulate the trip distances; they should therefore be calibrated for any particular strategic transport model to correctly reflect trip behaviour within the modelling area. This paper aims to review the most common deterrence functions and propose a calibrated deterrence function for work trips within the Perth metropolitan area, based on information obtained from the latest available household data and Perth and Region Travel Survey (PARTS) data. As part of this study, a four-step transport model has been developed for the Perth metropolitan area using EMME software to assist with the analysis and findings.
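
For reference, the doubly constrained gravity model and the deterrence function forms most commonly reviewed in this context can be written as follows (standard textbook forms, not the paper's calibrated Perth function):

    \[
    T_{ij} = a_i \, b_j \, O_i \, D_j \, f(c_{ij}),
    \]
    \[
    f(c_{ij}) = c_{ij}^{-\alpha} \ \text{(power)}, \qquad
    f(c_{ij}) = e^{-\beta c_{ij}} \ \text{(exponential)}, \qquad
    f(c_{ij}) = c_{ij}^{-\alpha} \, e^{-\beta c_{ij}} \ \text{(combined / Tanner)},
    \]

where T_{ij} are the trips from zone i to zone j, O_i and D_j are the trip productions and attractions, c_{ij} is the generalized travel cost, and a_i, b_j are the balancing factors.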

Keywords: deterrence function, four-step modelling, origin destination, transport model

Procedia PDF Downloads 153
723 Simplified INS/GPS Integration Algorithm in Land Vehicle Navigation

Authors: Othman Maklouf, Abdunnaser Tresh

Abstract:

Land vehicle navigation is a subject of great interest today. The Global Positioning System (GPS) is the main navigation system for positioning in such applications. GPS alone is incapable of providing continuous and reliable positioning because of its inherent dependency on external electromagnetic signals. An Inertial Navigation System (INS) uses inertial sensors to determine the position and orientation of a vehicle. The availability of low-cost Micro-Electro-Mechanical-System (MEMS) inertial sensors is now making it feasible to develop an INS using an inertial measurement unit (IMU). An INS has unbounded error growth, since the error accumulates at each step. Usually, GPS and INS are integrated in a loosely coupled scheme. With the development of low-cost MEMS inertial sensors and GPS technology, integrated INS/GPS systems are beginning to meet the growing demands for lower cost, smaller size, and seamless navigation solutions for land vehicles. Although MEMS inertial sensors are very inexpensive compared to conventional sensors, their cost (especially that of MEMS gyros) is still not acceptable for many low-end civilian applications (for example, commercial car navigation or personal location systems). An efficient way to reduce the expense of these systems is to reduce the number of gyros and accelerometers, and therefore to use a partial IMU (ParIMU) configuration. For land vehicle use, the most important sensors are the vertical gyro, which senses the heading of the vehicle, and two horizontal accelerometers, which determine the velocity of the vehicle. This paper presents a field experiment with a low-cost strapdown ParIMU/GPS combination, with post-processing of data for the determination of the 2-D components of position (trajectory), velocity and heading. In the present approach, we have neglected earth rotation and gravity variations because of the poor gyroscope sensitivities of our low-cost IMU and because of the relatively small area of the trajectory.
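
In a loosely coupled scheme of this kind, the INS dead-reckons between fixes and a Kalman filter corrects the prediction whenever a GPS position arrives. The sketch below is a deliberately minimal one-axis constant-velocity filter (the noise values and time step are assumptions for illustration, not the paper's tuned filter):

    import numpy as np

    dt = 1.0
    F = np.array([[1, dt], [0, 1]])   # state transition: [position, velocity]
    H = np.array([[1, 0]])            # GPS measures position only
    Q = np.diag([0.01, 0.1])          # process noise (IMU drift, assumed)
    R = np.array([[25.0]])            # GPS noise, ~5 m std dev (assumed)

    x = np.zeros(2)                   # initial state
    P = np.eye(2) * 100.0             # initial uncertainty

    def kf_step(x, P, z_gps):
        x = F @ x                     # predict (dead reckoning)
        P = F @ P @ F.T + Q
        y = z_gps - H @ x             # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        return x, P

    # Example update with a GPS fix at 12.3 m
    x, P = kf_step(x, P, np.array([12.3]))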

Keywords: GPS, IMU, Kalman filter, materials engineering

Procedia PDF Downloads 403
722 Investigation of Different Machine Learning Algorithms in Large-Scale Land Cover Mapping within the Google Earth Engine

Authors: Amin Naboureh, Ainong Li, Jinhu Bian, Guangbin Lei, Hamid Ebrahimy

Abstract:

Large-scale land cover mapping has become a new challenge in the land change and remote sensing fields because it involves a large volume of data. Moreover, selecting the right classification method, especially when there are different types of landscapes in the study area, is quite difficult. This paper is an attempt to compare the performance of different machine learning (ML) algorithms for generating a land cover map of the China–Central Asia–West Asia Corridor, which is considered one of the main parts of the Belt and Road Initiative (BRI) project. The cloud-based Google Earth Engine (GEE) platform was used to generate a land cover map of the study area from Landsat-8 images (2017) by applying three frequently used ML algorithms: random forest (RF), support vector machine (SVM), and artificial neural network (ANN). The selected ML algorithms were trained and tested using reference data obtained from the MODIS yearly land cover product and very high-resolution satellite images. The findings of the study illustrate that, among the three algorithms, RF, with 91% overall accuracy, had the best result in producing a land cover map of the China–Central Asia–West Asia Corridor, whereas ANN showed the worst result, with 85% overall accuracy. The strong performance of GEE in applying different ML algorithms and handling the huge volume of remotely sensed data in the present study shows that it could also help researchers generate reliable long-term land cover change maps. The findings of this research are of great importance for decision-makers and the BRI authorities in strategic land use planning.
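
The GEE workflow described here compresses to a few API calls. The sketch below follows that pattern for the RF case (the point-collection asset ID and the 'landcover' label property are placeholders, not the study's actual inputs):

    import ee
    ee.Initialize()

    # Median Landsat-8 surface reflectance composite for 2017
    image = ee.ImageCollection('LANDSAT/LC08/C02/T1_SR') \
        .filterDate('2017-01-01', '2017-12-31').median()
    bands = ['SR_B2', 'SR_B3', 'SR_B4', 'SR_B5', 'SR_B6', 'SR_B7']
    points = ee.FeatureCollection('users/example/reference_points')  # placeholder

    # Sample the composite at labeled reference points
    training = image.select(bands).sampleRegions(
        collection=points, properties=['landcover'], scale=30)

    rf = ee.Classifier.smileRandomForest(numberOfTrees=100) \
        .train(features=training, classProperty='landcover', inputProperties=bands)
    classified = image.select(bands).classify(rf)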

Keywords: land cover, google earth engine, machine learning, remote sensing

Procedia PDF Downloads 104
721 Ensuring Quality in DevOps Culture

Authors: Sagar Jitendra Mahendrakar

Abstract:

Integrating quality assurance (QA) practices into DevOps culture has become increasingly important in modern software development environments. Collaboration, automation and continuous feedback characterize the seamless integration of DevOps development and operations teams to achieve rapid and reliable software delivery. In this context, quality assurance plays a key role in ensuring that software products meet the highest quality, performance and reliability standards throughout the development life cycle. This abstract explores key principles, challenges, and best practices related to quality assurance in a DevOps culture. It emphasizes the importance of carrying quality through the development process, with quality control integrated into every step of the DevOps pipeline. Automation is the cornerstone of DevOps quality assurance, enabling continuous testing, integration and deployment and providing rapid feedback for early problem identification and resolution. It also addresses the cultural and organizational challenges of implementing QA within DevOps, emphasizing the need to foster collaboration, break down silos, and nurture a culture of continuous improvement, as well as the importance of toolchain integration and skills development to support effective QA practices in DevOps environments. Overall, this work sits at the intersection of QA and DevOps culture, providing insights into how organizations can use DevOps principles to improve software quality, accelerate delivery, and meet the changing demands of today's dynamic software landscape.

Keywords: quality engineer, devops, automation, tool

Procedia PDF Downloads 34
720 Comparison between High Resolution Ultrasonography and Magnetic Resonance Imaging in Assessment of Musculoskeletal Disorders Causing Ankle Pain

Authors: Engy S. El-Kayal, Mohamed M. S. Arafa

Abstract:

There are various causes of ankle pain (AP), including traumatic and non-traumatic causes. Various imaging techniques are available for the assessment of AP. MRI is considered the imaging modality of choice for ankle joint evaluation, with the advantages of high spatial resolution and multiplanar capability, and hence the ability to visualize the small, complex anatomical structures around the ankle. However, the high cost and relatively limited availability of MRI systems, as well as the relatively long duration of the examination, are all considered disadvantages of MRI. Therefore, there is a need for a more rapid and less expensive examination modality with good diagnostic accuracy to fill this gap. High-resolution ultrasonography (HRU) has become increasingly important in the assessment of ankle disorders, with the advantages of being fast, reliable, low-cost and readily available. Ultrasound can visualize detailed anatomical structures and assess tendinous and ligamentous integrity. The aim of this study was to compare the diagnostic accuracy of HRU with MRI in the assessment of patients with AP. We included forty patients complaining of AP. All patients underwent real-time HRU and MRI of the affected ankle, and the results of both techniques were compared to surgical and arthroscopic findings. All patients were examined according to a defined protocol that included imaging of tendon tears or tendinitis, muscle tears, masses or fluid collections, ligament sprains or tears, inflammation or fluid effusion within the joint or bursa, bone and cartilage lesions, erosions and osteophytes. Analysis of the results showed that the mean age of the patients was 38 years. The study comprised 24 women (60%) and 16 men (40%). The accuracy of HRU in detecting the causes of AP was 85%, while the accuracy of MRI was 87.5%. In conclusion, HRU and MRI are two complementary investigation tools, with the former used as a primary tool and the latter used to confirm the diagnosis and the extent of the lesion, especially when surgical interference is planned.

Keywords: ankle pain (AP), high-resolution ultrasound (HRU), magnetic resonance imaging (MRI), ultrasonography (US)

Procedia PDF Downloads 178
719 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data

Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar

Abstract:

It is difficult to make predictions, especially about the future, and making accurate predictions is not always easy. However, better predictions remain the foundation of all science; therefore, the development of accurate, robust and reliable forecasting methods is very important. Numerous forecasting methods have been proposed and studied in the literature. There are still two dominant major forecasting methods, Box-Jenkins ARIMA and Exponential Smoothing (ES), and new methods are still being derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing is still one of the most practically relevant forecasting methods available, due to its simplicity, robustness and accuracy as an automatic forecasting procedure, especially in the famous M-Competitions. Despite its success and widespread use in many areas, ES models have some shortcomings that negatively affect the accuracy of forecasts. Therefore, a new forecasting method, called the ATA method, is proposed in this study to cope with these shortcomings. This new method is obtained from traditional ES models by modifying the smoothing parameters; therefore, both methods have similar structural forms, and ATA can easily be adapted to all of the individual ES models. However, ATA has many advantages due to its innovative new weighting scheme. In this paper, the focus is on modelling the trend component and handling seasonality patterns by utilizing classical decomposition. Therefore, the ATA method is expanded to higher-order ES methods for additive, multiplicative, additive damped and multiplicative damped trend components. The proposed models are called ATA trended models, and their predictive performances are compared to their counterpart ES models on the M3-Competition data set, since it is still the most recent and comprehensive time-series data collection available. It is shown that the models outperform their counterparts in almost all settings, and when a model selection is carried out amongst these trended models, ATA outperforms all of the competitors in the M3-Competition for both short-term and long-term forecasting horizons when the models' forecasting accuracies are compared based on popular error metrics.
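
The key structural idea, replacing the fixed ES smoothing constants with time-varying weights p/t and q/t, can be sketched in a few lines. The recursion below is a simplified reading of the additive-trend case (initialization details and parameter optimization are omitted, so treat it as an illustration rather than the authors' exact formulation):

    def ata_additive(x, p, q):
        # Simplified ATA(p, q) with additive trend, p >= q >= 0.
        # t <= p: level follows the data; t <= q: trend follows first differences.
        s, b = x[0], 0.0
        for t in range(2, len(x) + 1):
            s_prev = s
            if t <= p:
                s = x[t - 1]
            else:
                s = (p / t) * x[t - 1] + ((t - p) / t) * (s_prev + b)
            if t <= q:
                b = x[t - 1] - x[t - 2]
            else:
                b = (q / t) * (s - s_prev) + ((t - q) / t) * b
        return s, b   # h-step-ahead forecast: s + h * b

    level, trend = ata_additive([10.0, 11.2, 12.1, 13.4, 14.0], p=3, q=1)
    print(level + 1 * trend)   # one-step-ahead forecast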

Keywords: accuracy, exponential smoothing, forecasting, initial value

Procedia PDF Downloads 164
718 A Review of Emerging Technologies in Antennas and Phased Arrays for Avionics Systems

Authors: Muhammad Safi, Abdul Manan

Abstract:

In recent years, research in aircraft avionics systems (i.e., radars and antennas) has grown at a revolutionary pace. Aircraft technology is experiencing an increasing shift from all-mechanical to all-electric aircraft, with the introduction of uninhabited air vehicles and drone taxis over the last few years. This creates an overriding need to summarize the history, latest trends, and future developments in aircraft avionics research for a better understanding and development of new technologies in the domain of avionics systems. This paper focuses on future trends in antennas and phased arrays for avionics systems. Along with a general overview of future avionics trends, this work reviews around 50 high-quality research papers on aircraft communication systems. Electric-powered aircraft have been a hot topic in the modern aircraft world, and electric aircraft offer advantages over their conventional counterparts. Due to the growth of drone taxis and urban air mobility, fast and reliable communication is very important, so concepts such as Broadband Integrated Digital Avionics Information Exchange Networks (B-IDAIENs) and modular avionics are being researched for better communication in future aircraft. A Ku-band phased array antenna based on a modular design can be used in a modular avionics system. Furthermore, integrated avionics is an emerging area of future avionics research. The main focus of work in future avionics will be on integrated modular avionics and infra-red phased array antennas, which are discussed in detail in this paper. Other work, such as reconfigurable antennas and optical communication, is also discussed. The future of modern aircraft avionics will be based on integrated modular avionics and small artificial intelligence-based antennas, and optical and infrared communication will also replace microwave frequencies.

Keywords: AI, avionics systems, communication, electric aircrafts, infra-red, integrated avionics, modular avionics, phased array, reconfigurable antenna, UAVs

Procedia PDF Downloads 54
717 Grey Relational Analysis Coupled with Taguchi Method for Process Parameter Optimization of Friction Stir Welding on 6061 AA

Authors: Eyob Messele Sefene, Atinkut Atinafu Yilma

Abstract:

The highest strength-to-weight ratio criterion has attracted increasing interest in virtually all areas where weight reduction is indispensable. One of the recent advances in manufacturing toward this goal is friction stir welding (FSW). The process is widely used for joining similar and dissimilar non-ferrous materials. In FSW, the mechanical properties of the weld joints are governed by properly selected process parameters. This paper presents findings on optimum process parameters aimed at attaining enhanced mechanical properties of the weld joint. The experiment was conducted on a 5 mm 6061 aluminum alloy sheet. A butt joint configuration was employed. The process parameters utilized were rotational speed, traverse speed (feed rate), axial force, dwell time, tool material, and tool profile. The process parameters were optimized making use of a mixed L18 orthogonal array and the grey relational analysis method with larger-is-better quality characteristics. The mechanical properties of the weld joint were examined through the tensile test, hardness test and liquid penetrant test at ambient temperature. ANOVA was conducted in order to investigate the significant process parameters. This research shows that dwell time, rotational speed, tool shape, and traverse speed are significant, with a joint efficiency of about 82.58%. Nine confirmatory tests were conducted, and the results indicate that the average values of the grey relational grade fall within the 99% confidence interval. Hence, the experiment is proven reliable.
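
The grey relational step itself is compact: normalize each response with the larger-is-better scheme, compute deviations from the ideal, convert them to grey relational coefficients, and average per run. The sketch below illustrates this with an invented two-response matrix (the response values and the zeta = 0.5 distinguishing coefficient are assumptions, not the paper's data):

    import numpy as np

    def grey_relational_grade(Y, zeta=0.5):
        # Y: runs x responses; larger-is-better normalization to [0, 1]
        norm = (Y - Y.min(axis=0)) / (Y.max(axis=0) - Y.min(axis=0))
        delta = 1.0 - norm                               # deviation from the ideal
        grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
        return grc.mean(axis=1)                          # grade per experimental run

    Y = np.array([[215.0, 62.0],    # e.g. tensile strength, hardness per run
                  [231.0, 66.0],
                  [204.0, 60.0]])
    print(grey_relational_grade(Y))  # highest grade = best parameter setting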

Keywords: friction stir welding, optimization, 6061 AA, Taguchi

Procedia PDF Downloads 75
716 Cognitive Behaviour Drama: A Research-Based Intervention Model to Improve Social Thinking in High-Functioning Children with Autism

Authors: Haris Karnezi, Kevin Tierney

Abstract:

Cognitive Behaviour Drama is a research-based intervention model that brings together the science of psychology and the art form of drama to create an unobtrusive and exciting approach that provides children on the higher end of the autism spectrum with the motivation to explore the rules of social interaction and develop competencies associated with communicative success. The method involves engaging the participants in exciting fictional scenarios and encouraging them to seek various solutions to a number of problems, leading them to an understanding of causal relationships and of how a different course of action may lead to a different outcome. The sessions are structured to offer the participants opportunities to practice target behaviours and understand the functions they serve. The study involved six separate interventions and employed both single-case and group designs. Overall, 8 children aged between 6 and 13 years and diagnosed with ASD participated in the study. Outcomes were measured using theory of mind tests, executive functioning tests, behavioural observations, and pre- and post-intervention standardised social competence questionnaires for parents and teachers. Collectively, the results indicated positive changes in the self-esteem and behaviour of all eight participants. In particular, improvements in the ability to solve theory of mind tasks were noted in the younger group, and qualitative improvements in social communication, in terms of verbal (content) and non-verbal expression (body posture, vocal expression, fluency, eye contact, reduction of ritualistic mannerisms), were noted in the older group. The need for reliable impact measures to assess the effectiveness of the model in generating global changes in the participants' behaviour outside the therapeutic context was identified.

Keywords: autism, drama, intervention, social skills

Procedia PDF Downloads 139
715 Nanomaterial Based Electrochemical Sensors for Endocrine Disrupting Compounds

Authors: Gaurav Bhanjana, Ganga Ram Chaudhary, Sandeep Kumar, Neeraj Dilbaghi

Abstract:

The main sources of endocrine disrupting compounds in the ecosystem are hormones, pesticides, phthalates, flame retardants, dioxins, personal-care products, coplanar polychlorinated biphenyls (PCBs), bisphenol A, and parabens. These endocrine disrupting compounds are responsible for learning disabilities, brain development problems, deformations of the body, cancer, reproductive abnormalities in females, and decreased sperm count in human males. Although the discharge of these chemical compounds into the environment cannot be stopped, their amount can be reduced through proper evaluation and detection techniques. The available techniques for the determination of these endocrine disrupting compounds mainly include high performance liquid chromatography (HPLC), mass spectrometry (MS) and gas chromatography-mass spectrometry (GC-MS). These techniques are accurate and reliable but have certain limitations, such as the need for skilled personnel, long analysis times, interference, and the requirement of pretreatment steps. Moreover, these techniques are laboratory-bound, and samples are required in large amounts for analysis. In view of these facts, new methods for the detection of endocrine disrupting compounds should be devised that promise high specificity, ultra-sensitivity, cost-effectiveness, efficiency and ease of operation. Nowadays, electrochemical sensors and biosensors modified with nanomaterials are gaining great attention among researchers. The biorecognition element present in such a system makes the developed sensor selective towards the analyte of interest. Nanomaterials provide a large surface area, efficient electron communication, enhanced catalytic activity and possibilities for chemical modification. In most cases, nanomaterials also serve as an electron mediator or electrocatalyst for some analytes.

Keywords: electrochemical, endocrine disruptors, microscopy, nanoparticles, sensors

Procedia PDF Downloads 260
714 Comparison of Bactec plus Blood Culture Media to BacT/Alert FAN plus Blood Culture Media for Identification of Bacterial Pathogens in Clinical Samples Containing Antibiotics

Authors: Recep Kesli, Huseyin Bilgin, Ela Tasdogan, Ercan Kurtipek

Abstract:

Aim: The aim of this study was to compare resin-based Bactec Plus aerobic/anaerobic blood culture bottles (Becton Dickinson, MD, USA) and polymeric-bead-based BacT/Alert FA/FN Plus blood culture bottles (bioMerieux, NC, USA) in terms of microorganism recovery rates and time to detection (TTD) in patients receiving antibiotic treatment. Method: Blood culture samples were taken from patients who were admitted to the intensive care unit and received antibiotic treatment. Forty milliliters of blood from each patient were equally distributed into four types of bottles: Bactec Plus aerobic, Bactec Plus anaerobic, BacT/Alert FA Plus, and BacT/Alert FN Plus. The Bactec Plus and BacT/Alert Plus media were compared with respect to culture recovery rates and TTD. Results: Blood culture samples were collected from 382 patients hospitalized in the intensive care unit, and 245 patients who were diagnosed as having bloodstream infections were included in the study. A total of 1528 Bactec Plus aerobic, Bactec Plus anaerobic, BacT/Alert FA Plus and BacT/Alert FN Plus blood culture bottles were analyzed, and 176, 144, 154 and 126 bacteria or fungi were isolated, respectively. Gram-negative and gram-positive bacteria were isolated significantly more frequently in the resin-based Bactec Plus bottles than in the polymeric-bead-based BacT/Alert Plus bottles. The Bactec Plus and BacT/Alert Plus media recovery rates were similar for fungi and anaerobic bacteria. The mean TTDs in the Bactec Plus bottles were shorter than those in the BacT/Alert Plus bottles regardless of the microorganism. Conclusion: The results of this study showed that resin-containing media are a reliable and time-saving tool for patients receiving antibiotic treatment due to sepsis in the intensive care unit.

Keywords: Bactec Plus, BacT/Alert Plus, blood culture, antibiotic

Procedia PDF Downloads 132
713 Numerical Investigation of Multiphase Flow in Pipelines

Authors: Gozel Judakova, Markus Bause

Abstract:

We present and analyze reliable numerical techniques for simulating complex flow and transport phenomena related to natural gas transportation in pipelines. Such problems are of high interest in the fields of petroleum and environmental engineering. Modeling and understanding natural gas flow and transformation processes during transportation is important for the sake of physical realism and for the design and operation of pipeline systems. In our approach, a two-fluid flow model based on a system of coupled hyperbolic conservation laws is considered for describing natural gas flow undergoing hydratization. The accurate numerical approximation of two-phase gas flow remains a subject of strong interest in the scientific community. Such hyperbolic problems are characterized by solutions with steep gradients or discontinuities, and their approximation by standard finite element techniques typically gives rise to spurious oscillations and numerical artefacts. Recently, stabilized and discontinuous Galerkin finite element techniques have attracted researchers' interest; they are highly adapted to the hyperbolic nature of our two-phase flow model. In the presentation, a streamline upwind Petrov-Galerkin approach and a discontinuous Galerkin finite element method are presented for the numerical approximation of our flow model of two coupled systems of Euler equations. Then, the efficiency and reliability of stabilized continuous and discontinuous finite element methods for the approximation are carefully analyzed, and the potential of either class of numerical schemes is investigated. In particular, standard benchmark problems of two-phase flow, such as the shock tube problem, are used for the comparative numerical study.
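
For orientation, each of the two coupled subsystems mentioned above has the structure of the compressible Euler equations in conservation form; in one space dimension the single-phase prototype reads (the actual two-fluid model additionally carries phase fractions and interphase exchange source terms, so this is a sketch, not the authors' exact system):

    \[
    \partial_t U + \partial_x F(U) = S, \qquad
    U = \begin{pmatrix} \rho \\ \rho u \\ E \end{pmatrix}, \qquad
    F(U) = \begin{pmatrix} \rho u \\ \rho u^2 + p \\ u\,(E + p) \end{pmatrix},
    \]

where \rho is the density, u the velocity, p the pressure, E the total energy, and S collects the source terms.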

Keywords: discontinuous Galerkin method, Euler system, inviscid two-fluid model, streamline upwind Petrov-Galerkin method, two-phase flow

Procedia PDF Downloads 309
712 Effect of Concentration Level and Moisture Content on the Detection and Quantification of Nickel in Clay Agricultural Soil in Lebanon

Authors: Layan Moussa, Darine Salam, Samir Mustapha

Abstract:

Heavy metal contamination in agricultural soils in Lebanon poses serious environmental and health problems. Intensive efforts are being employed to improve existing methods for quantifying heavy metals in contaminated environments, since conventional detection techniques have been shown to be time-consuming, tedious, and costly. The application of hyperspectral remote sensing in this field is possible and promising. However, the factors impacting the efficiency of hyperspectral imaging in detecting and quantifying heavy metals in agricultural soils have not been thoroughly studied. This study assesses the use of hyperspectral imaging for the detection of Ni in agricultural clay soil collected from the Bekaa Valley, a major agricultural area in Lebanon, under different contamination levels and soil moisture contents. Soil samples were contaminated with Ni at concentrations ranging from 150 mg/kg to 4000 mg/kg. Separately, soil with background contamination was subjected to increased moisture levels varying from 5 to 75%. Hyperspectral imaging was used to detect and quantify Ni contamination in the soil at the different contamination levels and moisture contents. IBM SPSS statistical software was used to develop models that predict the concentration of Ni and the moisture content in agricultural soil. The models were constructed using linear regression algorithms. The spectral curves obtained showed an inverse correlation of reflectance with both Ni concentration and moisture content. The models developed resulted in high predicted R2 values of 0.763 for Ni concentration and 0.854 for moisture content, and indicated that the presence of Ni was best expressed near 2200 nm and that of moisture near 1900 nm. The results from this study allow us to assess the potential of the hyperspectral imaging (HSI) technique as a reliable and cost-effective alternative for heavy metal pollution detection in contaminated soils and for soil moisture prediction.
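
The modelling step reduces to fitting a linear map from reflectance at the diagnostic wavelengths to concentration. The sketch below reproduces the idea with synthetic data (the band, coefficients, and noise level are invented for illustration; the study fit its models in SPSS):

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    reflect_2200 = rng.uniform(0.2, 0.6, 60)                   # reflectance near 2200 nm
    ni = 4000 - 6000 * reflect_2200 + rng.normal(0, 150, 60)   # inverse correlation

    X_train, X_test, y_train, y_test = train_test_split(
        reflect_2200.reshape(-1, 1), ni, random_state=0)
    model = LinearRegression().fit(X_train, y_train)
    print(f"R^2 on held-out samples: {model.score(X_test, y_test):.3f}")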

Keywords: heavy metals, hyperspectral imaging, moisture content, soil contamination

Procedia PDF Downloads 79
711 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing

Authors: Yehjune Heo

Abstract:

As biometric systems become widely deployed, the security of identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the types of loss functions and optimizers used. The types of CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including Cross-Entropy, Center Loss, Cosine Proximity, and Hinge Loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. Using a subset of the LivDet 2017 database, we validate our approach and compare generalization power. It is important to note that the same subset of LivDet is used across all training and testing for each model, so that we can compare the performance, in terms of generalization, on unseen data across all the different models. The best CNN (AlexNet) with the appropriate loss function and optimizer results in a performance gain of more than 3% over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the models' accuracies together with their parameter counts and mean average error rates, in order to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has lower complexity than the other CNN models, it proves to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and should run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model.
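
The core of such a comparison is a grid over loss/optimizer pairs trained on an identical split. A minimal Keras sketch of that loop is shown below (the toy network stands in for AlexNet/VGGNet/ResNet, Center Loss is omitted because it needs a custom layer, and the data tensors are assumed to exist):

    import tensorflow as tf

    losses = ['categorical_crossentropy', 'hinge', 'cosine_similarity']
    optimizers = ['adam', 'sgd', 'rmsprop', 'adadelta', 'adagrad', 'nadam']

    def build_cnn(num_classes=2):
        # Toy stand-in for the real architectures
        return tf.keras.Sequential([
            tf.keras.layers.Conv2D(16, 3, activation='relu', input_shape=(96, 96, 1)),
            tf.keras.layers.MaxPooling2D(),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(num_classes, activation='softmax'),
        ])

    results = {}
    for loss in losses:
        for opt in optimizers:
            model = build_cnn()
            model.compile(optimizer=opt, loss=loss, metrics=['accuracy'])
            # model.fit(x_train, y_train, validation_data=(x_val, y_val))
            results[(loss, opt)] = model   # evaluate each pair on the same split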

Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer

Procedia PDF Downloads 118
710 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models

Authors: Ainouna Bouziane

Abstract:

The ability of electron tomography to recover the 3D structure of catalysts, with spatial resolution at the subnanometer scale, has been widely explored and reviewed in the last decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or on Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and its avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (Compressed Sensing - Total Variation Minimization) algorithms to extract more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is an important issue that has not been properly addressed yet, because a perfectly known reference is needed. The problem becomes particularly complicated in the case of multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction and segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, which considers the influence of relevant experimental parameters such as the range of tilt angles, the image noise level, and the object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.
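
To give a feel for the total-variation criterion at the heart of TVM, the toy sketch below denoises a single 2D slice by gradient descent on a smoothed TV functional (real CS-TVM reconstructs the volume from tilt-series projections and uses more sophisticated solvers, so this shows only the underlying idea):

    import numpy as np

    def tv_denoise(y, lam=0.2, step=0.1, iters=200, eps=1e-6):
        # Gradient descent on 0.5*||x - y||^2 + lam * TV(x), with TV smoothed
        # by eps so its gradient -div(grad x / |grad x|) is defined everywhere.
        x = y.astype(float).copy()
        for _ in range(iters):
            gx = np.diff(x, axis=1, append=x[:, -1:])    # forward differences
            gy = np.diff(x, axis=0, append=x[-1:, :])
            mag = np.sqrt(gx ** 2 + gy ** 2 + eps)
            ux, uy = gx / mag, gy / mag
            div = (ux - np.roll(ux, 1, axis=1)) + (uy - np.roll(uy, 1, axis=0))
            x -= step * ((x - y) - lam * div)
        return x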

Keywords: electron tomography, supported catalysts, nanometrology, error assessment

Procedia PDF Downloads 65
709 Empathy in the Work of Physiotherapists in Slovakia

Authors: Vladimir Littva, Peter Kutis

Abstract:

Based on common practice, we know that an empathic approach to the patient is one of the characteristics of a physiotherapist. Although empathy is regarded as an essential condition of the psychotherapeutic relationship, it has taken quite a while for attention to be paid to it in clinical practice. Patients who experience a sense of understanding from health care providers are more willing to cooperate, and treatment proceeds within a more optimistic and comfortable framework of care. Age, experience, family, education and the working environment may have an impact on the degree of empathy of health care providers. Within the KEGA project no. 003KU-4-2021, we decided to investigate the level of empathy in the work of physiotherapists in Slovakia. Research sample and Methods: The sample comprised 194 respondents, physiotherapists working in Slovakia; 112 were men and 82 were women. Respondents were between 21 and 64 years of age. 133 were married, 51 were single, and 10 were divorced. 98 were living in the countryside and 96 in towns. 22 grew up without siblings, 95 with one sibling and 77 with two or more siblings. In the survey, we used the Empathy Assessment Questionnaire (EAQ), with 18 questions and four possible answers (strongly disagree, disagree, agree, strongly agree), which we validated linguistically and psychometrically. All data were statistically processed with SPSS 25. Results: We evaluated the internal reliability of the EAQ questionnaire using Cronbach's alpha; the coefficient was 0.756 for the whole set, which means the questionnaire is quite a strong and reliable measurement tool. The mean for individual questions ranged from 2.39 to 3.74 (the maximum was 4). Using Pearson's correlations, we confirmed significant differences between the groups regarding sex in 8 questions out of 18, regarding age in 5 questions, regarding family status in 4 questions, and regarding siblings in 4 questions out of 18 at the 5% level (p < 0.05). Conclusion: The results obtained during the research show the importance of adequate communication with the patient with regard to their health and well-being. Empathy in the physiotherapists' profession is very important. It would be worthwhile for physiotherapy students to receive a course during their studies dealing exclusively with empathy, the empathic approach, burnout, and psycho-emotional hygiene.
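
Cronbach's alpha, the reliability coefficient reported above, is straightforward to compute from an item-score matrix. The sketch below shows the standard formula on synthetic data (random scores, so the resulting alpha will be near zero rather than 0.756):

    import numpy as np

    def cronbach_alpha(items):
        # items: respondents x items matrix of scores
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars / total_var)

    rng = np.random.default_rng(1)
    scores = rng.integers(1, 5, size=(194, 18))   # 194 respondents, 18 items, scored 1-4
    print(round(cronbach_alpha(scores), 3))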

Keywords: empathy, approach, clinical practice, physiotherapists

Procedia PDF Downloads 174
708 Pretherapy Initial Dosimetry Results in Prostate Cancer Radionuclide Therapy with Lu-177-PSMA-DOTA-617

Authors: M. Abuqebitah, H. Tanyildizi, N. Yeyin, I. Cavdar, M. Demir, L. Kabasakal

Abstract:

Aim: Targeted radionuclide therapy (TRT) is an increasingly used treatment modality for a wide range of cancers. Presently, dosimetry is highly required, either to plan treatment or to ascertain the absorbed dose delivered to critical organs during treatment. Methods and Materials: The study comprised 7 patients suffering from progressive prostate cancer who were candidates for Lu-177-PSMA-DOTA-617 therapy; PSMA-PET/CT imaging was performed for all patients, and an activity of 5.2±0.3 mCi was intravenously injected. To evaluate the bone marrow absorbed dose, 2 cc blood samples were withdrawn at short, variable times (3, 15, 30, 60, and 180 minutes) after injection. Furthermore, whole body scans were performed using a scintillation gamma camera at 4, 24, 48, and 120 hours after injection, and in order to quantify the activity taken up in the body, kidneys, liver, right parotid, and left parotid, the geometric means of the anterior and posterior counts were determined through ROI analysis; after that, background subtraction and attenuation correction were applied using the patients' PSMA-PET/CT images, taking into consideration organ thickness, body thickness, and Hounsfield units from the CT scan. The OLINDA/EXM dosimetry program was used for curve fitting, residence time calculation, and absorbed dose calculations. Findings: The absorbed doses of the bone marrow, left kidney, right kidney, liver, left parotid, right parotid, and total body were 1.28±0.52, 32.36±16.36, 32.7±13.68, 10.35±3.45, 38.67±21.29, 37.55±19.77, and 2.25±0.95 (mGy/mCi), respectively. Conclusion: Our first results indicate that Lu-177-PSMA-DOTA-617 is a safe and reliable therapy, as no complications were seen. On the other hand, the observable variation in the absorbed doses of the critical organs among the patients necessitates a patient-specific dosimetry approach to protect body organs, particularly the highly exposed kidneys and parotid glands.
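
The planar quantification step described above, the geometric mean of anterior and posterior ROI counts with an attenuation correction, has a standard conjugate-view form. The sketch below illustrates it with invented numbers (a full MIRD-style workflow adds background subtraction, scatter correction, and a calibration factor measured on a phantom):

    import numpy as np

    def organ_activity(counts_ant, counts_post, mu_eff, body_thickness_cm,
                       calibration_cps_per_mbq):
        # Conjugate-view estimate: A = sqrt(I_A * I_P) / (sqrt(T) * C),
        # where T = exp(-mu_eff * body thickness) is the transmission factor.
        transmission = np.exp(-mu_eff * body_thickness_cm)
        geo_mean = np.sqrt(counts_ant * counts_post)
        return geo_mean / (np.sqrt(transmission) * calibration_cps_per_mbq)

    # Kidney ROI example (all values assumed)
    a_mbq = organ_activity(1.2e4, 0.9e4, mu_eff=0.11, body_thickness_cm=22.0,
                           calibration_cps_per_mbq=8.5)
    print(f"Estimated kidney activity: {a_mbq:.1f} MBq")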

Keywords: Lu-177-PSMA, prostate cancer, radionuclide therapy

Procedia PDF Downloads 462
707 Stability Design by Geometrical Nonlinear Analysis Using Equivalent Geometric Imperfections

Authors: S. Fominow, C. Dobert

Abstract:

The present article describes research dealing with the development of equivalent geometric imperfections for the stability design of steel members, considering lateral-torsional buckling. The application of these equivalent imperfections takes into account the stiffness-reducing effects due to inelasticity and residual stresses, which lead to a reduction of the load-carrying capacity of slender members and structures. This allows the application of a simplified design method that is performed in three steps: application of equivalent geometric imperfections, determination of internal forces using geometrically non-linear analysis (GNIA), and verification of the cross-section resistance at the most unfavourable location. All three verification steps are closely related and influence the results. The derivation of the equivalent imperfections was carried out in several steps. First, reference lateral-torsional buckling resistances for various rolled I-sections, slenderness grades, load shapes and steel grades were determined, either with geometrically and materially non-linear analysis with geometrical imperfections and residual stresses (GMNIA) or, for standard cases, based on the equivalent member method. Then, with the aim of obtaining from the design method lateral-torsional buckling resistances identical to the reference resistances, the required sizes of the equivalent imperfections were derived. For this purpose, a program based on the finite element method (FEM) has been developed. Based on these results, several proposals for the specification of equivalent geometric imperfections have been developed, which differ in the shape of the applied equivalent geometric imperfection, the model of the cross-sectional resistance and the steel grade. The proposed design methods allow a wide range of applications and a reliable calculation of lateral-torsional buckling resistances, as comparisons between the calculated resistances and the reference resistances have shown.

Keywords: equivalent geometric imperfections, GMNIA, lateral-torsional buckling, non-linear finite element analysis

Procedia PDF Downloads 142
706 Testifying in Court as a Victim of Crime for Persons with Little or No Functional Speech: Vocabulary Implications

Authors: Robyn White, Juan Bornman, Ensa Johnson

Abstract:

People with disabilities are at a high risk of becoming victims of crime. Individuals with little or no functional speech (LNFS) face an even higher risk. One way of reducing the risk of remaining a victim of crime is to face the alleged perpetrator in court as a witness; it is therefore important for a person with LNFS who has been a victim of crime to have the vocabulary required to testify in court. The aim of this study was to identify and describe the core and fringe legal vocabulary required by illiterate victims of crime who have little or no functional speech to testify in court as witnesses. A mixed-method exploratory sequential design consisting of two distinct phases was used to address the aim of the research. The first phase was of a qualitative nature and included two different data sources, namely in-depth semi-structured interviews and focus group discussions. The overall aim of this phase was to identify and describe the core and fringe legal vocabulary and to develop a measurement instrument based on these results. The results from Phase 1 were used in Phase 2, the quantitative phase, during which the measurement instrument (a custom-designed questionnaire) was socially validated. The results produced six distinct vocabulary categories that represent the legal core vocabulary and 99 words that represent the legal fringe vocabulary. The findings suggest that communication boards should be individualised to the person and the specific crime. It is believed that the vocabulary lists developed in this study act as a valid and reliable springboard from which communication boards can be developed. Recommendations were therefore made to develop an Augmentative and Alternative Communication Resource Tool Kit to assist the legal justice system.

Keywords: augmentative and alternative communication, person with little or no functional speech, sexual crimes, testifying in court, victim of crime, witness competency

Procedia PDF Downloads 460
705 In situ Real-Time Multivariate Analysis of Methanolysis Monitoring of Sunflower Oil Using FTIR

Authors: Pascal Mwenge, Tumisang Seodigeng

Abstract:

The combination of world population growth and the third industrial revolution has led to high demand for fuels. At the same time, the depletion of global fossil fuel deposits and the air pollution caused by these fuels have compounded the challenges the world faces in meeting its need for energy. Therefore, new forms of environmentally friendly and renewable fuels such as biodiesel are needed. The primary analytical techniques for monitoring methanolysis yield have been chromatography and spectroscopy; these methods have proven reliable but are demanding, costly, and do not provide real-time monitoring. In this work, in situ monitoring of biodiesel production from sunflower oil using FTIR (Fourier Transform Infrared) spectroscopy was studied; the study was performed using an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. Quantitative monitoring of methanolysis was performed by building a quantitative model with multivariate calibration using the iC Quant module of the iC IR 7.0 software. Fifteen samples of known concentration, taken in duplicate, were used for model calibration and cross-validation. The data were pre-processed using mean centering, variance scaling, a square-root spectral transform, and solvent subtraction. These pre-processing steps improved the performance indices from 7.98 to 0.0096 (RMSEC), 11.2 to 3.41 (RMSECV), 6.32 to 2.72 (RMSEP), and 0.9416 to 0.9999 (cumulative R²). R² values of 1 (training), 0.9918 (test), and 0.9946 (cross-validation) indicated the goodness of fit of the model. The model was tested against a univariate model; small discrepancies were observed at low concentrations due to unmodelled intermediates, while predictions agreed closely at concentrations above 18%. The software eliminated the complexity of Partial Least Squares (PLS) chemometrics. It was concluded that the model obtained could be used to monitor the methanolysis of sunflower oil at industrial and laboratory scale.
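
As an illustration of the multivariate calibration step, the following is a minimal sketch of a PLS calibration with cross-validation; the spectra and concentrations are synthetic placeholders, and the scikit-learn PLSRegression model is a stand-in for the iC Quant module used in the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic stand-ins: 30 spectra of 200 wavenumbers each, built with a
# linear relation to concentration plus noise so the regression has signal
rng = np.random.default_rng(0)
y = np.linspace(0.0, 100.0, 30)                       # known concentrations (%)
X = np.outer(y, rng.normal(size=200)) + rng.normal(scale=5.0, size=(30, 200))

# PLS model; scale=True applies mean centering and variance scaling,
# analogous to the pre-processing used in the study
pls = PLSRegression(n_components=5, scale=True)
pls.fit(X, y)

# Calibration and cross-validation errors (RMSEC / RMSECV)
y_cal = pls.predict(X).ravel()
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
print(f"RMSEC  = {np.sqrt(np.mean((y - y_cal) ** 2)):.3f}")
print(f"RMSECV = {np.sqrt(np.mean((y - y_cv) ** 2)):.3f}")
```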

Keywords: biodiesel, calibration, chemometrics, methanolysis, multivariate analysis, transesterification, FTIR

Procedia PDF Downloads 132
704 Analysis of Rural Roads in Developing Countries Using Principal Component Analysis and Simple Average Technique in the Development of a Road Safety Performance Index

Authors: Muhammad Tufail, Jawad Hussain, Hammad Hussain, Imran Hafeez, Naveed Ahmad

Abstract:

A road safety performance index is a composite index which combines various indicators of road safety into a single number. Developing such an index from appropriate safety performance indicators is essential for enhancing road safety. However, road safety performance indices in developing countries have not been given the priority they need. The primary objective of this research is to develop a general Road Safety Performance Index (RSPI) for developing countries based on road facilities as well as road-user behavior. The secondary objectives include identifying the critical inputs to the RSPI and determining the better method of constructing the index. In this study, the RSPI is developed from four main safety performance indicators: protective systems (seat belts, helmets, etc.), road characteristics (road width, signalized intersections, number of lanes, speed limit), number of pedestrians, and number of vehicles. Data on these four safety performance indicators were collected through an observation survey on a 20 km section of National Highway N-125 near Taxila, Pakistan. For the development of this composite index, two methods were used: (a) Principal Component Analysis (PCA) and (b) the Equal Weighting (EW) method. PCA was used for the extraction, weighting, and linear aggregation of indicators to obtain a single value; an individual index score was calculated for each road section by multiplying the weights by the standardized values of each safety performance indicator. In the EW method, a simple average was used for the weighting and linear aggregation of the indicators. The road sections were ranked according to their RSPI scores under both methods. Comparing the two weighting methods, PCA was found to be much more reliable than the simple average technique.
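
The following is a minimal sketch, under stated assumptions, of one common way to build both variants of such a composite index: PCA-derived weights taken from first-component loadings versus equal weighting. The indicator values are purely illustrative, and the study's exact weighting scheme may differ.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical indicator matrix: rows are road sections, columns are the four
# safety performance indicators (protective system use, road features,
# pedestrian counts, vehicle counts); values are illustrative only
X = np.array([
    [0.6, 7.3, 120,  950],
    [0.4, 6.1, 300, 1400],
    [0.8, 7.9,  80,  700],
    [0.5, 6.8, 210, 1100],
])

Z = StandardScaler().fit_transform(X)    # standardize each indicator

# PCA weighting: one common construction takes the absolute loadings of the
# first component, scaled by its share of explained variance, as weights
pca = PCA().fit(Z)
w_pca = np.abs(pca.components_[0]) * pca.explained_variance_ratio_[0]
w_pca /= w_pca.sum()
rspi_pca = Z @ w_pca

# Equal weighting: simple average of the standardized indicators
rspi_ew = Z.mean(axis=1)

# Rank sections under both schemes (sign conventions depend on whether each
# indicator measures safety or risk; here higher score simply ranks first)
print("PCA ranking of sections:", np.argsort(-rspi_pca) + 1)
print("EW  ranking of sections:", np.argsort(-rspi_ew) + 1)
```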

Keywords: indicators, aggregation, principal component analysis, weighting, index score

Procedia PDF Downloads 136
703 21st Century Biotechnological Research and Development Advancements for Industrial Development in India

Authors: Monisha Isaac

Abstract:

Biotechnology is a discipline concerned with the use of living organisms and biological systems to create products; it can also be defined as technology developed to harness biological systems and organismal processes for a specific use. In particular, it includes the use of cells and their components for new technologies and inventions. The tools developed can be applied in diverse fields such as agriculture, industry, research, and healthcare. The 21st century has seen drastic development and advancement of biotechnology in India, with a significant increase in the Government of India's outlays for biotechnology observed over the past decade. A sectoral breakdown of biotechnology-based companies in India shows that most are agriculture-based, with interests ranging from tissue culture to biopesticides. Companies have also given major attention to health-related activities and environmental biotechnology. The biopharmaceutical segment, which comprises vaccines, diagnostics, and recombinant products, is the most reliable and largest segment of the Indian biotech industry; India has developed its vaccine markets and supplies vaccines to various countries. Next are the bio-services, which mainly comprise contract research and manufacturing services. India has also made noticeable progress in bio-industries, including the manufacture of enzymes, biofuels, and biopolymers. Biotechnology plays a crucial and significant role in agriculture as well: traditional methods have been replaced by new technologies that mainly focus on GM crops, marker-assisted technologies, and the use of biotechnological tools to improve the quality of fertilizers and soil. This segment may be only a small contributor so far, but it has shown huge potential for growth. Finally, bioinformatics provides computational methods to store, manage, and arrange the extensive data gathered through experimental trials and to design tools for their interpretation, making it important in drug design.

Keywords: biotechnology, advancement, agriculture, bio-services, bio-industries, bio-pharmaceuticals

Procedia PDF Downloads 216
702 A Segmentation Method for Grayscale Images Based on the Firefly Algorithm and the Gaussian Mixture Model

Authors: Donatella Giuliani

Abstract:

In this research, we propose an unsupervised grayscale image segmentation method based on a combination of the Firefly Algorithm and the Gaussian Mixture Model. First, the Firefly Algorithm is applied in a histogram-based search for cluster means. The Firefly Algorithm is a stochastic global optimization technique inspired by the flashing behaviour of fireflies; in this context, it is used to determine the number of clusters and the corresponding cluster means in a histogram-based segmentation approach. These means are then used in the initialization step of the parameter estimation for a Gaussian Mixture Model. The parametric probability density function of a Gaussian Mixture Model is represented as a weighted sum of Gaussian component densities, whose parameters are evaluated by applying the iterative Expectation-Maximization technique. The coefficients of the linear superposition of Gaussians can be thought of as the prior probabilities of each component. Applying Bayes' rule, the posterior probabilities of the grayscale intensities are evaluated, and their maxima are used to assign each pixel to a cluster according to its gray-level value. The proposed approach appears solid and reliable even when applied to complex grayscale images. Validation was performed using several standard measures, namely the Root Mean Square Error (RMSE), the Structural Content (SC), the Normalized Correlation Coefficient (NK), and the Davies-Bouldin (DB) index. The results strongly confirm the robustness of this grayscale segmentation method based on a metaheuristic algorithm. Another noteworthy advantage of the methodology is the use of the maxima of the responsibilities for pixel assignment, which implies a considerable reduction in computational cost.
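
As a minimal sketch of the mixture-model stage, the following fits a Gaussian Mixture Model to pixel intensities with externally supplied initial means and assigns pixels by maximum posterior responsibility. A simple quantile estimate stands in for the paper's firefly-based histogram search, and the image data are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical bimodal grayscale image (stand-in for a real image array)
rng = np.random.default_rng(0)
image = np.clip(np.concatenate([rng.normal(60, 10, 2000),
                                rng.normal(170, 15, 2000)]),
                0, 255).reshape(50, 80)
pixels = image.reshape(-1, 1)

# Stand-in for the firefly histogram search: rough initial cluster means
# from intensity quantiles (the paper optimizes these with the Firefly
# Algorithm over the gray-level histogram)
k = 2
init_means = np.quantile(pixels, [0.25, 0.75]).reshape(-1, 1)

# EM fit of the Gaussian Mixture Model, initialized with those means
gmm = GaussianMixture(n_components=k, means_init=init_means, random_state=0)
gmm.fit(pixels)

# Bayes rule: predict() assigns each pixel to the component maximizing its
# posterior responsibility; labels are folded back to the image shape
segmented = gmm.predict(pixels).reshape(image.shape)
```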

Keywords: clustering images, firefly algorithm, Gaussian mixture model, metaheuristic algorithm, image segmentation

Procedia PDF Downloads 202