Search results for: Automated Rack Supported Warehouse
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2947

2467 Transforming Breast Density Measurement with Artificial Intelligence: Population-Level Insights from BreastScreen NSW

Authors: Douglas Dunn, Richard Walton, Matthew Warner-Smith, Chirag Mistry, Kan Ren, David Roder

Abstract:

Introduction: Breast density is a risk factor for breast cancer, both because increased fibroglandular tissue can harbor malignancy and because dense tissue can mask lesions on mammography. Evaluation of breast density is therefore useful for risk stratification at both the individual and population level. This study investigates the performance of Lunit INSIGHT MMG for automated breast density measurement. We analyze the reliability of Lunit compared to breast radiologists, explore density variations across the BreastScreen NSW population, and examine the impact of breast implants on density measurements. Methods: 15,518 mammograms were utilized for a comparative analysis of intra- and inter-reader reliability between Lunit INSIGHT MMG and breast radiologists. Subsequently, Lunit was used to evaluate 624,113 mammograms to investigate density variations according to age and birth country, providing insights into diverse population subgroups. Finally, we compared breast density in 4,047 clients with implants to clients without implants, controlling for age and birth country. Results: The inter-reader weighted kappa coefficient between Lunit and the breast radiologists was 0.72 (95% CI 0.71-0.73). The highest breast densities were seen in women with a North-East Asian background, whilst those of Aboriginal background had the lowest density. Across all backgrounds, density was demonstrated to reduce with age, though at different rates according to country of birth. Clients with implants had higher density relative to the age-matched no-implant strata. Conclusion: Lunit INSIGHT MMG demonstrates reasonable inter- and intra-observer reliability for automated breast density measurement. Because AI makes it possible to process large volumes of data, the scale of this study is significantly larger than that of any previous study assessing breast density, and it therefore provides valuable insights into population-level density variations. Our findings highlight the influence of age, birth country, and breast implants on density, emphasizing the need for personalized risk assessment and screening approaches. The large-scale and diverse nature of this study enhances the generalisability of our results, offering valuable information for breast cancer screening programs internationally.
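The agreement statistic reported above is a weighted kappa. As a minimal illustration of how such a coefficient is computed, the sketch below applies quadratic weights to a small, entirely hypothetical reader-versus-AI confusion matrix (not BreastScreen NSW data):

```python
# Quadratically weighted Cohen's kappa for ordinal density categories.
# The confusion matrix below is illustrative only, not study data.

def weighted_kappa(confusion, weights="quadratic"):
    n_cat = len(confusion)
    total = sum(sum(row) for row in confusion)
    row_marg = [sum(row) for row in confusion]
    col_marg = [sum(confusion[i][j] for i in range(n_cat)) for j in range(n_cat)]

    def w(i, j):  # disagreement weight between categories i and j
        d = abs(i - j)
        return (d / (n_cat - 1)) ** (2 if weights == "quadratic" else 1)

    observed = sum(w(i, j) * confusion[i][j]
                   for i in range(n_cat) for j in range(n_cat)) / total
    expected = sum(w(i, j) * row_marg[i] * col_marg[j]
                   for i in range(n_cat) for j in range(n_cat)) / total ** 2
    return 1 - observed / expected

# Hypothetical reader-vs-AI counts over three density categories
cm = [[40, 8, 2],
      [10, 50, 10],
      [3, 9, 38]]
print(round(weighted_kappa(cm), 3))  # → 0.719
```

With these made-up counts the coefficient happens to land near the 0.72 reported above; the weighting penalizes disagreements more heavily the further apart the ordinal categories are.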

Keywords: breast cancer, screening, breast density, artificial intelligence, mammography

Procedia PDF Downloads 4
2466 Reducing the Frequency of Flooding Accompanied by Low pH Wastewater in the 100/200 Unit of Phosphate Fertilizer 1 Plant by Implementing the 3R Program (Reduce, Reuse and Recycle)

Authors: Pradipta Risang Ratna Sambawa, Driya Herseta, Mahendra Fajri Nugraha

Abstract:

In 2020, PT Petrokimia Gresik implemented a program to increase the ROP (Run of Pile) production rate at the Phosphate Fertilizer 1 plant, which increased scrubbing water consumption in the 100/200 unit area. The higher water consumption leads to a larger discharge of wastewater, which in turn can cause local flooding, especially during the rainy season. The 100/200 area of the Phosphate Fertilizer 1 plant is close to the warehouse and is a frequent passing area for trucks transporting raw materials. The wastewater there becomes acidic (at the worst point, down to pH 1). PT Petrokimia Gresik addressed the flooding and the exposure to acidic wastewater in the 100/200 area through a set of wastewater optimization steps called the 3R program (Reduce, Reuse, and Recycle). The program consists of reducing scrubbing water consumption by tuning the liquid/gas ratio in the scrubbing unit of the 100/200 Phosphate Fertilizer 1 plant; creating a wastewater interconnection line so that wastewater from the 100/200 unit can be reused as scrubbing water in the Phonska 1, Phonska 2, and Phonska 3 plants and unit 300 of the Phosphate Fertilizer 1 plant; and increasing scrubbing effectiveness through scrubbing effectiveness simulations. Through this series of wastewater optimization programs, PT Petrokimia Gresik has succeeded in reducing NaOH consumption for neutralization by up to 2,880 kg/day, equivalent to savings of up to 314,359.76 dollars/year, and reducing process water consumption by up to 600 m³/day, equivalent to savings of up to 63,739.62 dollars/year.

Keywords: fertilizer, phosphate fertilizer, wastewater, wastewater treatment, water management

Procedia PDF Downloads 26
2465 Money Laundering Risk Assessment in the Banking Institutions: An Experimental Approach

Authors: Yusarina Mat-Isa, Zuraidah Mohd-Sanusi, Mohd-Nizal Haniff, Paul A. Barnes

Abstract:

In view that money laundering has become eminent for banking institutions, they are obliged to adopt a risk-based approach as an integral component of accepted anti-money laundering policies. In doing so, those involved with banking operations are the most critical group of personnel, as they deal with the day-to-day operations of the banking institutions and are obligated to form a judgement on the level of impending risk. This requirement extends to all relevant banking institution staff, such as tellers and customer account representatives, who must identify suspicious customers and escalate them to the relevant authorities. Banking institution staff, however, face enormous challenges in identifying and distinguishing money launderers from other legitimate customers seeking genuine banking transactions. They are mostly educated and trained with the business objective of serving the customer in mind, not to be “detectives with a detective’s power of observation”. Despite increasing awareness and training conducted for banking institution staff, their competency in assessing money laundering risk is still insufficient. Several gaps prompted this study, including the lack of behavioural perspectives in the assessment of money laundering risk in banking institutions. Utilizing an experimental approach, respondents were randomly assigned within a controlled setting to manipulated situations, upon which their judgement was solicited based on various observations related to the situations. The study suggests that it is imperative that informed judgement is exercised in deciding whether to proceed with the banking services required by a customer. Judgement forms the basis of the opinion by which banking institution staff decide whether a customer poses a money laundering risk. Failure to exercise good judgement could result in losses and the absorption of unnecessary risk into the banking institution. Although banking institutions have a choice of automated solutions for assessing money laundering risk, the human factor in assessing the risk is indispensable. Individual staff in the banking institutions are the first line of defence, responsible for screening the impending risk of any customer soliciting banking services. At the end of the spectrum, automated solutions are not a substitute for individual involvement in money laundering risk assessment, as human judgement is inimitable.

Keywords: banking institutions, experimental approach, money laundering, risk assessment

Procedia PDF Downloads 267
2464 Designing Electronic Kanban in Assembly Line Tailboom at XYZ Corp to Reduce Lead Time

Authors: Nadhifah A. Nugraha, Dida D. Damayanti, Widia Juliani

Abstract:

Airplane manufacturing is growing along with increasing consumer demand. The helicopter tail, called the Tailboom, is a product of the helicopter division at XYZ Corp, where the Tailboom assembly line is a pull system. Based on observations of existing conditions at XYZ Corp, production is still unable to meet consumer demand; lead times are greater than the plan agreed upon with the consumers. In the assembly process, each work station experiences shortages of the parts and components needed for assembly. This happens because the required part information arrives late and there is no warning about the availability of needed parts, so some parts are unavailable in the assembly warehouse. The lack of parts and components from the previous work station causes the assembly process to stop, and the assembly line also stops at the next station. Consequently, production finishes late and off schedule. Resolving these problems requires a controlling process for the assembly line that delivers all components and subassemblies in the right amount and at the right time. This study applies one of the Just-in-Time tools, Kanban, with automation added so that the communication line becomes an efficient and effective electronic Kanban. The problem can be solved by reducing non-value-added time, such as waiting time and idle time. The proposed control of the Tailboom assembly line results in a smooth assembly line without waiting, reduced lead time, and production times that meet the schedule agreed with the consumers.

Keywords: kanban, e-Kanban, lead time, pull system

Procedia PDF Downloads 114
2463 Temporal Focus Scale: Examination of the Reliability and Validity in Japanese Adolescents and Young Adults

Authors: Yuta Chishima, Tatsuya Murakami, Michael McKay

Abstract:

Temporal focus is described as one component of an individual’s time perspective and defined as the attention individuals devote to thinking about the past, present, and future. It affects how people incorporate perceptions about past experiences, current situations, and future expectations into their attitudes, cognitions, and behavior. The 12-item Temporal Focus Scale (TFS) comprises three factors (past, current, and future focus). The purpose of this study was to examine the reliability and validity of TFS scores in Japanese adolescents and young adults. The TFS was translated into Japanese by a professional translator, and the original author confirmed the back-translated items. Study 1 involved 979 Japanese university students aged 18-25 years in a questionnaire-based study. The hypothesized three-factor structure was confirmed, although there were problems with item 10. Internal consistency estimates for scores without item 10 were over .70, and test-retest reliability was also adequate. To verify concurrent and convergent validity, we tested the relationship between TFS scores and life satisfaction, time perspective, self-esteem, and career efficacy. Results of correlational analyses supported our hypotheses. Specifically, future focus was strongly correlated with career efficacy, while past and current focus were not. Study 2 involved 1030 Japanese junior and senior high school students aged 12-18 years in a questionnaire-based study, and results of multigroup analyses supported the age invariance of the TFS.
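The internal consistency estimates above refer to coefficients such as Cronbach's alpha. A minimal sketch of that computation, using made-up item scores rather than the study's data:

```python
# Cronbach's alpha for one subscale: alpha = k/(k-1) * (1 - sum(item
# variances) / variance(total scores)). Item scores are hypothetical.

def cronbach_alpha(items):
    """items: one inner list per scale item, aligned across respondents."""
    k = len(items)
    n = len(items[0])

    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Four hypothetical items from one TFS subscale, five respondents
items = [[4, 5, 3, 4, 2],
         [4, 4, 3, 5, 2],
         [5, 5, 2, 4, 3],
         [3, 4, 3, 4, 2]]
print(round(cronbach_alpha(items), 2))  # → 0.91
```

Values above .70, as reported in the abstract, are conventionally taken as acceptable internal consistency.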

Keywords: Japanese, reliability, scale, temporal focus, validity

Procedia PDF Downloads 355
2462 Automated Localization of Palpebral Conjunctiva and Hemoglobin Determination Using Smart Phone Camera

Authors: Faraz Tahir, M. Usman Akram, Albab Ahmad Khan, Mujahid Abbass, Ahmad Tariq, Nuzhat Qaiser

Abstract:

The objective of this study was to evaluate the degree of anemia from a picture of the palpebral conjunctiva taken with a smartphone camera. We first localize the region of interest in the image, then extract certain features from that region of interest and train an SVM classifier on those features; as a result, our system classifies images in real time by hemoglobin level. The proposed system achieved an accuracy of 70%. The classifier was trained on a locally gathered dataset of 30 patients.

Keywords: anemia, palpebral conjunctiva, SVM, smartphone

Procedia PDF Downloads 506
2461 Cost Benefit Analysis: Evaluation among the Millimetre Wavebands and SHF Bands of Small Cell 5G Networks

Authors: Emanuel Teixeira, Anderson Ramos, Marisa Lourenço, Fernando J. Velez, Jon M. Peha

Abstract:

This article discusses cost-benefit aspects of the millimetre wavebands (mmWaves) and the Super High Frequency (SHF) band. The decay of the carrier-to-noise-plus-interference ratio with coverage distance is assessed by considering two different path loss models: the two-slope urban micro Line-of-Sight (UMi-LoS) model for the SHF band and the modified Friis propagation model for frequencies above 24 GHz. The equivalent supported throughput is estimated at the 5.62, 28, 38, 60 and 73 GHz frequency bands, and the influence of the carrier-to-noise-plus-interference ratio on the radio and network optimization process is explored. Mostly owing to the attenuation behaviour of the two-slope propagation model in the SHF band, the supported throughput at this band is higher than at the millimetre wavebands only for the longest cell lengths. The cost-benefit analysis of these pico-cellular networks was performed for regular cellular topologies, considering the unlicensed spectrum. For the shortest distances, an optimum of the revenue in percentage terms occurs at cell lengths around R ≈ 10 m for the millimetre wavebands, while for the longest distances an optimum occurs at R ≈ 550 m for the 5.62 GHz band. For the 5.62 GHz band, the profit is slightly lower than for the millimetre wavebands at the shortest values of R, and starts to increase for cell lengths approximately equal to the ratio between the break-point distance and the co-channel reuse factor, achieving a maximum at values of R approximately equal to 550 m.
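The modified Friis model used in the paper is not spelled out in the abstract; as a baseline for the frequency comparison, the standard free-space Friis path loss at the five carrier frequencies can be sketched as follows (the 10 m pico-cell distance and the plain free-space form are illustrative assumptions, not the paper's model):

```python
import math

def friis_fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB (standard Friis form).

    The paper uses a *modified* Friis model above 24 GHz; this sketch
    shows only the baseline free-space term for comparison.
    """
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Loss over a 10 m pico-cell at the bands mentioned in the abstract
for f_ghz in (5.62, 28, 38, 60, 73):
    print(f"{f_ghz:>5} GHz: {friis_fspl_db(10, f_ghz * 1e9):.1f} dB")
```

Each doubling of distance or frequency adds about 6 dB, which is why the mmWave bands lose their throughput advantage at the longest cell lengths discussed above.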

Keywords: millimetre wavebands, SHF band, SINR, cost benefit analysis, 5G

Procedia PDF Downloads 141
2460 Causes of Blindness and Low Vision among Visually Impaired Population Supported by Welfare Organization in Ardabil Province in Iran

Authors: Mohammad Maeiyat, Ali Maeiyat Ivatlou, Rasul Fani Khiavi, Abouzar Maeiyat Ivatlou, Parya Maeiyat

Abstract:

Purpose: Considering that visual impairment is still one of the country's health problems, this study was conducted to determine the causes of blindness and low vision among the visually impaired members of the Ardabil Province welfare organization. Methods: This descriptive, national-census-based study was carried out in the visually impaired population supported by the welfare organization in all urban and rural areas of Ardabil Province in 2013; sample collection lasted 7 months. The subjects were inspected by an optometrist to determine their visual status (blindness or low vision) and then referred to an ophthalmologist to identify the main causes of visual impairment based on the International Classification of Diseases, version 10. Statistical analysis of the collected data was performed using SPSS software, version 18. Results: Overall, 403 subjects with a mean age of years participated in this study. 73.2% were blind and 26.8% had low vision; grouped by gender, 60.50% were male and 39.50% female. Subjects were divided into three age groups: under 15 (11.2%), 15 to 49 (76.7%), and 50 and over (12.1%). The age range was 1 to 78 years. The causes of blindness and low vision were, in descending order: optic atrophy (18.4%), retinitis pigmentosa (16.8%), corneal diseases (12.4%), chorioretinal diseases (9.4%), cataract (8.9%), glaucoma (8.2%), phthisis bulbi (7.2%), degenerative myopia (6.9%), microphthalmos (4%), amblyopia (3.2%), albinism (2.5%) and nystagmus (2%). Conclusion: In this study, the main causes of visual impairment were optic atrophy and retinitis pigmentosa; thus specific prevention plans can be effective in reducing the incidence of visual disabilities.

Keywords: blindness, low vision, welfare, Ardabil

Procedia PDF Downloads 440
2459 Treating Voxels as Words: Word-to-Vector Methods for fMRI Meta-Analyses

Authors: Matthew Baucum

Abstract:

With the increasing popularity of fMRI as an experimental method, psychology and neuroscience can greatly benefit from advanced techniques for summarizing and synthesizing large amounts of data from brain imaging studies. One promising avenue is automated meta-analyses, in which natural language processing methods are used to identify the brain regions consistently associated with certain semantic concepts (e.g., “social”, “reward”) across large corpora of studies. This study builds on this approach by demonstrating how, in fMRI meta-analyses, individual voxels can be treated as vectors in a semantic space and evaluated for their “proximity” to terms of interest. In this technique, a low-dimensional semantic space is built from brain imaging study texts, allowing words in each text to be represented as vectors (where words that frequently appear together are near each other in the semantic space). Consequently, each voxel in a brain mask can be represented as a normalized vector sum of all of the words in the studies that showed activation in that voxel. The entire brain mask can then be visualized in terms of each voxel’s proximity to a given term of interest (e.g., “vision”, “decision making”) or collection of terms (e.g., “theory of mind”, “social”, “agent”), as measured by the cosine similarity between the voxel’s vector and the term vector (or the average of multiple term vectors). Analysis can also proceed in the opposite direction, allowing word cloud visualizations of the nearest semantic neighbors for a given brain region. This approach allows for continuous, fine-grained metrics of voxel-term associations, and relies on state-of-the-art “open vocabulary” methods that go beyond mere word-counts.
An analysis of over 11,000 neuroimaging studies from an existing meta-analytic fMRI database demonstrates that this technique can be used to recover known neural bases for multiple psychological functions, suggesting this method’s utility for efficient, high-level meta-analyses of localized brain function. While automated text analytic methods are no replacement for deliberate, manual meta-analyses, they seem to show promise for the efficient aggregation of large bodies of scientific knowledge, at least on a relatively general level.
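The voxel-to-term scoring described above reduces to a normalized vector sum followed by a cosine similarity. A minimal sketch with tiny, hypothetical 3-dimensional word vectors (real embeddings would have far more dimensions, and the word labels are illustrative):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def voxel_vector(study_word_vectors):
    """Normalized vector sum of the word vectors from all studies
    reporting activation in this voxel, as described in the abstract."""
    dim = len(study_word_vectors[0])
    total = [sum(vec[d] for vec in study_word_vectors) for d in range(dim)]
    norm = math.sqrt(sum(x * x for x in total))
    return [x / norm for x in total]

# Hypothetical 3-D embeddings for words appearing in two studies
study_words = [[0.9, 0.1, 0.0],   # e.g. "reward"
               [0.8, 0.3, 0.1]]   # e.g. "decision"
term = [1.0, 0.2, 0.0]            # query term vector, e.g. "reward"

vox = voxel_vector(study_words)
print(round(cosine(vox, term), 3))
```

Repeating this for every voxel in the brain mask yields the continuous voxel-term proximity map the abstract describes; averaging several term vectors before the cosine step gives the "collection of terms" variant.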

Keywords: FMRI, machine learning, meta-analysis, text analysis

Procedia PDF Downloads 449
2458 Zeolite Supported Iron-Sensitized TiO₂ for Tetracycline Photocatalytic Degradation under Visible Light: A Comparison between Doping and Ion Exchange

Authors: Ghadeer Jalloul, Nour Hijazi, Cassia Boyadjian, Hussein Awala, Mohammad N. Ahmad, Ahmad Albadarin

Abstract:

In this study, we applied Fe-sensitized TiO₂ supported over embryonic Beta (BEA) zeolite for the photocatalytic degradation of the antibiotic tetracycline (TC) under visible light. Four different samples with 20, 40, 60, and 100% w/w TiO₂/BEA ratios were prepared. The immobilization of sol-gel TiO₂ (33 m²/g) over BEA (390 m²/g) increased its surface area to 227 m²/g and enhanced its adsorption capacity from 8% to 19%. To extend the activity of the TiO₂ photocatalyst towards the visible light region (λ > 380 nm), we explored two different metal sensitization techniques with iron ions (Fe³⁺). In the ion-exchange method, the substitutional cations in the zeolite of TiO₂/BEA were exchanged with Fe³⁺ in an aqueous solution of FeCl₃. In the doping technique, sol-gel TiO₂ was doped with Fe³⁺ from an FeCl₃ precursor during its synthesis, before its immobilization over BEA. The Fe-TiO₂/BEA catalysts were characterized using SEM, XRD, BET, UV-Vis DRS, and FTIR. After testing the performance of the various ion-exchanged catalysts under blue and white light, only Fe-TiO₂/BEA 60% showed better activity than pure TiO₂ under white light, with 100 ppm initial catalyst concentration and 20 ppm TC concentration. Compared to ion-exchanged Fe-TiO₂/BEA, doped Fe-TiO₂/BEA yielded higher photocatalytic efficiencies under blue and white light. The 3% Fe-doped TiO₂/BEA removed 92% of TC, compared to 54% by TiO₂, under white light. The catalysts were also tested under real solar irradiation. This improvement in the photocatalytic performance of TiO₂ is due to the higher adsorption capacity provided by the BEA support, combined with the presence of iron ions that enhance visible light absorption and minimize recombination of the charge carriers.

Keywords: tetracycline, photocatalytic degradation, immobilized TiO₂, zeolite, iron-doped TiO₂, ion-exchange

Procedia PDF Downloads 108
2457 Diagnostic Performance of Mean Platelet Volume in the Diagnosis of Acute Myocardial Infarction: A Meta-Analysis

Authors: Kathrina Aseanne Acapulco-Gomez, Shayne Julieane Morales, Tzar Francis Verame

Abstract:

Mean platelet volume (MPV) is the most accurate measure of the size of platelets and is routinely measured by most automated hematological analyzers. Several studies have shown associations between MPV and cardiovascular risks and outcomes. Although its measurement may provide useful data, MPV remains a diagnostic tool yet to be included in routine clinical decision making. The aim of this systematic review and meta-analysis is to determine summary estimates of the diagnostic accuracy of mean platelet volume for the diagnosis of myocardial infarction among adult patients with angina and/or its equivalents, in terms of sensitivity, specificity, diagnostic odds ratio, and likelihood ratios, and to determine the difference in mean MPV values between those with MI and non-MI controls. The primary search was done through the electronic databases PubMed, Cochrane Review CENTRAL, HERDIN (Health Research and Development Information Network), Google Scholar, the Philippine Journal of Pathology, and the Philippine College of Physicians Philippine Journal of Internal Medicine. The reference lists of original reports were also searched. Cross-sectional, cohort, and case-control articles studying the diagnostic performance of mean platelet volume in the diagnosis of acute myocardial infarction in adult patients were included. Studies were included if: (1) CBC was taken upon presentation to the ER or upon admission (within 24 hours of symptom onset); (2) myocardial infarction was diagnosed with serum markers, ECG, or according to guidelines accepted by the cardiology societies (American Heart Association (AHA), American College of Cardiology (ACC), European Society of Cardiology (ESC)); and (3) outcomes were measured as a significant difference and/or sensitivity and specificity. The authors independently screened all identified potential studies for inclusion. Eligible studies were appraised using well-defined criteria, and any disagreement between the reviewers was resolved through discussion and consensus. The overall mean MPV value of those with MI (9.702 fl; 95% CI 9.07 – 10.33) was higher than that of the non-MI control group (8.85 fl; 95% CI 8.23 – 9.46). Interpretation of the calculated t-value of 2.0827 showed a significant difference in the mean MPV values of those with MI and the non-MI controls. The summary sensitivity (Se) and specificity (Sp) for MPV were 0.66 (95% CI 0.59 – 0.73) and 0.60 (95% CI 0.43 – 0.75), respectively. The pooled diagnostic odds ratio (DOR) was 2.92 (95% CI 1.90 – 4.50). The positive likelihood ratio of MPV in the diagnosis of myocardial infarction was 1.65 (95% CI 1.20 – 22.27), and the negative likelihood ratio was 0.56 (95% CI 0.50 – 0.64). The intended role for MPV in the diagnostic pathway of myocardial infarction would perhaps be best as a triage tool. With a DOR of 2.92, MPV values can discriminate between those who have MI and those without. For a patient with angina presenting with elevated MPV values, it is 1.65 times more likely that he has MI. Thus, the decision to treat a patient with angina or its equivalents as a case of MI could be supported by an elevated MPV value.
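The likelihood ratios and diagnostic odds ratio reported above follow from the pooled sensitivity and specificity by standard formulas; a quick check using the abstract's Se = 0.66 and Sp = 0.60:

```python
def likelihood_ratios(sens, spec):
    """Positive/negative likelihood ratios and diagnostic odds ratio
    derived from sensitivity and specificity."""
    lr_pos = sens / (1 - spec)          # P(test+ | MI) / P(test+ | no MI)
    lr_neg = (1 - sens) / spec          # P(test- | MI) / P(test- | no MI)
    dor = lr_pos / lr_neg               # diagnostic odds ratio
    return lr_pos, lr_neg, dor

# Pooled estimates reported in the abstract: Se = 0.66, Sp = 0.60
lr_pos, lr_neg, dor = likelihood_ratios(0.66, 0.60)
print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}, DOR = {dor:.2f}")
```

This reproduces LR+ = 1.65 exactly; LR− and DOR come out at 0.57 and 2.91, close to the published 0.56 and 2.92, which were pooled across studies rather than recomputed from the summary Se/Sp.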

Keywords: mean platelet volume, MPV, myocardial infarction, angina, chest pain

Procedia PDF Downloads 87
2456 Optimum Design of Dual-Purpose Outriggers in Tall Buildings

Authors: Jiwon Park, Jihae Hur, Kukjae Kim, Hansoo Kim

Abstract:

In this study, outriggers, which are horizontal structures connecting a building core to distant columns to increase the lateral stiffness of a tall building, are also used to reduce differential axial shortening. The outriggers therefore serve the dual purposes of reducing the lateral displacement and reducing the differential axial shortening. Since the location of an outrigger greatly affects its effectiveness in terms of the lateral displacement at the top of the building and the maximum differential axial shortening, the optimum locations of the dual-purpose outriggers can be determined by an optimization method. Because the floors where the outriggers are installed are given as integer numbers, conventional gradient-based optimization methods cannot be used directly. In this study, a piecewise quadratic interpolation method is used to resolve the integrality requirement posed by the outrigger locations. The optimal solutions for the dual-purpose outriggers are searched by linear scalarization, a popular method for multi-objective optimization problems. It was found that increasing the number of outriggers reduced both the maximum lateral displacement and the maximum differential axial shortening. It was also noted that the optimum locations for reducing the lateral displacement and for reducing the differential axial shortening were different. Acknowledgment: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT (NRF-2017R1A2B4010043) and financially supported by the Korea Ministry of Land, Infrastructure and Transport (MOLIT) through the U-City Master and Doctor Course Grant Program.
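Linear scalarization over integer outrigger floors can be illustrated with a toy brute-force search. The two objective functions below are made-up smooth proxies, not the paper's structural model (which is also why the authors need piecewise quadratic interpolation rather than simple enumeration):

```python
# Toy linear scalarization over integer outrigger locations. f1 (roof
# drift proxy) and f2 (differential-shortening proxy) are hypothetical
# quadratics in the outrigger floor, NOT the study's structural model.

N_FLOORS = 60

def f1(floor):  # hypothetical lateral-drift objective
    return (floor / N_FLOORS - 0.7) ** 2 + 1.0

def f2(floor):  # hypothetical differential-shortening objective
    return (floor / N_FLOORS - 0.4) ** 2 + 0.5

def scalarized_optimum(w1, w2):
    """Best integer floor for the weighted sum w1*f1 + w2*f2."""
    return min(range(1, N_FLOORS + 1),
               key=lambda k: w1 * f1(k) + w2 * f2(k))

print(scalarized_optimum(1.0, 0.0))  # drift only
print(scalarized_optimum(0.0, 1.0))  # shortening only
print(scalarized_optimum(0.5, 0.5))  # balanced trade-off
```

Sweeping the weights traces the Pareto front; as in the study, the single-objective optima land on different floors, and the balanced weighting lands in between.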

Keywords: concrete structure, optimization, outrigger, tall building

Procedia PDF Downloads 177
2455 Stochastic Optimization of a Vendor-Managed Inventory Problem in a Two-Echelon Supply Chain

Authors: Bita Payami-Shabestari, Dariush Eslami

Abstract:

The purpose of this paper is to develop a multi-product economic production quantity model under a vendor-managed inventory policy with restrictions including limited warehouse space, budget, number of orders, average shortage time, and maximum permissible shortage. Since the costs cannot be predicted with certainty, the data are assumed to behave under an uncertain environment. The problem is first formulated as a bi-objective multi-product economic production quantity model. It is then solved with three multi-objective decision-making (MODM) methods, which are compared on the optimal values of the two objective functions and on central processing unit (CPU) time, using statistical analysis and multi-attribute decision-making (MADM). The results demonstrate that the augmented ε-constraint method performs better than global criteria and goal programming in terms of the optimal values of the two objective functions and CPU time. A sensitivity analysis illustrates the effect of parameter variations on the optimal solution. The contribution of this research is the use of random cost data in developing a multi-product economic production quantity model under a vendor-managed inventory policy with several constraints.

Keywords: economic production quantity, random cost, supply chain management, vendor-managed inventory

Procedia PDF Downloads 129
2454 Automated, Objective Assessment of Pilot Performance in Simulated Environment

Authors: Maciej Zasuwa, Grzegorz Ptasinski, Antoni Kopyt

Abstract:

Nowadays, flight simulators offer tremendous possibilities for safe and cost-effective pilot training through the use of powerful computational tools. Because technology has outpaced methodology, the vast majority of training-related work is still done by human instructors, which makes assessment inefficient and vulnerable to instructors’ subjectivity. This research presents an Objective Assessment Tool (gOAT) developed at the Warsaw University of Technology and tested on an SW-4 helicopter flight simulator. The tool uses a database of predefined manoeuvres, defined and integrated into the virtual environment. These were implemented based on the Aeronautical Design Standard Performance Specification Handling Qualities Requirements for Military Rotorcraft (ADS-33), with predefined Mission-Task-Elements (MTEs). The core element of the gOAT is an enhanced algorithm that provides the instructor with a new set of information: a set of objective flight parameters fused with a report on the psychophysical state of the pilot. While the pilot performs the task, the gOAT system automatically calculates performance using the embedded algorithms, data registered by the simulator software (position, orientation, velocity, etc.), as well as measurements of physiological changes in the pilot’s psychophysiological state (temperature, sweating, heart rate). The complete set of measurements is presented online at the instructor’s station and shown in a dedicated graphical interface. The presented tool is based on open-source solutions and is flexible for editing. Additional manoeuvres can easily be added using a guide developed by the authors, and MTEs can be changed by the instructor even during an exercise. The algorithm and measurements used not only allow basic stress level measurements to be implemented, but also reduce the instructor’s workload significantly. The tool can be used for training purposes as well as for periodic checks of the aircrew. Flexibility and ease of modification allow wide-ranging further development and customization of the tool. Depending on the simulation purpose, gOAT can be adjusted to support simulators of aircraft, helicopters, or unmanned aerial vehicles (UAVs).

Keywords: automated assessment, flight simulator, human factors, pilot training

Procedia PDF Downloads 150
2453 Automating Test Activities: Test Cases Creation, Test Execution, and Test Reporting with Multiple Test Automation Tools

Authors: Loke Mun Sei

Abstract:

Software testing has become a mandatory process for assuring software product quality. Hence, test management is needed in order to manage the test activities conducted in the software test life cycle. This paper discusses the challenges faced in the software test life cycle, and how the test processes and test activities, mainly test case creation, test execution, and test reporting, are managed and automated using several test automation tools, i.e. Jira, Robot Framework, and Jenkins.

Keywords: test automation tools, test case, test execution, test reporting

Procedia PDF Downloads 583
2452 Measuring Fluctuating Asymmetry in Human Faces Using High-Density 3D Surface Scans

Authors: O. Ekrami, P. Claes, S. Van Dongen

Abstract:

Fluctuating asymmetry (FA) has been studied for many years as an indicator of developmental stability or ‘genetic quality’, based on the assumption that perfect symmetry is the ideal expected outcome for a bilateral organism. Further studies have also investigated a possible link between FA and attractiveness or levels of masculinity and femininity. These hypotheses have mostly been examined using 2D images, with the structure of interest represented by a limited number of landmarks. Such methods have the downside of simplifying the structure and reducing its dimensionality, which in turn increases the error of the analysis. In an attempt to reach more conclusive and accurate results, in this study we used high-resolution 3D scans of human faces and developed an algorithm to measure and localize FA with a spatially dense approach. A symmetric, spatially dense anthropometric mask with paired vertices is non-rigidly mapped onto target faces using an Iterative Closest Point (ICP) registration algorithm. A set of 19 manually indicated landmarks was used to examine the precision of the mapping step. The protocol’s accuracy in measuring and localizing FA was assessed using simulated faces with known amounts of asymmetry added to them. The validation results show that the algorithm is fully capable of locating and measuring FA in 3D simulated faces. The additional information captured on asymmetry can be used to improve studies of FA as an indicator of fitness or attractiveness. The algorithm can be of particular benefit in studies with large numbers of subjects owing to its automated and time-efficient nature. Additionally, the spatially dense approach provides information about the locality of FA, which is impossible to obtain using conventional methods.
It also enables the asymmetry of a morphological structure to be analyzed in a multivariate manner, for example using Principal Component Analysis (PCA) or Factor Analysis, which can be a step towards understanding the underlying processes of asymmetry. The method can also be used in combination with genome-wide association studies to help unravel the genetic bases of FA. To conclude, we introduce an algorithm to study and analyze asymmetry in human faces, with the possibility of extending its application to other morphological structures, in an automated, accurate, and multivariate framework.
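The core per-vertex measurement can be sketched in a few lines: once the symmetric mask with paired vertices has been mapped onto a face, each vertex is compared with its mirror partner reflected across the midsagittal plane. This is a minimal illustration, not the authors' implementation; the array layout and the x = 0 symmetry plane are assumptions.

```python
import numpy as np

def vertex_asymmetry(vertices, pair_index):
    """Per-vertex asymmetry for a spatially dense mesh.

    vertices   : (N, 3) array of mapped vertex coordinates
    pair_index : (N,) array; pair_index[i] is the vertex mirrored
                 across the assumed midsagittal (x = 0) plane.
    Returns an (N,) array of asymmetry magnitudes.
    """
    reflected = vertices[pair_index].copy()
    reflected[:, 0] *= -1.0          # mirror the paired vertices across x = 0
    return np.linalg.norm(vertices - reflected, axis=1)

# A perfectly symmetric 4-vertex toy "face": asymmetry is zero everywhere.
sym = np.array([[ 1.0, 0.0, 0.0],
                [-1.0, 0.0, 0.0],
                [ 0.5, 1.0, 0.2],
                [-0.5, 1.0, 0.2]])
pairs = np.array([1, 0, 3, 2])
print(vertex_asymmetry(sym, pairs))

# Perturb one vertex to simulate a known, localized amount of FA.
asym = sym.copy()
asym[0] += [0.1, 0.0, 0.0]
print(vertex_asymmetry(asym, pairs))
```

Stacking such per-vertex vectors across subjects gives the matrix on which PCA or Factor Analysis can then be run.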

Keywords: developmental stability, fluctuating asymmetry, morphometrics, 3D image processing

Procedia PDF Downloads 140
2451 Structure-Activity Relationship of Gold Catalysts on Alumina Supported Cu-Ce Oxides for CO and Volatile Organic Compound Oxidation

Authors: Tatyana T. Tabakova, Elitsa N. Kolentsova, Dimitar Y. Dimitrov, Krasimir I. Ivanov, Yordanka G. Karakirova, Petya Cv. Petrova, Georgi V. Avdeev

Abstract:

The catalytic oxidation of CO and volatile organic compounds (VOCs) is considered one of the most efficient ways to reduce harmful emissions from various chemical industries. The effectiveness of gold-based catalysts for many reactions of environmental significance has been proven over the past three decades. The aim of this work was to combine the favorable features of Au and Cu-Ce mixed oxides in the design of new catalytic materials of improved efficiency and economic viability for the removal of air pollutants from waste gases of formaldehyde production. Supported oxides of copper and cerium with Cu:Ce molar ratios of 2:1 and 1:5 were prepared by wet impregnation of γ-alumina. Gold (2 wt.%) catalysts were synthesized by a deposition-precipitation method. Catalyst characterization was carried out by texture measurements, powder X-ray diffraction, temperature-programmed reduction, and electron paramagnetic resonance spectroscopy. The catalytic activity in the oxidation of CO, CH3OH, and (CH3)2O was measured using continuous-flow equipment with a fixed-bed reactor. Both Cu-Ce/alumina samples demonstrated similar catalytic behavior. The addition of gold caused a significant enhancement of CO and methanol oxidation activity (100% conversion of CO and CH3OH at about 60 and 140 °C, respectively). The composition of the Cu-Ce mixed oxides affected the performance of the gold-based samples considerably. The gold catalyst on Cu-Ce/γ-Al2O3 1:5 exhibited higher activity for CO and CH3OH oxidation than Au on Cu-Ce/γ-Al2O3 2:1. The better performance of Au/Cu-Ce 1:5 was related to the availability of highly dispersed gold particles and copper oxide clusters in close contact with ceria.

Keywords: CO and VOCs oxidation, copper oxide, Ceria, gold catalysts

Procedia PDF Downloads 319
2450 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients

Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho

Abstract:

Multiple sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of detail it provides, is the gold-standard exam for the diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, at roughly 0.5-1.35% per year, far beyond the limits of normal aging. Brain volume quantification therefore becomes an essential task for subsequent analysis of the occurrence of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to extract important information manually. This manual analysis is prone to error and time consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation are extensively used to assist doctors with quantitative analyses for disease diagnosis and monitoring. The purpose of this work was thus to evaluate brain volume in MRI scans of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis was carried out in two steps: segmentation of the brain and quantification of brain volume. The first image processing step was brain extraction by skull stripping from the original image. In the skull stripper, the algorithm registers a grayscale atlas image to the grayscale patient image, and the associated brain mask is propagated using the registration transformation. This mask is then eroded and used for a refined brain extraction based on level sets (the brain-skull border modeled with dedicated expansion, curvature, and advection terms).
In the second step, brain volume was quantified by counting the voxels belonging to the segmentation mask and converting the count to cubic centimeters (cc). We observed an average brain volume of 1469.5 cc. We conclude that the automatic method applied in this work can be used for brain extraction and brain volume quantification in MRI. The development and use of computer programs can help health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future work, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
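The voxel-counting step lends itself to a very short sketch: the mask volume is simply the voxel count times the per-voxel volume derived from the scan's voxel spacing. This is a hedged illustration, not the authors' code; the voxel size is an assumed parameter.

```python
import numpy as np

def brain_volume_cc(mask, voxel_size_mm=(1.0, 1.0, 1.0)):
    """Volume of a binary brain mask in cubic centimeters.

    mask          : 3D boolean/0-1 array from the skull-stripping step
    voxel_size_mm : per-axis voxel spacing in millimeters
    """
    voxel_mm3 = float(np.prod(voxel_size_mm))            # mm^3 per voxel
    return mask.astype(bool).sum() * voxel_mm3 / 1000.0  # 1 cc = 1000 mm^3

# Toy example: a 100x100x100 mask of 1 mm isotropic voxels, fully "brain".
mask = np.ones((100, 100, 100), dtype=np.uint8)
print(brain_volume_cc(mask))   # 1000.0 cc
```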

Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper

Procedia PDF Downloads 146
2449 Counting People Utilizing Space-Time Imagery

Authors: Ahmed Elmarhomy, K. Terada

Abstract:

An automated method for counting passersby using virtual vertical measurement lines is proposed. The space-time image represents the human regions, which are extracted using a segmentation process. Different color spaces were used to perform the template matching, and a suitable template match determines the direction and speed of passing people. Distinguishing between one and two passersby was investigated using the correlation between passerby speed and the human-pixel area. Finally, the effectiveness of the presented method was verified experimentally.
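A toy version of the template-matching step can be sketched with normalized cross-correlation along a one-dimensional space-time intensity strip; the offset of the best match tracks a person's displacement between frames, and hence speed and direction. The strip and template below are synthetic; the authors' actual matching operates on color-space image regions.

```python
import numpy as np

def best_match(strip, template):
    """Slide `template` along a 1-D space-time intensity strip and return
    (offset, score) for the highest normalized cross-correlation."""
    t = (template - template.mean()) / (template.std() + 1e-12)
    scores = []
    for off in range(len(strip) - len(template) + 1):
        w = strip[off:off + len(template)]
        w = (w - w.mean()) / (w.std() + 1e-12)
        scores.append(float(np.dot(t, w)) / len(template))
    return int(np.argmax(scores)), max(scores)

strip = np.array([0, 0, 1, 3, 1, 0, 0, 0], dtype=float)
template = np.array([1, 3, 1], dtype=float)
print(best_match(strip, template))   # exact match at offset 2
```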

Keywords: counting people, measurement line, space-time image, segmentation, template matching

Procedia PDF Downloads 452
2448 Creating Systems Change: Implementing Cross-Sector Initiatives within the Justice System to Support Ontarians with Mental Health and Addictions Needs

Authors: Tania Breton, Dorina Simeonov, Shauna MacEachern

Abstract:

Ontario’s 10-Year Mental Health and Addictions Strategy has included the establishment of 18 Service Collaboratives across the province: cross-sector tables in a specific region that come together to explore mental health and addiction system needs and adopt an intervention to address them. The process is community-led and supported by implementation teams from the Centre for Addiction and Mental Health (CAMH), using the framework of implementation science (IS) to enable evidence-based and sustained change. These justice initiatives focus on the intersection of the justice system with the mental health and addiction systems. In this presentation, we will share the learnings, achievements, and challenges of implementing innovative practices for the mental health and addictions needs of Ontarians within the justice system. Specifically, we will focus on key points across the justice system, from early intervention and trauma-informed, culturally appropriate services to post-sentence support and community reintegration. Our approach involves external implementation support from the CAMH team, including coaching, knowledge exchange, evaluation, Aboriginal engagement, and health equity expertise. Agencies supported the implementation of tools and processes that changed practice at the local level. These practices are being scaled up across Ontario; community agencies have come together in an unprecedented collaboration, and there is a shared vision of the issues overlapping the mental health, addictions, and justice systems. Working with ministry partners has allowed space for innovation and created an environment where better approaches can be nurtured and spread.

Keywords: implementation, innovation, early identification, mental health and addictions, prevention, systems

Procedia PDF Downloads 363
2447 Density Based Traffic System Using Pic Microcontroller

Authors: Tatipamula Samiksha Goud, A. Naveena, M. Sresta

Abstract:

Traffic congestion is a major issue in many cities throughout the world, particularly in urban areas, and it is past time to switch from a fixed-timer mode to an automated system. The current traffic signalling system is a fixed-time system, which is inefficient when one lane carries more traffic than the others. To address this issue, a structure for an intelligent traffic control system is designed. When traffic density is higher on one side of a junction, that signal's green time is extended relative to the regular time. This study proposes a technique in which the signal's green duration is assigned based on the amount of traffic present at the time, which can be measured using infrared sensors.
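The proposed allocation, green time growing with sensed density, can be illustrated with a small sketch. The base time, per-vehicle increment, and cap are hypothetical parameters for illustration, not values from the study; on the actual hardware this logic would run on the PIC microcontroller reading the infrared sensor counts.

```python
def green_times(counts, base=10, extra_per_vehicle=2, cap=60):
    """Assign each lane a green time (seconds) proportional to the vehicle
    count reported by its infrared sensors, bounded by a maximum."""
    return [min(base + c * extra_per_vehicle, cap) for c in counts]

# Hypothetical sensor counts for four lanes of a junction.
print(green_times([3, 12, 0, 30]))   # [16, 34, 10, 60]
```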

Keywords: infrared sensors, micro-controllers, LEDs, oscillators

Procedia PDF Downloads 142
2446 Smart Irrigation Systems and Website: Based Platform for Farmer Welfare

Authors: Anusha Jain, Santosh Vishwanathan, Praveen K. Gupta, Shwetha S., Kavitha S. N.

Abstract:

Agriculture has a major impact on the Indian economy, with the highest employment ratio of any sector in the country. Currently, most traditional agricultural practices and farming methods are manual, so farmers often do not realize their maximum productivity: labour costs are rising, inefficient use of water sources leads to wastage, and soil moisture content is inadequate, ultimately threatening the country's food security. This paper aims to address this problem with a full-fledged web-application-based platform that can connect to a microcontroller-based automated irrigation system. The system schedules the irrigation of crops from real-time soil moisture content, measured by soil moisture sensors and matched to each crop's requirements, using WSN (wireless sensor network) and M2M (machine-to-machine communication) concepts, thus optimizing the use of the limited available water and maximizing crop yield. This robust automated irrigation system provides end-to-end automation of crop irrigation under any circumstances, such as droughts, irregular rainfall patterns, and extreme weather conditions. The platform is also intended to foster a nationwide, united farming community and to ensure the welfare of farmers by equipping them with prerequisite knowledge of technology and the latest farming practices. To achieve this, the MailChimp mailing service is used: the email addresses of interested farmers and individuals are recorded, and curated articles on innovations in agriculture are sent to them by email. The proposed system also includes a service through which nearby crop vendors can enter their pickup locations, accepted prices, and other relevant information, enabling farmers to choose their vendors wisely.
Along with this, we have created a blogging service that enables farmers and agricultural enthusiasts to share experiences, helpful knowledge, hardships, and more with the entire farming community. These are some of the many features the platform offers.
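The scheduling rule at the heart of such a system, irrigate when the sensed moisture falls below the crop's band and stop when it is saturated, can be sketched as a simple threshold controller. The thresholds and command names here are illustrative assumptions, not part of the paper's design.

```python
def irrigation_action(moisture_pct, crop_min, crop_max):
    """Decide the valve command from a soil-moisture reading (percent),
    given the crop's acceptable moisture band [crop_min, crop_max]."""
    if moisture_pct < crop_min:
        return "OPEN_VALVE"    # below the crop's requirement: irrigate
    if moisture_pct > crop_max:
        return "CLOSE_VALVE"   # saturated: stop, to avoid wasting water
    return "HOLD"              # within the band: no change

print(irrigation_action(18.0, 30.0, 60.0))  # OPEN_VALVE
print(irrigation_action(45.0, 30.0, 60.0))  # HOLD
```

In the described architecture, a reading like this would arrive over the sensor network and the resulting command would be relayed back to the microcontroller driving the valve.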

Keywords: WSN (wireless sensor networks), M2M (M/C to M/C communication), automation, irrigation system, sustainability, SAAS (software as a service), soil moisture sensor

Procedia PDF Downloads 129
2445 A Method for Automated Planning of Fiber to the Home Access Network Infrastructures

Authors: Hammad Khalid

Abstract:

In this paper, a strategy for automated planning of Fiber to the Home (FTTH) access networks is proposed. We present an efficient methodology for planning the access network infrastructure, in which GIS data and a set of algorithms are used to make the planning process more automatic. The method comprises several steps of the planning process, and by considering different scenarios, different designs can be generated. The designs can also be produced in a very short time compared with conventional planning. A case study illustrates the use and capabilities of the planning method. The method does not fully automate the planning, but it makes the planning process substantially faster. Results and discussion are presented, and a conclusion is given at the end.
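One common building block of such GIS-driven planning is estimating a cable or trench layout as a minimum spanning tree over premise coordinates. The sketch below (Prim's algorithm over straight-line distances) is a generic illustration of that idea, not the method of the paper.

```python
import math

def mst_length(points):
    """Total trench length of a minimum spanning tree connecting the
    given premises (Prim's algorithm, straight-line distances)."""
    n = len(points)
    in_tree = [False] * n
    dist = [math.inf] * n        # cheapest edge from the tree to each node
    dist[0], total = 0.0, 0.0
    for _ in range(n):
        # Pick the cheapest node not yet in the tree and attach it.
        u = min((i for i in range(n) if not in_tree[i]), key=dist.__getitem__)
        in_tree[u] = True
        total += dist[u]
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < dist[v]:
                    dist[v] = d
    return total

# Four premises on a unit square: the MST uses three edges of length 1.
print(mst_length([(0, 0), (0, 1), (1, 0), (1, 1)]))   # 3.0
```

Real FTTH planning would constrain the edges to the GIS street network rather than straight lines, but the cost-minimizing tree structure is the same.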

Keywords: FTTH, GIS, automation, planning

Procedia PDF Downloads 153
2444 Multi-Agent Approach for Monitoring and Control of Biotechnological Processes

Authors: Ivanka Valova

Abstract:

This paper aims to use a multi-agent approach to monitor and diagnose a biotechnological system in order to validate certain control actions depending on the process development and the operating conditions. A multi-agent system is defined as a network of interacting software modules that collectively solve complex tasks. Remote monitoring and control of biotechnological processes are a necessity when automated, reliable systems that operate without interrupting certain activities are required. The advantage of our approach lies in its flexibility, its modularity, and the possibility of improvement by acquiring functionalities through the integration of artificial intelligence.
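A multi-agent pipeline of this kind, monitor, diagnose, then validate a control action, can be sketched with a few cooperating modules on a shared message bus. The agents, message types, and the pH band below are illustrative assumptions, not the paper's implementation.

```python
class Agent:
    def __init__(self, name):
        self.name = name
    def handle(self, msg, bus):
        raise NotImplementedError

class MonitorAgent(Agent):
    def handle(self, msg, bus):
        if msg["type"] == "reading":           # e.g. a pH sensor reading
            bus.append({"type": "status", "value": msg["value"]})

class DiagnosisAgent(Agent):
    LOW, HIGH = 6.5, 7.5                       # assumed safe pH band
    def handle(self, msg, bus):
        if msg["type"] == "status":
            ok = self.LOW <= msg["value"] <= self.HIGH
            bus.append({"type": "diagnosis", "ok": ok})

class ControlAgent(Agent):
    def handle(self, msg, bus):
        if msg["type"] == "diagnosis":         # validate the control action
            bus.append({"type": "action",
                        "command": "none" if msg["ok"] else "adjust_feed"})

def run(agents, bus):
    """Deliver every message on the bus to every agent until quiescent."""
    i = 0
    while i < len(bus):
        for a in agents:
            a.handle(bus[i], bus)
        i += 1
    return bus

bus = run([MonitorAgent("m"), DiagnosisAgent("d"), ControlAgent("c")],
          [{"type": "reading", "value": 5.9}])
print(bus[-1])   # an out-of-band reading ends in an "adjust_feed" action
```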

Keywords: multi-agent approach, artificial intelligence, biotechnological processes, anaerobic biodegradation

Procedia PDF Downloads 87
2443 Grid Pattern Recognition and Suppression in Computed Radiographic Images

Authors: Igor Belykh

Abstract:

Anti-scatter grids used in radiographic imaging for contrast enhancement leave specific artifacts. Those artifacts may be directly visible or may cause a Moiré effect when a digital image is resized on a diagnostic monitor. In this paper, we propose an automated grid artifact detection and suppression algorithm, addressing what is still an open problem. Grid artifact detection is based on a statistical approach in the spatial domain. Grid artifact suppression is based on the design and application of a Kaiser band-stop filter whose transfer function avoids ringing artifacts. Experimental results are discussed, and conclusions are drawn with a description of the advantages over existing approaches.
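The suppression step can be illustrated with a standard Kaiser-window band-stop design, here using SciPy's filter-design helpers on a synthetic image row. The sampling rate, grid frequency, and band edges are assumed values for the sketch, not those detected by the algorithm.

```python
import numpy as np
from scipy.signal import kaiserord, firwin, lfilter, freqz

fs = 10.0            # sampling rate across the row (assumed units)
grid_freq = 3.0      # grid-line frequency, as found by the detection step

# Kaiser design: 60 dB stopband attenuation, 0.4-unit transition width.
numtaps, beta = kaiserord(ripple=60.0, width=0.4 / (0.5 * fs))
numtaps |= 1         # band-stop FIR filters need an odd tap count (type I)
taps = firwin(numtaps,
              [grid_freq - 0.3, grid_freq + 0.3],
              window=("kaiser", beta),
              pass_zero="bandstop", fs=fs)

# Synthetic image row: smooth "anatomy" plus a periodic grid artifact.
x = np.arange(512) / fs
row = np.sin(2 * np.pi * 0.2 * x) + 0.5 * np.sin(2 * np.pi * grid_freq * x)
clean = lfilter(taps, 1.0, row)

# Frequency response at the anatomy and grid frequencies:
_, h = freqz(taps, worN=[0.2, grid_freq], fs=fs)
print(np.abs(h))     # near 1 at 0.2 (kept), near 0 at the grid frequency
```

The Kaiser window is what keeps the stop band sharp without the ringing a brick-wall filter would introduce.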

Keywords: grid, computed radiography, pattern recognition, image processing, filtering

Procedia PDF Downloads 283
2442 A Comparative Study of Medical Image Segmentation Methods for Tumor Detection

Authors: Mayssa Bensalah, Atef Boujelben, Mouna Baklouti, Mohamed Abid

Abstract:

Image segmentation plays a fundamental role in analysis and interpretation for many applications. The automated segmentation of organs and tissues throughout the body using computed imaging has been increasing rapidly; indeed, it represents one of the most important parts of clinical diagnostic tools. In this paper, we present a thorough literature review of recent methods of tumor segmentation from medical images, briefly explained along with the contributions of various researchers. This review is followed by a comparison of the methods in order to define new directions for developing and improving the performance of tumor-area segmentation in medical images.
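When comparing segmentation methods of this kind, the Dice similarity coefficient against a ground-truth mask is a common yardstick. The sketch below is a generic illustration, not tied to any specific method from the review.

```python
import numpy as np

def dice(seg, gt):
    """Dice similarity coefficient between a predicted tumor mask and a
    ground-truth mask: 2|A∩B| / (|A| + |B|), in [0, 1]."""
    seg, gt = seg.astype(bool), gt.astype(bool)
    denom = seg.sum() + gt.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(seg, gt).sum() / denom

a = np.zeros((8, 8)); a[2:6, 2:6] = 1     # 16-pixel predicted region
b = np.zeros((8, 8)); b[3:7, 3:7] = 1     # shifted 16-pixel ground truth
print(dice(a, b))   # overlap is 3x3 = 9 pixels: 2*9 / 32 = 0.5625
```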

Keywords: features extraction, image segmentation, medical images, tumor detection

Procedia PDF Downloads 168
2441 Analyzing the Ergonomic Design of Manual Material Handling in Chemical Industry: Case Study of Activity Task Weigh Liquid Catalyst to the Container Storage

Authors: Yayan Harry Yadi, L. Meily Kurniawidjaja

Abstract:

Manual material handling (MMH) activities at the liquid catalyst raw material storage workstations of chemical industries carry a high risk of musculoskeletal disorders (MSDs). The work is performed frequently and requires awkward body postures, twisting, and bending because of the limited physical space, cold and slippery conditions, and the limited tools available for transferring containers and weighing the liquid catalyst into them. This study aims to develop an ergonomic work system design for the transfer and weighing of liquid catalyst raw materials at the storage warehouse. A triangulation method was applied, combining interviews, observation, and a detailed team study assessing work posture risk levels and complaints. Work postures were analyzed using the RULA method, supported by CATIA software. The study concludes that the ergonomic design can reduce the risk score of awkward postures by three levels. The CATIA software simulation provided a comprehensive solution for better posture during the manual weighing task. Adding material handling aids such as adjustable conveyors and trolleys, and modifying the weighing task with semi-mechanical tools following ergonomic design rules, can also reduce the hazard of chemical fluid spills.

Keywords: ergonomic design, MSDs, CATIA software, RULA, chemical industry

Procedia PDF Downloads 164
2440 The Consumption of Sodium and Fat from Processed Foods

Authors: Pil Kyoo Jo, Jee Young Kim, Yu Jin Oh, Sohyun Park, Young Ha Joo, Hye Suk Kim, Semi Kang

Abstract:

When convenience drives daily food choices, the increased consumption of processed foods may be associated with increased intakes of sodium and fat, and further with the onset of chronic diseases. The purpose of this study was to investigate the intakes of sodium, saturated fat, and calories from processed foods and the dietary patterns among adult populations in South Korea. We used nationally representative data from the 5th Korea National Health and Nutrition Examination Survey (KNHANES, 2010-2012) and a cross-sectional survey on eating behaviors among university students (N=893; 380 men, 513 women) aged 20 to 24 years. Results showed that South Koreans obtained 43.5% of their total food consumption from processed foods. The 24-hour recall data showed that 77% of sodium, 60% of fat, 59% of saturated fat, and 44% of calories were consumed from processed foods. The intake of processed foods has increased by an average of 1.7% annually since 2008. Only 33% of the processed foods that respondents consumed carried nutrition labeling. The data from university students showed that students selected processed foods from convenience stores more often when eating alone than when eating with someone else. Given the convenience and lack of time, more people will consume processed foods, which may affect their overall dietary intake and, further, their health. To help people make healthier food choices, regulations and policies to reduce the potentially unhealthy nutrients in processed foods should be strengthened. This research was supported by the National Research Foundation of Korea under the 2011 Korea-Japan Basic Scientific Cooperation Program. This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2015S1A5B6037369).

Keywords: sodium, fat, processed foods, diet trends

Procedia PDF Downloads 255
2439 Development and Validation of Employee Trust Scale: Factor Structure, Reliability and Validity

Authors: Chua Bee Seok, Getrude Cosmas, Jasmine Adela Mutang, Shazia Iqbal Hashmi

Abstract:

The aims of this study were to determine the factor structure and psychometric properties (i.e., reliability and convergent validity) of the Employee Trust Scale, an instrument newly created by the researchers. The scale initially contained 82 items measuring employees' trust toward their supervisors. A sample of 818 employees (343 females, 449 males) was selected randomly from public- and private-sector organizations in Kota Kinabalu, Sabah, Malaysia. Their ages ranged from 19 to 67 years (mean 34.55 years), and their average tenure with their current employer was 11.2 years (s.d. = 7.5 years). The respondents completed the Employee Trust Scale as well as Mishra's managerial trust questionnaire. Exploratory factor analysis of employees' trust toward their supervisors extracted three factors, labeled 'trustworthiness' (32 items), 'position status' (11 items), and 'relationship' (6 items), which together accounted for 62.49% of the total variance. The trustworthiness factor was further divided into three sub-factors: competency (11 items), benevolence (8 items), and integrity (13 items). All factors and sub-factors of the scale demonstrated clear reliability, with internal consistency (Cronbach's alpha) above 0.85. The convergent validity of the scale was supported by the expected pattern of positive and significant correlations between the scores of all factors and sub-factors and the score on the managerial trust questionnaire, which measures the same construct, and further by the significant positive intercorrelations among the factors and sub-factors. The results suggest that the Employee Trust Scale is a reliable and valid measure. However, further studies in other sample groups are needed to validate the scale further.
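The internal-consistency statistic reported above can be computed with a short Cronbach's alpha routine; the toy score matrix below is invented for illustration and unrelated to the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    return k / (k - 1) * (1.0 - item_var / total_var)

# Toy data: three items that vary together yield a high alpha (~0.97).
scores = np.array([[4, 5, 4],
                   [2, 2, 3],
                   [5, 5, 5],
                   [3, 3, 3],
                   [1, 2, 1]])
print(round(cronbach_alpha(scores), 3))
```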

Keywords: employees trust scale, psychometric properties, trustworthiness, position status, relationship

Procedia PDF Downloads 470
2438 A Study of Issues and Mitigations on Distributed Denial of Service and Medical Internet of Things Devices

Authors: Robin Singh, Jing-Chiou Liou

Abstract:

Internet of Things (IoT) devices are used heavily as part of our everyday routines. Through improved communication and automated procedures, their popularity has helped users raise the quality of their work. In healthcare, these devices are used to better collect patients' data for treatment. They are generally considered safe and secure; however, loopholes may exist that manufacturers need to identify before a hacker takes advantage of them. For this study, we focused on two medical IoT devices: pacemakers and hearing aids. The aim of this paper is to identify the likelihood of these medical devices being hijacked and used as a botnet in Distributed Denial-of-Service (DDoS) attacks. Moreover, some mitigation strategies are proposed to better secure these devices.
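One generic mitigation against compromised devices flooding a target is gateway-side rate limiting, which caps how fast any single source can emit requests and so blunts its usefulness in a botnet. The token-bucket sketch below is a textbook illustration, not a strategy taken from the paper; the rates are arbitrary.

```python
class TokenBucket:
    """Per-source rate limiter: a simple gateway-side mitigation that caps
    how fast any one device (or spoofed source) can send requests."""
    def __init__(self, rate, capacity, now=0.0):
        self.rate, self.capacity = rate, capacity   # tokens/sec, burst size
        self.tokens, self.last = float(capacity), now

    def allow(self, now):
        # Refill tokens for the elapsed time, then spend one if available.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=5.0, capacity=5)       # 5 req/s sustained, burst 5
burst = [bucket.allow(0.0) for _ in range(8)]    # 8 requests at t = 0
print(burst)                  # first 5 allowed, the next 3 dropped
later = bucket.allow(1.0)     # one second later the bucket has refilled
print(later)                  # True
```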

Keywords: cybersecurity, DDoS, IoT, medical devices

Procedia PDF Downloads 86