Search results for: automated teller machine
296 Hybrid Approach for Software Defect Prediction Using Machine Learning with Optimization Technique
Authors: C. Manjula, Lilly Florence
Abstract:
Software technology is developing rapidly, which leads to the growth of various industries. Nowadays, software-based applications have been widely adopted for business purposes. For any software industry, the development of reliable software is becoming a challenging task because a faulty software module may be harmful to the growth of the industry and business. Hence, there is a need to develop techniques for the early prediction of software defects. Due to the complexity of manual prediction, automated software defect prediction techniques have been introduced. These techniques learn patterns from previous software versions and find the defects in the current version. They have attracted researchers due to their significant impact on industrial growth by identifying bugs in software. Several studies have been carried out on this basis, but achieving desirable defect prediction performance is still a challenging task. To address this issue, we present a machine learning based hybrid technique for software defect prediction. First, a Genetic Algorithm (GA) with an improved fitness function is used for better optimization of features in data sets. These features are then processed through a Decision Tree (DT) classification model. Finally, an experimental study is presented in which results from the proposed GA-DT hybrid approach are compared with those from the DT classification technique. The results show that the proposed hybrid approach achieves better classification accuracy.
Keywords: decision tree, genetic algorithm, machine learning, software defect prediction
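The GA-DT pairing described above can be sketched in outline: a genetic algorithm searches over feature subsets encoded as bitmasks, and the surviving subset would then feed a decision tree classifier. The fitness function, feature counts, and GA parameters below are illustrative stand-ins, not the authors' actual improved fitness function.

```python
import random

random.seed(42)

# Toy fitness: reward selecting "informative" features (indices 0-4) and
# penalise subset size -- a stand-in for the paper's improved fitness function.
INFORMATIVE = {0, 1, 2, 3, 4}
N_FEATURES = 20

def fitness(mask):
    selected = {i for i, bit in enumerate(mask) if bit}
    return len(selected & INFORMATIVE) - 0.1 * len(selected)

def crossover(a, b):
    # Single-point crossover between two parent bitmasks.
    point = random.randrange(1, N_FEATURES)
    return a[:point] + b[point:]

def mutate(mask, rate=0.05):
    # Flip each bit independently with probability `rate`.
    return [bit ^ (random.random() < rate) for bit in mask]

def run_ga(pop_size=30, generations=40):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]            # elitist truncation selection
        children = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

best = run_ga()  # the feature subset a DT would then be trained on
```

In the full approach, `fitness` would instead score a candidate subset by the classification accuracy a decision tree achieves with it.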
Procedia PDF Downloads 328
295 Contactless Attendance System along with Temperature Monitoring
Authors: Nalini C. Iyer, Shraddha H., Anagha B. Varahamurthy, Dikshith C. S., Ishwar G. Kubasad, Vinayak I. Karalatti, Pavan B. Mulimani
Abstract:
The current pandemic scenario due to COVID-19 has raised awareness among people to avoid unnecessary contact in public places. There is a need to avoid contact with physical objects to stop the spread of infection. A contactless feature has to be included in systems in public places wherever possible. For example, attendance monitoring systems with fingerprint biometrics can be replaced with a contactless feature. One more important protocol followed in the current situation is temperature monitoring and screening. The paper describes an attendance system with a contactless feature and temperature screening for the university. The system displays a QR code to scan, which redirects to the student login web page only if the location is valid (the location where the student scans the QR code should be the location of the display of the QR code). Once the student logs in, the temperature of the student is scanned by the contactless temperature sensor (MLX90614) with an error of 0.5°C. If the temperature falls in the desired range (the range of normal body temperature), then the attendance of the student is marked as present, stored in the database, and the door opens automatically. Otherwise, the attendance is marked as absent, an alert is displayed along with the temperature, and the door remains closed. The door is automated with the help of a servomotor. To avoid proxy attendance, IR sensors are used to count the number of students in the classroom. The hardware system, consisting of a contactless temperature sensor and an IR sensor, is implemented on the NodeMCU microcontroller.
Keywords: NodeMCU, IR sensor, attendance monitoring, contactless, temperature
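The present/absent decision described above reduces to a range check that allows for the sensor's stated error. The normal-temperature window below is an assumed example, not the authors' exact threshold:

```python
# Illustrative attendance decision, assuming a normal body-temperature
# window of 35.8-37.5 °C and the MLX90614's stated ±0.5 °C error.
NORMAL_RANGE = (35.8, 37.5)
SENSOR_ERROR = 0.5

def mark_attendance(measured_temp_c):
    low, high = NORMAL_RANGE
    # Widen the window by the sensor error so a healthy student is not
    # rejected because of measurement uncertainty alone.
    if low - SENSOR_ERROR <= measured_temp_c <= high + SENSOR_ERROR:
        return {"status": "present", "door": "open"}
    return {"status": "absent", "door": "closed"}
```

A reading of 36.6 °C would open the door; a feverish 39.0 °C would keep it closed and raise the alert.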
Procedia PDF Downloads 185
294 Computer-Aided Detection of Simultaneous Abdominal Organ CT Images by Iterative Watershed Transform
Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid
Abstract:
Interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver, spleen, and kidneys is regarded as a major primary step in the computer-aided diagnosis of abdominal organ diseases. In this paper, a semi-automated method for abdominal organ segmentation in medical image data is presented using mathematical morphology. Our proposed method is based on hierarchical segmentation and the watershed algorithm. In our approach, a powerful technique has been designed to suppress over-segmentation based on the mosaic image and on the computation of the watershed transform. Our algorithm proceeds in two parts. In the first, we seek to improve the quality of the gradient-mosaic image. In this step, we propose a method for improving the gradient-mosaic image by applying an anisotropic diffusion filter followed by morphological filters. Thereafter, we proceed to the hierarchical segmentation of the liver, spleen, and kidneys. To validate the proposed segmentation technique, we have tested it on several images. Our segmentation approach is evaluated by comparing our results with a manual segmentation performed by an expert. The experimental results are described in the last part of this work.
Keywords: anisotropic diffusion filter, CT images, morphological filter, mosaic image, simultaneous organ segmentation, watershed algorithm
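A minimal marker-based watershed (priority flood) conveys the core idea the abstract builds on: regions grow outward from seed markers, always flooding the lowest-gradient pixel next. The real pipeline additionally uses anisotropic diffusion, morphological filtering, and mosaic images, none of which are shown in this sketch:

```python
import heapq

def watershed(gradient, markers):
    """Minimal priority-flood watershed on a 2-D grid (illustrative only)."""
    rows, cols = len(gradient), len(gradient[0])
    labels = [row[:] for row in markers]  # 0 = unlabelled, >0 = seed label
    heap = [(gradient[r][c], r, c)
            for r in range(rows) for c in range(cols) if markers[r][c]]
    heapq.heapify(heap)
    while heap:
        _, r, c = heapq.heappop(heap)  # always flood the lowest gradient next
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and labels[nr][nc] == 0:
                labels[nr][nc] = labels[r][c]  # inherit the flooding basin's label
                heapq.heappush(heap, (gradient[nr][nc], nr, nc))
    return labels

# Two seeds separated by a high-gradient "ridge" down the middle column.
grad = [[0, 0, 9, 0, 0],
        [0, 0, 9, 0, 0],
        [0, 0, 9, 0, 0]]
seeds = [[1, 0, 0, 0, 2],
         [0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0]]
segmented = watershed(grad, seeds)
```

The low-gradient flanks flood quickly from their own seed, so the two basins meet only at the ridge, which is exactly the behaviour the mosaic-image step then exploits to suppress over-segmentation.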
Procedia PDF Downloads 438
293 Impacts of Applying Automated Vehicle Location Systems to Public Bus Transport Management
Authors: Vani Chintapally
Abstract:
The spread of inexpensive and miniaturized Global Positioning System (GPS) receivers has led most Automatic Vehicle Location (AVL) systems today to depend solely on satellite-based positioning, as GPS is the most mature implementation of these. This paper presents the characteristics of a proposed system for tracking and analyzing public transport in a typical medium-sized city and contrasts the properties of such a system with those of general-purpose AVL systems. Specific properties of the routes analyzed by the AVL system used for the study of public transport in our case include cyclic vehicle routes, the requirement for specialized performance reports, and so on. This paper particularly deals with vehicle movement prediction and the estimation of station arrival times, combined with automatically generated reports on timetable conformance and other performance measures. Another side of the studied problem is the efficient transfer of data from the vehicles to the control center. The pervasiveness of GSM packet data transfer technologies, combined with reduced data transfer costs, has caused today's AVL systems to rely predominantly on packet data transfer services from mobile operators as the communications channel between vehicles and the control center. This approach raises numerous security issues in this potentially sensitive application field.
Keywords: automatic vehicle location (AVL), estimation of arrival times, AVL security, data services, intelligent transport systems (ITS), map matching
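The station arrival-time estimation mentioned above can be illustrated, in its simplest form, as remaining route distance divided by a rolling average of recent GPS speed samples; real AVL systems add map matching, stop dwell times, and schedule constraints on top of this:

```python
# Illustrative station-arrival estimate from the last GPS fix, assuming we
# know the remaining distance along the route and a few recent speed samples.
def estimate_arrival_seconds(distance_to_station_m, recent_speeds_mps):
    """Naive ETA: remaining distance over the mean of recent speed samples."""
    avg_speed = sum(recent_speeds_mps) / len(recent_speeds_mps)
    if avg_speed <= 0:
        return float("inf")  # vehicle stopped: no finite estimate
    return distance_to_station_m / avg_speed

eta = estimate_arrival_seconds(600, [9.0, 11.0, 10.0])  # 600 m at ~10 m/s
```

The example averages to 10 m/s, giving a 60-second estimate; comparing such estimates against the timetable is what drives the conformance reports.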
Procedia PDF Downloads 382
292 Blockchain Based Hydrogen Market (BBH₂): A Paradigm-Shifting Innovative Solution for Climate-Friendly and Sustainable Structural Change
Authors: Volker Wannack
Abstract:
Regional, national, and international strategies focusing on hydrogen (H₂) and blockchain are driving significant advancements in hydrogen and blockchain technology worldwide. These strategies lay the foundation for the groundbreaking "Blockchain Based Hydrogen Market (BBH₂)" project. The primary goal of this project is to develop a functional Blockchain Minimum Viable Product (B-MVP) for the hydrogen market. The B-MVP will leverage blockchain as an enabling technology with a common database and platform, facilitating secure and automated transactions through smart contracts. This innovation will revolutionize logistics, trading, and transactions within the hydrogen market. The B-MVP has transformative potential across various sectors. It benefits renewable energy producers, surplus energy-based hydrogen producers, hydrogen transport and distribution grid operators, and hydrogen consumers. By implementing standardized, automated, and tamper-proof processes, the B-MVP enhances cost efficiency and enables transparent and traceable transactions. Its key objective is to establish the verifiable integrity of climate-friendly "green" hydrogen by tracing its supply chain from renewable energy producers to end users. This emphasis on transparency and accountability promotes economic, ecological, and social sustainability while fostering a secure and transparent market environment. A notable feature of the B-MVP is its cross-border operability, eliminating the need for country-specific data storage and expanding its global applicability. This flexibility not only broadens its reach but also creates opportunities for long-term job creation through the establishment of a dedicated blockchain operating company. By attracting skilled workers and supporting their training, the B-MVP strengthens the workforce in the growing hydrogen sector. 
Moreover, it drives the emergence of innovative business models that attract additional company establishments and startups and contributes to long-term job creation. For instance, data evaluation can be utilized to develop customized tariffs and provide demand-oriented network capacities to producers and network operators, benefitting redistributors and end customers with tamper-proof pricing options. The B-MVP not only brings technological and economic advancements but also enhances the visibility of national and international standard-setting efforts. Regions implementing the B-MVP become pioneers in climate-friendly, sustainable, and forward-thinking practices, generating interest beyond their geographic boundaries. Additionally, the B-MVP serves as a catalyst for research and development, facilitating knowledge transfer between universities and companies. This collaborative environment fosters scientific progress, aligns with strategic innovation management, and cultivates an innovation culture within the hydrogen market. Through the integration of blockchain and hydrogen technologies, the B-MVP promotes holistic innovation and contributes to a sustainable future in the hydrogen industry. The implementation process involves evaluating and mapping suitable blockchain technology and architecture, developing and implementing the blockchain, smart contracts, and depositing certificates of origin. It also includes creating interfaces to existing systems such as nomination, portfolio management, trading, and billing systems, testing the scalability of the B-MVP to other markets and user groups, developing data formats for process-relevant data exchange, and conducting field studies to validate the B-MVP. 
BBH₂ is part of the "Technology Offensive Hydrogen" funding call within the research funding of the Federal Ministry of Economics and Climate Protection in the 7th Energy Research Programme of the Federal Government.
Keywords: hydrogen, blockchain, sustainability, innovation, structural change
Procedia PDF Downloads 167
291 Computational Fluid Dynamics Simulation of Reservoir for Dwell Time Prediction
Authors: Nitin Dewangan, Nitin Kattula, Megha Anawat
Abstract:
The hydraulic reservoir is a key component in mobile construction vehicles; most off-road earth-moving construction machinery requires large side-mounted hydraulic reservoirs. Their construction is very non-uniform, and designers use such designs to utilize the space available under the vehicle. There is no way to assess the space utilization of the reservoir by oil and the validity of the design except virtual simulation. Computational fluid dynamics (CFD) helps to predict the reservoir space utilization by vortex mapping, path line plots, and dwell time prediction to make sure the design is valid and efficient for the vehicle. The dwell time acceptance criterion for effective reservoir design is 15 seconds. The paper describes the hydraulic reservoir simulation, which is carried out using the CFD tool AcuSolve with an automated mesh strategy. Free-surface flow and a moving reference mesh are used to define the oil flow level inside the reservoir. The first baseline design was not able to meet the acceptance criterion, i.e., its dwell time was below 15 seconds, because the oil entry and exit ports were very close. CFD is used to redefine the port locations for the reservoir so that oil dwell time increases in the reservoir. CFD also proposed a baffle design for effective space utilization. The final design proposed through CFD analysis is used for physical validation on the machine.
Keywords: reservoir, turbulence model, transient model, level set, free-surface flow, moving frame of reference
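Before a full CFD run, the dwell-time criterion can be sanity-checked with the ideal mean residence time, tau = V / Q; the oil volume and flow rate below are made-up examples, not the machine's actual figures:

```python
# Back-of-envelope mean residence (dwell) time, tau = V / Q, often used as a
# sanity check before a full CFD run. Values here are illustrative only.
def mean_dwell_time_s(oil_volume_l, flow_rate_l_per_min):
    # Convert the per-minute flow rate so the result is in seconds.
    return (oil_volume_l / flow_rate_l_per_min) * 60.0

tau = mean_dwell_time_s(oil_volume_l=60.0, flow_rate_l_per_min=120.0)
meets_criterion = tau >= 15.0  # the paper's 15-second acceptance criterion
```

The ideal tau is an upper-bound sanity check: short-circuiting between closely spaced ports (the baseline design's flaw) makes the effective dwell time much lower, which is precisely what the CFD vortex maps and path lines reveal.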
Procedia PDF Downloads 150
290 Simulating Elevated Rapid Transit System for Performance Analysis
Authors: Ran Etgar, Yuval Cohen, Erel Avineri
Abstract:
One of the major challenges of transportation in medium-sized inner cities (such as Tel Aviv) is the last-mile solution. Personal rapid transit (PRT) seems like an applicable candidate for this, as it combines the benefits of personal (car) travel with the operational benefits of transit. However, the investment required for a large-area PRT grid is significant, and there is a need to economically justify such investment by correctly evaluating the grid capacity. The main elements of PRT are small automated vehicles (sometimes referred to as podcars) operating on a network of specially built guideways. The research looks at a specific concept of an elevated PRT system. A literature review has revealed the drawbacks of PRT modelling and simulation approaches, mainly due to the lack of consideration of technical and operational features of the system (such as headways, acceleration, safety issues); the detailed design of infrastructure (guideways, stations, and docks); the stochastic and seasonal characteristics of demand; and safety regulations, all of which have a strong effect on the system performance. A highly detailed model of the system, developed in this research, applies discrete event simulation combined with an agent-based approach to represent the system elements and the podcar movement logic. Applying a case study approach, the simulation model is used to study the capacity of the system, the expected throughput of the system, the utilization, and the level of service (journey time, waiting time, etc.).
Keywords: capacity, productivity measurement, PRT, simulation, transportation
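The queueing logic at the heart of such a simulation can be sketched with a toy single-podcar, fixed-interval model; the paper's actual model is a far richer discrete-event/agent-based system with headways, acceleration, stochastic demand, and safety rules, which this sketch deliberately omits:

```python
# Toy level-of-service estimate: passengers arrive at fixed intervals and a
# single podcar serves them with a fixed round-trip time (assumed figures).
def simulate(passenger_interarrival, trip_time, n_passengers):
    waits = []
    car_free_at = 0.0
    for i in range(n_passengers):
        arrival = i * passenger_interarrival
        depart = max(arrival, car_free_at)  # wait if the podcar is still out
        waits.append(depart - arrival)
        car_free_at = depart + trip_time
    return sum(waits) / len(waits)  # average waiting time, one LOS measure

avg_wait = simulate(passenger_interarrival=30.0, trip_time=45.0, n_passengers=4)
```

With arrivals every 30 s but 45 s round trips, the queue grows and waits climb (0, 15, 30, 45 s here), illustrating why capacity must be evaluated before committing to a grid investment.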
Procedia PDF Downloads 166
289 Design and Optimization of a Mini High Altitude Long Endurance (HALE) Multi-Role Unmanned Aerial Vehicle
Authors: Vishaal Subramanian, Annuatha Vinod Kumar, Santosh Kumar Budankayala, M. Senthil Kumar
Abstract:
This paper discusses the aerodynamic and structural design, simulation, and optimization of a mini High Altitude Long Endurance (HALE) UAV. The applications of this mini HALE UAV range from aerial topological surveys, quick first-aid supply, emergency medical blood transport, and search and relief activities to border patrol, surveillance, and estimation of forest fire progression. Although classified as a mini UAV according to UVS International, our design is an amalgamation of the features of the 'mini' and 'HALE' categories, combining the light weight of the 'mini' with the high altitude ceiling and endurance of the HALE. Designed with the idea of implementation in India, it is in strict compliance with the UAS rules proposed by the office of the Director General of Civil Aviation. The plane can be completely automated or have partial override control, and is equipped with an infrared camera, a multi-coloured camera with on-board storage or live telemetry, and a GPS system with geo-fencing and fail-safe measures. An additional 1.5 kg payload can be attached to three major hard points on the aircraft and can comprise delicate equipment or releasable payloads. The paper details the design, the optimization process, and the simulations performed using various software packages such as Design Foil, XFLR5, SolidWorks, and Ansys.
Keywords: aircraft, endurance, HALE, high altitude, long range, UAV, unmanned aerial vehicle
Procedia PDF Downloads 395
288 Comparative Evaluation of Accuracy of Selected Machine Learning Classification Techniques for Diagnosis of Cancer: A Data Mining Approach
Authors: Rajvir Kaur, Jeewani Anupama Ginige
Abstract:
With recent trends in Big Data and advancements in Information and Communication Technologies, the healthcare industry is at the stage of transition from clinician-oriented to technology-oriented. Many people around the world die of cancer because the disease was not diagnosed at an early stage. Nowadays, computational methods in the form of Machine Learning (ML) are used to develop automated decision support systems that can diagnose cancer with high confidence in a timely manner. This paper aims to carry out a comparative evaluation of a selected set of ML classifiers on two existing datasets: breast cancer and cervical cancer. The ML classifiers compared in this study are Decision Tree (DT), Support Vector Machine (SVM), k-Nearest Neighbor (k-NN), Logistic Regression, Ensemble (Bagged Tree), and Artificial Neural Networks (ANN). The evaluation is carried out based on the standard evaluation metrics Precision (P), Recall (R), F1-score, and Accuracy. The experimental results based on these metrics show that ANN achieved the highest accuracy (99.4%) when tested with the breast cancer dataset. On the other hand, when these ML classifiers were tested with the cervical cancer dataset, the Ensemble (Bagged Tree) technique gave better accuracy (93.1%) in comparison to the other classifiers.
Keywords: artificial neural networks, breast cancer, classifiers, cervical cancer, f-score, machine learning, precision, recall
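The evaluation metrics used for the comparison follow the standard confusion-matrix definitions; a self-contained sketch with made-up counts (not the paper's results):

```python
# Standard evaluation metrics computed from confusion-matrix counts -- the
# same P/R/F1/accuracy measures used to compare the classifiers.
def classification_metrics(tp, fp, fn, tn):
    precision = tp / (tp + fp)                 # of predicted positives, how many were right
    recall = tp / (tp + fn)                    # of actual positives, how many were found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of P and R
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return {"precision": precision, "recall": recall, "f1": f1, "accuracy": accuracy}

m = classification_metrics(tp=90, fp=10, fn=5, tn=95)  # illustrative counts
```

Reporting all four matters in a diagnostic setting: a classifier can score high accuracy while missing many true cancer cases, which recall exposes.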
Procedia PDF Downloads 275
287 Assessment of Alteration in High Density Lipo Protein, Apolipoprotein A1, Serum Glutamic Pyruvic Transaminase and Serum Glutamic Oxaloacetic Transaminase in Oral Submucous Fibrosis Patients
Authors: Marina Lazar Chandy, N. Kannan, Rajendra Patil, Vinod Mathew, Ajmal Mohamed, P. K. Sreeja, Renju Jose
Abstract:
Introduction: Arecoline, a major constituent of areca nut, has been shown to have some effect on the liver. The use of areca nut is found to be the most common etiological factor for the development of Oral Submucous Fibrosis (O.S.M.F). The effect of areca nut usage on the liver in patients with O.S.M.F needs to be assessed. Lipids play a role in the structural maintenance of the cell. Alterations of the lipid profile have been noted in cancer patients. O.S.M.F, being a precancerous lesion, can have some effect on the level of lipids in the body. Objectives: This study was done to assess the alterations in liver enzymes (Serum Glutamic Pyruvic Transaminase (S.G.P.T) and Serum Glutamic Oxaloacetic Transaminase (S.G.O.T)) and lipid metabolism (High Density Lipoprotein (H.D.L) and Apolipoprotein A1 (Apo A1)) in patients with O.S.M.F. Methods: 130 patients were taken for the study, 100 patients with O.S.M.F and 30 as a control group without O.S.M.F. Fasting blood samples were taken, centrifuged, and analyzed for S.G.P.T, S.G.O.T, H.D.L, and Apo A1 using a semi-automated spectrophotometer. Results: After statistical analysis, it was concluded that there is an elevation of the levels of S.G.P.T and S.G.O.T and decreased levels of H.D.L and Apo A1 for the O.S.M.F group when compared with the control group. With increased grade of O.S.M.F and duration of habit, S.G.P.T and S.G.O.T increased, whereas H.D.L and Apo A1 decreased. All the values were statistically significant at p<0.01.
Keywords: apolipoprotein A1, high density lipoprotein, oral submucous fibrosis, serum glutamic oxaloacetic transaminase
Procedia PDF Downloads 324
286 General Architecture for Automation of Machine Learning Practices
Authors: U. Borasi, Amit Kr. Jain, Rakesh, Piyush Jain
Abstract:
Data collection, data preparation, model training, model evaluation, and deployment are all processes in a typical machine learning workflow. Training data needs to be gathered and organised. This often entails collecting a sizable dataset and cleaning it to remove or correct any inaccurate or missing information. Preparing the data for use in the machine learning model requires pre-processing it after it has been acquired. This often entails actions like scaling or normalising the data, handling outliers, selecting appropriate features, reducing dimensionality, etc. This pre-processed data is then used to train a model with some machine learning algorithm. After the model has been trained, it needs to be assessed by determining metrics like accuracy, precision, and recall, utilising a test dataset. Every time a new model is built, both data pre-processing and model training, two crucial processes in the machine learning (ML) workflow, must be carried out. Thus, various machine learning algorithms can be employed for every single approach to data pre-processing, generating a large set of combinations to choose from. For example: for every method of handling missing values (dropping records, replacing with the mean, etc.), for every scaling technique, and for every combination of features selected, a different algorithm can be used. As a result, in order to get the optimum outcomes, these tasks are frequently repeated in different combinations. This paper suggests a simple architecture for organizing this large "combination set of pre-processing steps and algorithms" into an automated workflow, which simplifies the task of carrying out all possibilities.
Keywords: machine learning, automation, AutoML, architecture, operator pool, configuration, scheduler
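The "combination set" the architecture organizes can be sketched as a Cartesian product over the operator pool; the operator names below are illustrative placeholders, not the paper's actual pool:

```python
from itertools import product

# Sketch of the "combination set": every missing-value strategy x scaler x
# algorithm becomes one candidate pipeline a scheduler could then run.
MISSING = ["drop_rows", "mean_impute", "median_impute"]
SCALERS = ["none", "min_max", "z_score"]
ALGORITHMS = ["decision_tree", "svm", "knn", "logistic_regression"]

pipelines = [
    {"missing": m, "scaler": s, "algorithm": a}
    for m, s, a in product(MISSING, SCALERS, ALGORITHMS)
]
```

Even this tiny pool yields 3 × 3 × 4 = 36 candidate pipelines, which is exactly why enumerating and scheduling them by hand becomes impractical and an automated workflow pays off.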
Procedia PDF Downloads 54
285 Automated Monitoring System to Support Investigation of Contributing Factors of Work-Related Disorders and Accidents
Authors: Erika R. Chambriard, Sandro C. Izidoro, Davidson P. Mendes, Douglas E. V. Pires
Abstract:
Work-related illnesses and disorders have been a constant aspect of work. Although their nature has changed over time, from musculoskeletal disorders to illnesses related to psychosocial aspects of work, their impact on the lives of workers remains significant. Despite significant efforts worldwide to protect workers, the disparity between changes in work legislation and actual benefits for workers' health has been creating a significant economic burden for social security and health systems around the world. In this context, this study aims to propose, test, and validate a modular prototype that allows work environmental aspects to be assessed, monitored, and better controlled. The main focus is also to provide a historical record of working conditions and the means for workers to obtain comprehensible and useful information regarding their work environment and the legal limits of occupational exposure to different types of environmental variables, as a means to improve prevention of work-related accidents and disorders. We show the developed prototype provides useful and accurate information regarding work environmental conditions, validating it with standard occupational hygiene equipment. We believe the proposed prototype is a cost-effective and adequate approach to work environment monitoring that could help elucidate the links between work and occupational illnesses, and that different industry sectors, as well as developing countries, could benefit from its capabilities.
Keywords: Arduino prototyping, occupational health and hygiene, work environment, work-related disorders prevention
Procedia PDF Downloads 124
284 SISSLE in Consensus-Based Ripple: Some Improvements in Speed, Security, Last Mile Connectivity and Ease of Use
Authors: Mayank Mundhra, Chester Rebeiro
Abstract:
Cryptocurrencies are rapidly finding wide application in areas such as Real Time Gross Settlement and payment systems. Ripple is a cryptocurrency that has gained prominence with banks and payment providers. It solves the Byzantine Generals Problem with its Ripple Protocol Consensus Algorithm (RPCA), where each server maintains a list of servers, called the Unique Node List (UNL), that represents the network for the server and will not collectively defraud it. The server believes that the network has come to a consensus when members of the UNL come to a consensus on a transaction. In this paper, we improve Ripple to achieve better speed, security, last mile connectivity, and ease of use. We implement guidelines and automated systems for building and maintaining UNLs for resilience, robustness, improved security, and efficient information propagation. We enhance the system so as to ensure that each server receives information from across the whole network rather than just from the UNL members. We also introduce the paradigm of UNL overlap as a function of information propagation and the trust a server assigns to its own UNL. Our design not only reduces vulnerabilities such as eclipse attacks, but also makes it easier to identify malicious behaviour and entities attempting to fraudulently double-spend or stall the system. We provide experimental evidence of the benefits of our approach over the current Ripple scheme. We observe ≥ 4.97x and 98.22x improvements in speedup and success rate for information propagation, respectively, and ≥ 3.16x and 51.70x in speedup and success rate for consensus.
Keywords: Ripple, Kelips, unique node list, consensus, information propagation
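The UNL-overlap paradigm can be illustrated with a simple set-based measure; the formula below is one illustrative interpretation (shared members over the smaller list), not necessarily the exact metric used in the paper:

```python
# Illustrative UNL-overlap measure: the fraction of one server's Unique Node
# List shared with another's. Low overlap between two servers' UNLs is what
# allows them to reach conflicting "consensus" views (fork/eclipse scenarios).
def unl_overlap(unl_a, unl_b):
    a, b = set(unl_a), set(unl_b)
    return len(a & b) / min(len(a), len(b))

overlap = unl_overlap(["n1", "n2", "n3", "n4"], ["n3", "n4", "n5"])
```

Here two of the smaller list's three members are shared, giving an overlap of 2/3; automated UNL maintenance would aim to keep such pairwise overlaps above a safety threshold network-wide.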
Procedia PDF Downloads 144
283 Investigation of Mechanical Properties of Aluminum Tailor Welded Blanks
Authors: Dario Basile, Manuela De Maddis, Raffaella Sesana, Pasquale Russo Spena, Roberto Maiorano
Abstract:
Nowadays, the reduction of CO₂ emissions and the decrease in energy consumption are the main aims of several industries, especially in the automotive sector. To comply with increasingly restrictive regulations, the automotive industry is constantly looking for innovative techniques to produce lighter, more efficient, and less polluting vehicles. One of the latest technologies, still under development, is based on the fabrication of the body-in-white and car parts through the stamping of aluminum Tailor Welded Blanks. Tailor Welded Blanks (TWBs) are generally the combination of two or three metal sheets with different thicknesses and/or mechanical strengths, which are commonly butt-welded together by laser sources. The use of aluminum TWBs has several advantages, such as low density and adequate corrosion resistance. However, their use is still limited by their lower formability with respect to the parent materials and by the intrinsic difficulty of laser welding of aluminum sheets (i.e., internal porosity), which, although its use in automated industries is constantly growing, remains a process to be further developed and improved. This study has investigated the effect of the main laser welding process parameters (laser power, welding speed, and focal distance) on the mechanical properties of aluminum TWBs made of the 6xxx series. The research results show that a narrow weldability window can be found to ensure welded joints with high strength and limited or no porosity.
Keywords: aluminum sheets, automotive industry, laser welding, mechanical properties, tailor welded blanks
Procedia PDF Downloads 106
282 Investigating the Determinants and Growth of Financial Technology Depth of Penetration among the Heterogeneous Africa Economies
Authors: Tochukwu Timothy Okoli, Devi Datt Tewari
Abstract:
The high rate of Fintech adoption has not translated into greater financial inclusion and development in Africa. This problem is attributed to poor Fintech diversification and usefulness on the continent. This concept is referred to as the Fintech depth of penetration in this study. The study therefore assessed its determinants and growth process in a panel of three emerging, twenty-four frontier, and five fragile African economies, disaggregated with dummies over the period 2004-2018 to allow for heterogeneity between groups. The System Generalized Method of Moments (GMM) technique reveals that the average depth of mobile banking and automated teller machine (ATM) usage is a dynamic, heterogeneous process. Moreover, users' previous experience/compatibility, trialability/income, and financial development were the major factors that raise its usefulness, whereas perceived risk, financial openness, and the inflation rate significantly limit its usefulness. The growth rates of mobile banking, ATM, and internet banking in 2018 were, on average, 41.82, 0.4, and 20.8 per cent greater, respectively, than their average rates in 2004. These greater averages after the 2009 financial crisis suggest that countries resort to Fintech as a risk-mitigating tool. This study therefore recommends greater Fintech diversification through improved literacy, institutional development, financial liberalization, and continuous innovation.
Keywords: depth of fintech, emerging Africa, financial technology, internet banking, mobile banking
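The quoted growth figures are plain percentage changes between the 2004 and 2018 averages; for instance, a 41.82 per cent rise reproduces as follows (the base values below are assumed for illustration, not taken from the study):

```python
# Percentage change between two period averages -- the form of the growth
# figures quoted in the abstract. Example values are assumptions.
def percent_growth(value_2004, value_2018):
    return (value_2018 - value_2004) / value_2004 * 100.0

growth = percent_growth(value_2004=50.0, value_2018=70.91)  # ~41.82% rise
```

Any pair of averages in the same 50 : 70.91 ratio yields the same 41.82 per cent figure, which is why the abstract can report growth without disclosing the underlying index levels.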
Procedia PDF Downloads 128
281 Newborn Hearing Screening: Experience from a Center in South part of Iran
Authors: Marzieh Amiri, Zahra Iranpour Mobarakeh, Fatemeh Mehrbakhsh, Mehran Amiri
Abstract:
Introduction: Early diagnosis and intervention of congenital hearing loss are necessary to minimize the adverse effects of hearing loss. The aim of the present study was to report the results of newborn hearing screening in a center in the southern part of Iran, Fasa. Materials and methods: In this study, the data related to 6,144 newborns during September 2018 up to September 2021 were analyzed. Hearing screening was performed using transient evoked otoacoustic emissions (TEOAEs) and automated auditory brainstem response (AABR) tests. Results: Of all 6,144 newborns, 3,752 and 2,392 were referred to the center from the urban and rural parts of Fasa, respectively. There were 2,958 females and 3,186 males in this study. Of the 6,144 newborns, 6,098 passed the screening tests, and 46 neonates were referred to a diagnostic audiology clinic. Finally, nine neonates were diagnosed with congenital hearing loss (seven with sensorineural hearing loss and two with conductive hearing loss). The severity of hearing loss in all the hearing-impaired neonates was moderate or above. The most important risk factors were a family history of hearing loss, low gestational age, NICU hospitalization, and hyperbilirubinemia. Conclusion: Our results showed that the prevalence of hearing loss was 1.46 per 1,000 infants. Boosting public knowledge by providing families with proper education appears to be helpful in preventing the negative effects of delayed implementation of hearing screening programs.
Keywords: newborn hearing screening, hearing loss, risk factor, prevalence
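The reported prevalence follows directly from the screening counts (nine confirmed cases out of 6,144 newborns screened):

```python
# Prevalence as confirmed cases per 1000 newborns screened, reproducing the
# figure reported in the abstract from its own counts.
def prevalence_per_1000(cases, screened):
    return cases / screened * 1000.0

rate = prevalence_per_1000(cases=9, screened=6144)
```

9 / 6144 × 1000 gives 1.46 per 1,000 after rounding, matching the reported value.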
Procedia PDF Downloads 161
280 Enhanced Model for Risk-Based Assessment of Employee Security with Bring Your Own Device Using Cyber Hygiene
Authors: Saidu I. R., Shittu S. S.
Abstract:
As the trend of personal devices accessing corporate data continues to rise through Bring Your Own Device (BYOD) practices, organizations recognize the potential cost reduction and productivity gains. However, the associated security risks pose a significant threat to these benefits. Often, organizations adopt BYOD environments without fully considering the vulnerabilities introduced by human factors in this context. This study presents an enhanced assessment model that evaluates the security posture of employees in BYOD environments using cyber hygiene principles. The framework assesses users' adherence to best practices and guidelines for maintaining a secure computing environment, employing rating scales and the Euclidean distance formula. Using this algorithm, the study measures the distance between users' security practices and the organization's optimal security policies. To facilitate user evaluation, a simple and intuitive interface for automated assessment is developed. To validate the effectiveness of the proposed framework, design science research methods are employed, and empirical assessments are conducted using five artifacts to analyze user suitability in BYOD environments. By addressing human-factor vulnerabilities through the assessment of cyber hygiene practices, this study aims to enhance the overall security of BYOD environments and enable organizations to leverage the advantages of this evolving trend while mitigating potential risks.
Keywords: security, BYOD, vulnerability, risk, cyber hygiene
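The Euclidean-distance scoring can be sketched as follows; the practice dimensions, the 1-5 rating scale, and the example vectors are assumptions made for illustration, not the study's actual instrument:

```python
import math

# Sketch of the Euclidean-distance scoring: each dimension is one cyber-hygiene
# practice rated on an assumed 1-5 scale, and the user's vector is compared to
# the organisation's optimal-policy vector. A larger distance means a worse
# security posture relative to policy.
def hygiene_distance(user_scores, optimal_scores):
    return math.sqrt(sum((u - o) ** 2 for u, o in zip(user_scores, optimal_scores)))

# Hypothetical dimensions: patching, password strength, encryption, backups.
optimal = [5, 5, 5, 5]
user = [4, 3, 5, 2]
distance = hygiene_distance(user, optimal)
```

Here the squared gaps are 1 + 4 + 0 + 9 = 14, so the user sits sqrt(14) ≈ 3.74 units from the optimal policy; thresholding such distances is one way to flag which BYOD users need intervention first.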
Procedia PDF Downloads 74279 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services
Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme
Abstract:
Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing
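One way to picture the staged pipeline described above is as a chain of small functions; the sketch below is purely illustrative (the regex-based entity step is a toy stand-in for the computer-vision and NLP components, and all names are invented):

```python
import re

# Illustrative sketch of a staged extraction pipeline; the regex-based "entity
# detection" stands in for the ML components described in the abstract.
def ingest(raw: str) -> str:
    return raw.strip()

def detect_entities(text: str):
    # Toy stand-in: pick up ticker-like tokens and currency amounts.
    tickers = re.findall(r"\b[A-Z]{2,5}\b", text)
    amounts = re.findall(r"\$[\d,.]+", text)
    return {"tickers": tickers, "amounts": amounts}

def to_record(entities, schema=("tickers", "amounts")):
    # Schema-based record extraction: keep only fields the schema asks for.
    return {field: entities.get(field, []) for field in schema}

doc = "  IBM reported revenue of $14,752 million for the quarter  "
record = to_record(detect_entities(ingest(doc)))
print(record)  # → {'tickers': ['IBM'], 'amounts': ['$14,752']}
```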
Procedia PDF Downloads 109278 Effects of Using Alternative Energy Sources and Technologies to Reduce Energy Consumption and Expenditure of a Single Detached House
Authors: Gul Nihal Gugul, Merih Aydinalp-Koksal
Abstract:
In this study, an hourly energy consumption model of a single detached house in Ankara, Turkey is developed using the ESP-r building energy simulation software. Natural gas is used for space heating, cooking, and domestic water heating in this two-story, 4,500-square-foot, four-bedroom home. Hourly electricity consumption of the home is monitored by an automated meter reading system, and daily natural gas consumption is recorded by the owners during 2013. Climate data of the region and building envelope data are used to develop the model. The heating energy consumption of the house estimated by the ESP-r model is then compared with the actual heating demand to determine the performance of the model. Scenarios are applied to the model to determine the amount of reduction in the total energy consumption of the house. The scenarios are using photovoltaic panels to generate electricity, ground-source heat pumps for space heating, and solar panels for domestic hot water generation. Alternative scenarios, such as improving wall and roof insulation and window glazing, are also applied. These scenarios are evaluated based on annual energy, associated CO2 emissions, and fuel expenditure savings. The payback periods for each scenario are also calculated to determine the best alternative energy source or technology option for this home to reduce annual energy use and CO2 emissions.Keywords: ESP-r, building energy simulation, residential energy saving, CO2 reduction
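A simple-payback comparison of the kind described can be sketched as follows; the capital costs and annual savings are hypothetical placeholders, not figures from the study:

```python
# Simple-payback sketch for ranking retrofit scenarios; all costs and annual
# fuel-cost savings below are hypothetical, not values from the study.
def payback_years(capital_cost, annual_saving):
    return capital_cost / annual_saving

scenarios = {
    "PV panels":               (12000, 900),   # (capital cost, annual saving)
    "ground-source heat pump": (18000, 1500),
    "solar hot water":         (4000, 400),
    "wall/roof insulation":    (6000, 800),
}

# Rank scenarios from shortest to longest payback period.
ranked = sorted(scenarios, key=lambda s: payback_years(*scenarios[s]))
for name in ranked:
    print(name, round(payback_years(*scenarios[name]), 1))
```

With these assumed numbers, insulation pays back first (7.5 years); in practice the ranking follows whatever cost and savings estimates each scenario produces.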
Procedia PDF Downloads 196277 Two Years Retrospective Study of Body Fluid Cultures Obtained from Patients in the Intensive Care Unit of General Hospital of Ioannina
Authors: N. Varsamis, M. Gerasimou, P. Christodoulou, S. Mantzoukis, G. Kolliopoulou, N. Zotos
Abstract:
Purpose: Body fluids (pleural, peritoneal, synovial, pericardial, cerebrospinal) are an important element in the detection of microorganisms. For this reason, it is important to examine them in Intensive Care Unit (ICU) patients. Material and Method: Body fluids are transported in sterile containers and enriched as soon as possible with Tryptic Soy Broth (TSB). After one day of incubation, the broth is poured onto selective media: Blood, MacConkey No. 2, Chocolate, Mueller-Hinton, Chapman, and Sabouraud agar. These selective media are incubated for 2 days. After this period, if any microbial colonies are detected, Gram staining is performed. The isolated organisms are then identified by biochemical techniques in the automated MicroScan system (Siemens), followed by a sensitivity test on the same system using the minimum inhibitory concentration (MIC) technique. The sensitivity test is verified by a Kirby-Bauer-based plate test. Results: In 2017, the Laboratory of Microbiology received 60 samples of body fluids from the ICU: 6 peritoneal fluid specimens, 18 pleural fluid specimens, and 36 cerebrospinal fluid specimens. Thirty-six cultures were positive. S. epidermidis was identified in 18 specimens, S. haemolyticus in 6, and E. faecium in 12. Conclusions: The results show a low detection rate of microorganisms in body fluid cultures.Keywords: body fluids, culture, intensive care unit, microorganisms
Procedia PDF Downloads 200276 Mastering Test Automation: Bridging Gaps for Seamless QA
Authors: Rohit Khankhoje
Abstract:
The rapid evolution of software development practices has given rise to an increasing demand for efficient and effective test automation. The paper titled "Mastering Test Automation: Bridging Gaps for Seamless QA" delves into the crucial aspects of test automation, addressing the obstacles faced by organizations in achieving flawless quality assurance. The paper highlights the importance of bridging knowledge gaps within organizations, emphasizing the necessity for management to acquire a deeper comprehension of test automation scenarios, coverage, report trends, and the importance of communication. To tackle these challenges, this paper introduces innovative solutions, including the development of an automation framework that seamlessly integrates with test cases and reporting tools like TestRail and Jira. This integration facilitates the automatic recording of bugs in Jira, enhancing bug reporting and communication between manual QA and automation teams, and ensures that TestRail reflects all newly added automated test cases as soon as they become part of the automation suite. The paper demonstrates how this framework empowers management by providing clear insights into ongoing automation activities, bug origins, trend analysis, and test case specifics. "Mastering Test Automation" serves as a comprehensive guide for organizations aiming to enhance their quality assurance processes through effective test automation. It not only identifies the common pitfalls and challenges but also offers practical solutions to bridge the gaps, resulting in a more streamlined and efficient QA process.Keywords: automation framework, API integration, test automation, test management tools
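An integration of this kind typically files bugs through Jira's REST issue-creation endpoint; the sketch below only builds the request payload without sending it, and the project key, summary, and instance URL are hypothetical placeholders:

```python
import json

# Builds (but does not send) a Jira REST API issue payload, as a framework like
# the one described might do when an automated test fails. The project key,
# summary text, and base URL are hypothetical.
def build_bug_payload(project_key, summary, description):
    return {
        "fields": {
            "project": {"key": project_key},
            "summary": summary,
            "description": description,
            "issuetype": {"name": "Bug"},
        }
    }

payload = build_bug_payload(
    "QA", "Login test failed on build 142", "AssertionError in test_login_redirect"
)
endpoint = "https://example.atlassian.net/rest/api/2/issue"  # assumed base URL
body = json.dumps(payload)  # what would be POSTed to the endpoint
print(payload["fields"]["issuetype"]["name"])  # → Bug
```

In a live framework this payload would be POSTed with authentication, and the returned issue key would be written back into the test report for traceability.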
Procedia PDF Downloads 72275 Non-Destructive Testing of Selective Laser Melting Products
Authors: Luca Collini, Michele Antolotti, Diego Schiavi
Abstract:
At present, complex geometries, shrinking production times, rapidly increasing demand, and high quality-standard requirements make non-destructive (ND) control of additively manufactured components an indispensable means of quality assurance. On the other hand, a technology gap and the lack of standards regulating the methods and the acceptance criteria make the NDT of these components a stimulating field still to be fully explored. To date, penetrant testing, acoustic wave, tomography, radiography, and semi-automated ultrasound methods have been tried on metal-powder-based products. External defects, distortion, surface porosity, roughness, texture, internal porosity, and inclusions are the typical defects in the focus of testing. Detection of density and layer compactness has also been attempted on stainless steels by the ultrasonic scattering method. In this work, the authors want to present and discuss the radiographic and ultrasound ND testing of additively manufactured Ti₆Al₄V and Inconel parts obtained by the selective laser melting (SLM) technology. In order to test the possibilities given by the radiographic method, both X-rays and γ-rays are tried on a set of specifically designed specimens realized by SLM. The specimens contain a family of defects representing the most commonly found types, such as cracks and lack of fusion. The tests are also applied to real parts of various complexity and thickness. A set of practical indications and of acceptance criteria is finally drawn.Keywords: non-destructive testing, selective laser melting, radiography, UT method
Procedia PDF Downloads 145274 Stimulation of Stevioside Accumulation on Stevia rebaudiana (Bertoni) Shoot Culture Induced with Red LED Light in TIS RITA® Bioreactor System
Authors: Vincent Alexander, Rizkita Esyanti
Abstract:
Leaves of Stevia rebaudiana contain steviol glycoside which mainly comprise of stevioside, a natural sweetener compound that is 100-300 times sweeter than sucrose. Current cultivation method of Stevia rebaudiana in Indonesia has yet to reach its optimum efficiency and productivity to produce stevioside as a safe sugar substitute sweetener for people with diabetes. An alternative method that is not limited by environmental factor is in vitro temporary immersion system (TIS) culture method using recipient for automated immersion (RITA®) bioreactor. The aim of this research was to evaluate the effect of red LED light induction towards shoot growth and stevioside accumulation in TIS RITA® bioreactor system, as an endeavour to increase the secondary metabolite synthesis. The result showed that the stevioside accumulation in TIS RITA® bioreactor system induced with red LED light for one hour during night was higher than that in TIS RITA® bioreactor system without red LED light induction, i.e. 71.04 ± 5.36 μg/g and 42.92 ± 5.40 μg/g respectively. Biomass growth rate reached as high as 0.072 ± 0.015/day for red LED light induced TIS RITA® bioreactor system, whereas TIS RITA® bioreactor system without induction was only 0.046 ± 0.003/day. Productivity of Stevia rebaudiana shoots induced with red LED light was 0.065 g/L medium/day, whilst shoots without any induction was 0.041 g/L medium/day. Sucrose, salt, and inorganic consumption in both bioreactor media increased as biomass increased. It can be concluded that Stevia rebaudiana shoot in TIS RITA® bioreactor induced with red LED light produces biomass and accumulates higher stevioside concentration, in comparison to bioreactor without any light induction.Keywords: LED, Stevia rebaudiana, Stevioside, TIS RITA
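Growth rates of the "per day" form reported here are conventionally computed as the specific growth rate μ = ln(W_t/W_0)/t; a minimal sketch, with assumed initial and final fresh weights, might be:

```python
import math

# Specific growth rate mu = ln(W_t / W_0) / t, the usual form behind
# "per day" growth rates like those reported; the weights below are
# assumed example values, not measurements from the study.
def specific_growth_rate(w0, wt, days):
    return math.log(wt / w0) / days

w0, wt, days = 0.40, 1.82, 21   # assumed initial/final fresh weights (g), 21-day run
mu = specific_growth_rate(w0, wt, days)
print(round(mu, 3))  # → 0.072
```

With these assumed weights the formula reproduces a rate of the same magnitude as the 0.072/day reported for the red-LED-induced system.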
Procedia PDF Downloads 370273 An Automated Approach to the Nozzle Configuration of Polycrystalline Diamond Compact Drill Bits for Effective Cuttings Removal
Authors: R. Suresh, Pavan Kumar Nimmagadda, Ming Zo Tan, Shane Hart, Sharp Ugwuocha
Abstract:
Polycrystalline diamond compact (PDC) drill bits are extensively used in the oil and gas industry as well as the mining industry. Industry engineers continually improve upon PDC drill bit designs and hydraulic conditions. Optimized injection nozzles play a key role in improving the drilling performance and efficiency of these ever-changing PDC drill bits. In the first part of this study, computational fluid dynamics (CFD) modelling is performed to investigate the hydrodynamic characteristics of drilling fluid flow around the PDC drill bit. An open-source CFD software package, OpenFOAM, simulates the flow around the drill bit based on the field input data. A specifically developed console application integrates the entire CFD process, including domain extraction, meshing, solving the governing equations, and post-processing. The results from the OpenFOAM solver are then compared with those of the ANSYS Fluent software, and the data from both software programs agree. The second part of the paper describes the parametric study of the PDC drill bit nozzle to determine the effect of parameters such as the number of nozzles, nozzle velocity, nozzle radial position, and orientation on the flow field characteristics and bit washing patterns. After analyzing a series of nozzle configurations, the best configuration is identified and recommendations are made for modifying the PDC bit design.Keywords: ANSYS Fluent, computational fluid dynamics, nozzle configuration, OpenFOAM, PDC drill bit
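A console application that drives the OpenFOAM toolchain might chain the standard utilities as below; this dry-run sketch only assembles the command sequence (blockMesh, snappyHexMesh, simpleFoam, and postProcess are real OpenFOAM tools, but the case directory, solver choice, and post-processing function are assumptions, not details from the paper):

```python
# Dry-run sketch of a console app chaining standard OpenFOAM utilities.
# Nothing is executed here; the function only builds the command list a
# wrapper could hand to subprocess.run one step at a time.
def build_cfd_pipeline(case_dir, solver="simpleFoam"):
    steps = [
        ["blockMesh", "-case", case_dir],                      # background mesh
        ["snappyHexMesh", "-case", case_dir, "-overwrite"],    # mesh around the bit
        [solver, "-case", case_dir],                           # solve the flow field
        ["postProcess", "-case", case_dir, "-func", "wallShearStress"],
    ]
    return steps

pipeline = build_cfd_pipeline("pdcBitCase")  # hypothetical case directory
for cmd in pipeline:
    print(" ".join(cmd))
```

Keeping the steps as data rather than hard-coded shell calls makes it easy to swap the solver or insert extra utilities, which is essentially what an integrating console application provides.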
Procedia PDF Downloads 420272 Indian Business-Papers in Industrial Revolution 4.0: A Paradigm Shift
Authors: Disha Batra
Abstract:
The Industrial Revolution 4.0 is quite different, and a paradigm shift is underway in the media industry. With the advent of automated journalism and social media platforms, newspaper organizations have changed the way news is gathered and reported. The emergence of the fourth industrial revolution in the early 21st century has made newspapers adapt to changing technologies to remain relevant. This paper investigates the content of Indian business-papers in the era of the fourth industrial revolution and how these organizations have emerged in the time of convergence. The study is a content analysis of the top three Indian business dailies, as per the IRS (Indian Readership Survey) 2017, over a decade. A parametric analysis of different parameters (source of information, use of illustrations, advertisements, layout, framing, etc.) has been carried out in order to identify the distinct adaptations and modifications made by these dailies. The paper also dwells upon a thematic analysis of these newspapers in order to explore the coverage given to various sub-themes of EBF (economic, business, and financial) journalism. Further, this study reveals the effect of high-speed algorithm-based trading and the aftermath of the fourth industrial revolution on the creative and investigative aspects of delivering financial stories in these respective newspapers. The study indicates an ongoing paradigm shift in the business newspaper industry, with a clear change in the sources of information gathering along with a subtle increase in the coverage of financial news stories over time.Keywords: business-papers, business news, financial news, industrial revolution 4.0.
Procedia PDF Downloads 114271 Measuring the Effect of Co-Composting Oil Sludge with Pig, Cow, Horse And Poultry Manures on the Degradation in Selected Polycyclic Aromatic Hydrocarbons Concentrations
Authors: Ubani Onyedikachi, Atagana Harrison Ifeanyichukwu, Thantsha Mapitsi Silvester
Abstract:
Components of oil sludge (PAHs) are known cytotoxic, mutagenic, and potentially carcinogenic compounds; bacteria and fungi, however, have been found to degrade PAHs to innocuous compounds. This study is aimed at measuring the effect of pig, cow, horse, and poultry manures on the degradation of selected PAHs present in oil sludge. Soil spiked with oil sludge was co-composted separately with each manure in a ratio of 2:1 (w/w) spiked soil:manure and with wood chips in a ratio of 2:1 (w/v) spiked soil:wood chips. A control was set up in the same way but without manure. The mixtures were incubated for 10 months at room temperature. Compost piles were turned weekly, and the moisture level was maintained between 50% and 70%. Moisture level, pH, temperature, CO2 evolution, and oxygen consumption were measured monthly, and the ash content at the end of the experiment. The highest temperature reached was 27.5 °C in all compost heaps, pH ranged from 5.5 to 7.8, and CO2 evolution was highest in the poultry manure treatment at 18.78 μg/dwt/day. Microbial growth and activities were enhanced; the bacteria identified were Bacillus, Arthrobacter, and Staphylococcus species. The percentage reduction in PAHs was measured using an automated Soxhlet extractor with dichloromethane, coupled with gas chromatography/mass spectrometry (GC/MS). Results from the PAH measurements showed reductions between 77% and 99%. Co-composting of spiked soils with animal manures enhanced the reduction in PAHs.Keywords: animal manures, bioremediation, co-composting, oil refinery sludge, PAHs
Procedia PDF Downloads 268270 GA3C for Anomalous Radiation Source Detection
Authors: Chia-Yi Liu, Bo-Bin Xiao, Wen-Bin Lin, Hsiang-Ning Wu, Liang-Hsun Huang
Abstract:
In order to reduce the risk of radiation damage that personnel may suffer during operations in a radiation environment, the use of automated guided vehicles to assist or replace on-site personnel has become a key technology and an important trend. In this paper, we demonstrate a proof of concept for an autonomous, self-learning radiation source searcher in an unknown environment without a map. The research uses the GPU version of the Asynchronous Advantage Actor-Critic network (GA3C), a deep reinforcement learning method, to search for radiation sources. The searcher network, based on the GA3C architecture, learned in a self-directed manner how to search for an anomalous radiation source by training for 1 million episodes across three simulation environments. In each training episode, the radiation source position, the radiation source intensity, and the starting position are all set randomly in one simulation environment. The input to the searcher network is the fused data from a 2D laser scanner and an RGB-D camera, as well as the reading of the radiation detector. The output actions are the linear and angular velocities. The searcher network is trained in a simulation environment to accelerate the learning process. The well-performing searcher network is deployed to a real unmanned vehicle, a Dashgo E2, which mounts a YDLIDAR G4 LIDAR, an Intel D455 RGB-D camera, and a radiation detector made by the Institute of Nuclear Energy Research. In the field experiment, the unmanned vehicle is able to search out an 18.5 MBq Na-22 radiation source by itself and avoid obstacles simultaneously without human interference.Keywords: deep reinforcement learning, GA3C, source searching, source detection
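At the heart of any advantage actor-critic method, GA3C included, is the advantage estimate A_t = R_t − V(s_t) built from discounted returns; the sketch below shows that bookkeeping with made-up rewards and critic values, not data from the study:

```python
# Discounted returns and advantages, the quantities an actor-critic learner
# (GA3C included) uses to weight its policy updates; the rewards and value
# estimates below are made-up numbers for illustration.
def discounted_returns(rewards, gamma=0.99, bootstrap=0.0):
    returns, running = [], bootstrap
    for r in reversed(rewards):
        running = r + gamma * running   # R_t = r_t + gamma * R_{t+1}
        returns.append(running)
    return list(reversed(returns))

rewards = [0.0, 0.0, 1.0]          # e.g. reward only on finding the source
values  = [0.5, 0.6, 0.8]          # critic's value estimates V(s_t)

returns = discounted_returns(rewards)
advantages = [g - v for g, v in zip(returns, values)]
print([round(g, 4) for g in returns])     # → [0.9801, 0.99, 1.0]
print([round(a, 4) for a in advantages])  # → [0.4801, 0.39, 0.2]
```

Positive advantages reinforce the actions taken (e.g. the velocity commands that moved the vehicle toward the source); GA3C's contribution is batching these updates on a GPU across many parallel episodes.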
Procedia PDF Downloads 113269 Evaluation of the Benefit of Anti-Endomysial IgA and Anti-Tissue Transglutaminase IgA Antibodies for the Diagnosis of Coeliac Disease in a University Hospital, 2010-2016
Authors: Recep Keşli, Onur Türkyılmaz, Hayriye Tokay, Kasım Demir
Abstract:
Objective: Coeliac disease (CD) is a primary small intestine disorder caused by a high sensitivity to gluten, which is present in cereal crops, and is characterized by inflammation of the small intestine mucosa. The goal of this study was to determine and compare the sensitivity and specificity values of anti-endomysial IgA (EMA IgA) (IFA) and anti-tissue transglutaminase IgA (anti-tTG IgA) (ELISA) antibodies in the diagnosis of patients suspected of having CD. Methods: One thousand two hundred seventy-three patients, who applied to the gastroenterology and pediatric disease polyclinics of Afyon Kocatepe University ANS Research and Practice Hospital, were included in the study between 23.09.2010 and 30.05.2016. Serum samples were investigated for EMA positivity by the immunofluorescence method (Euroimmun, Luebeck, Germany). In order to determine the quantitative value of anti-tTG IgA, an EIA kit (Orgentec, Mainz, Germany) on a fully automated ELISA device (Alisei, Seac, Firenze, Italy) was used. Results: Out of 1,273 patients, 160 were diagnosed with coeliac disease according to the ESPGHAN 2012 diagnostic criteria. Of the 160 CD patients, 120 were female and 40 were male. The EMA specificity and sensitivity were calculated as 98% and 80%, respectively. The specificity and sensitivity of anti-tTG IgA were determined as 99% and 96%, respectively. Conclusion: The specificity of EMA for CD was excellent because all EMA-positive patients (n = 144) were diagnosed with CD. The presence of human anti-tTG IgA was found to be a reliable marker for the diagnosis and follow-up of CD. The diagnosis of CD should be established on both clinical and serologic profiles together.Keywords: anti-endomysial antibody, anti-tTG IgA, coeliac disease, immunofluorescence assay (IFA)
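Sensitivity and specificity follow from confusion-matrix counts; the sketch below uses assumed counts chosen to be consistent with the reported EMA figures (80% sensitivity, 98% specificity), not the study's raw data:

```python
# Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP); the counts here are
# illustrative, chosen to match the reported EMA percentages, not raw data.
def sens_spec(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

# Assumed example: 160 diseased and 1113 non-diseased patients, with the test
# catching 128 of the diseased and wrongly flagging 22 of the healthy.
tp, fn = 128, 32
tn, fp = 1091, 22

sensitivity, specificity = sens_spec(tp, fn, tn, fp)
print(round(sensitivity * 100))  # → 80
print(round(specificity * 100))  # → 98
```

Specificity is computed over the non-diseased group, which is why a test can have excellent specificity (few false positives) while still missing a fifth of true cases.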
Procedia PDF Downloads 253268 Feature Analysis of Predictive Maintenance Models
Authors: Zhaoan Wang
Abstract:
Research in predictive maintenance modeling has improved in recent years to predict failures and needed maintenance with high accuracy, saving cost and improving manufacturing efficiency. However, classic prediction models provide little valuable insight into the most important features contributing to failure. By analyzing and quantifying feature importance in predictive maintenance models, cost saving can be optimized based on business goals. First, multiple classifiers are evaluated with cross-validation to predict the multiple classes of failures. Second, predictive performance with features provided by different feature selection algorithms is further analyzed. Third, features selected by different algorithms are ranked and combined based on their predictive power. Finally, the linear explainer SHAP (SHapley Additive exPlanations) is applied to interpret classifier behavior and provide further insight into the specific roles of features in both local predictions and global model behavior. The results of the experiments suggest that certain features play dominant roles in predictive models while others have significantly less impact on the overall performance. Moreover, for multi-class prediction of machine failures, the most important features vary with the type of machine failure. The results may lead to improved productivity and cost saving by prioritizing sensor deployment, data collection, and data processing of more important features over less important ones.Keywords: automated supply chain, intelligent manufacturing, predictive maintenance machine learning, feature engineering, model interpretation
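For a linear model, SHAP values take a closed form: under a feature-independence assumption, φ_i = w_i(x_i − E[x_i]), so each feature's contribution at a point is its weight times its deviation from the background mean. A minimal sketch with made-up weights and data (not the study's model):

```python
# For a linear model f(x) = b + sum_i w_i * x_i, the (interventional) SHAP
# value of feature i at point x is w_i * (x_i - mean_i). The weights, instance,
# and background means below are made-up illustrative numbers.
def linear_shap(weights, x, feature_means):
    return [w * (xi - m) for w, xi, m in zip(weights, x, feature_means)]

weights       = [2.0, -0.5, 0.0]   # e.g. vibration, temperature, machine age
x             = [3.0, 4.0, 7.0]    # one instance to explain
feature_means = [1.0, 5.0, 5.0]    # background means E[x_i]

phi = linear_shap(weights, x, feature_means)
print(phi)  # → [4.0, 0.5, 0.0]
# The contributions sum to f(x) - f(E[x]), the additivity property SHAP guarantees.
```

Ranking features by the mean absolute value of φ_i across instances yields the global importance ordering used to prioritize sensors and data collection.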
Procedia PDF Downloads 131267 Biomass Enhancement of Stevia (Stevia rebaudiana Bertoni) Shoot Culture in Temporary Immersion System (TIS) RITA® Bioreactor Optimized in Two Different Immersion Periods
Authors: Agustine Melviana, Rizkita Esyanti
Abstract:
The stevia plant contains steviol glycosides, which are estimated to be 300 times sweeter than sucrose. In Indonesia, however, conventional (in vivo) propagation of Stevia rebaudiana has not been effective due to poor results. Therefore, an alternative method to propagate S. rebaudiana is needed, one of which is in vitro culture. Multiplication of a large quantity of stevia biomass in a relatively short period can be achieved using the TIS RITA® (Recipient for Automated Temporary Immersion System) bioreactor. The objective of this study was to evaluate the effect of the medium immersion period on growth and on the bioconversion of the medium into shoot biomass. Shoot cultures of S. rebaudiana were grown in full-strength MS medium supplemented with 1 ppm kinetin. RITA® bioreactors were set up with two different immersion periods, 15 min (RITA® 15) and 30 min (RITA® 30), scheduled every 6 hours, and incubated for 21 days. The results indicated that the immersion period affected the biomass and growth rate (µ). The 30-minute immersion showed a greater percentage of shoot multiplication (93.44 ± 0.83%), percentage of leaf growth (85.24 ± 5.99%), growth rate (0.042 ± 0.001 g/day), and productivity (0.066 g/L medium/day) compared to RITA® 15 (76.90 ± 4.85%, 79.73 ± 7.76%, 0.045 ± 0.004 g/day, and 0.045 g/L medium/day, respectively). The biomass gain in RITA® 30 reached 1.702 ± 0.114 g, whereas in RITA® 15 it was only 0.953 ± 0.093 g. Additionally, the pattern of sucrose, mineral, and inorganic compound consumption followed the growth of plant biomass in both systems. In conclusion, the bioconversion efficiency from medium to biomass is better in RITA® 30 than in RITA® 15.Keywords: intensity period, shoot culture, Stevia rebaudiana, TIS RITA®
Procedia PDF Downloads 249