Search results for: multiple data
26423 Polarization of Glass with Positive and Negative Charge Carriers
Authors: Valentina V. Zhurikhina, Mihail I. Petrov, Alexandra A. Rtischeva, Mark Dussauze, Thierry Cardinal, Andrey A. Lipovskii
Abstract:
Polarization of glass, often referred to as thermal poling, is a well-known method of modifying the physical and chemical properties of glass; it manifests itself in the loss of the medium's central symmetry and in modification of the glass structure and refractive index. The use of poling for second optical harmonic generation and for the fabrication of optical waveguides and electro-optic modulators has also been reported. Nevertheless, a detailed description of the poling of glasses containing multiple charge carriers is still under discussion. In particular, the role of possible electron migration in space charge formation usually remains unconsidered. In this work, we performed a numerical simulation of the thermal poling of a silicate glass containing Na, K, Mg, and Ca, taking into consideration the contribution of electrons to the polarization process. A possible mechanism for electron migration is the breaking of non-bridging oxygen bonds. We found that the modeled depth of the space charge region is about 10 times greater when the migration of negative charges is taken into consideration. The simulated profiles of the cations participating in the polarization process are in good agreement with experimental data obtained by glow discharge spectroscopy.
Keywords: glass poling, charge transport, modeling, concentration profiles
Procedia PDF Downloads 359
26422 Factors Influencing Respectful Perinatal Care Among Healthcare Professionals in Low- and Middle-Resource Countries: A Systematic Review
Authors: Petronella Lunda, Catharina Susanna Minnie, Welma Lubbe
Abstract:
Background: This review aimed to provide healthcare professionals with a scientific summary of the best available research evidence on factors influencing respectful perinatal care. The review question was: 'What are the perceptions of midwives and doctors on factors that influence respectful perinatal care?' Methods: A detailed search was done on electronic databases: EBSCOhost (Medline), OAIster, Scopus, SciELO, Science Direct, PubMed, PsycINFO, and SocINDEX. The databases were searched for available literature using a predetermined search strategy, and the reference lists of included studies were analysed to identify studies missing from the databases. The phenomenon of interest was factors influencing maternity care practices according to midwives and doctors. Predetermined inclusion and exclusion criteria were used during the selection of potential studies. In total, 13 studies were included in the data analysis and synthesis, from which three themes and nine sub-themes were identified. Results: Studies conducted in various settings were included, and multiple factors influencing respectful perinatal care were identified. During data synthesis, three themes emerged: healthcare institution, healthcare professional, and women-related factors. The corresponding sub-themes were human resources, medical supplies, norms and practices, physical infrastructure, healthcare professional competencies and attributes, and women's knowledge and preferences. These three factors influence the provision of respectful perinatal care, and addressing them might improve it. Conclusion: Addressing the factors that influence respectful perinatal care is vital to preventing compromised patient care during the perinatal period, as these factors have the potential to accelerate or hinder the provision of respectful care.
Keywords: doctors, maternity care, midwives, obstetrician, perceptions, perinatal care, respectful care
Procedia PDF Downloads 222
26421 Prosperous Digital Image Watermarking Approach by Using DCT-DWT
Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar
Abstract:
Every day, tons of data are embedded in digital media or distributed over the internet. The data are distributed in such a way that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information, known as a watermark, into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image, or text that is impressed onto paper and provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this work, we have concentrated on embedding watermarks in images. The main consideration for any watermarking scheme is its robustness to various attacks.
Keywords: watermarking, digital, DCT-DWT, security
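The abstract gives no implementation details, but the general idea of transform-domain embedding can be illustrated with a minimal sketch: a single-level 2-D Haar DWT (the DCT stage of a full DCT-DWT scheme is omitted for brevity), with one watermark bit embedded in an LL coefficient via quantization index modulation. All function names below are hypothetical.

```python
def haar_fwd(v):
    # single-level 1-D Haar transform: pairwise averages, then differences
    a = [(v[i] + v[i + 1]) / 2 for i in range(0, len(v), 2)]
    d = [(v[i] - v[i + 1]) / 2 for i in range(0, len(v), 2)]
    return a + d

def haar_inv(w):
    # exact inverse of haar_fwd
    h, out = len(w) // 2, []
    for a, d in zip(w[:h], w[h:]):
        out += [a + d, a - d]
    return out

def dwt2(img):
    # rows, then columns; LL sub-band ends up in the top-left corner
    rows = [haar_fwd(r) for r in img]
    cols = [haar_fwd(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

def idwt2(coef):
    cols = [haar_inv(list(c)) for c in zip(*coef)]
    rows = [list(r) for r in zip(*cols)]
    return [haar_inv(r) for r in rows]

def embed_bit(img, bit, q=4.0):
    # quantization index modulation on one LL coefficient
    c = dwt2(img)
    v = c[0][0]
    c[0][0] = 2 * q * round((v - bit * q) / (2 * q)) + bit * q
    return idwt2(c)

def extract_bit(img, q=4.0):
    return round(dwt2(img)[0][0] / q) % 2
```

Because the Haar transform is exactly invertible, the embedded bit survives the round trip; a practical DCT-DWT scheme would spread many bits over mid-frequency coefficients for robustness to attacks.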
Procedia PDF Downloads 422
26420 Contemporary Art of Healing: New Generation of Shamanism Ritual
Authors: Yeaeun Jang
Abstract:
Shamanism has steadily been reinterpreted as research and art from cult, superstition, mysticism, and historical perspectives. It has existed throughout the five-thousand-year history of Korea, and it is still actively practised; it is interesting to observe how this tradition has had a profound impact on Korea's current high-technology society. Many people still ask shamans for advice and rituals to solve their problems. Historically, Korean shamanism has strong connections and many similarities with Mongolian and Eastern Siberian shamanism. 'God' is 'Nature', and a 'shaman' is a 'mediator of communication chosen by God', a divine being who has entered the mysterious realm by challenging human limitations through harsh training. In ancient society, a shaman was both the leader of a group and an entertainer who played various roles: king, counsellor, doctor, singer, dancer, painter, and performer. This artistic research focuses on the shaman's role as an artist working in multiple mediums, reconstructing the ancient ritual into a multimedia performing art that attempts to deal with traumatic memories in one's life. This fusion style of contemporary ritual is mainly inspired by 'Gut (굿)', the Korean shamanism ritual. This comprehensive art requires several important elements: a shaman, a client, musicians, helpers, and the audience; it is a feast that gathers people in a big circle. Nowadays art has been divided into separate, specialised fields, but there once existed an art of synesthesia whose boundaries were unclear and which was not confined to a single medium for expressing abstract ideas: multiple disciplines coexisted and harmonised with each other. Studying the shamanism ritual as an ancient form of performing art can create a warm, spiritual feast for everyone and remind us of 'togetherness'.
Keywords: healing, multimedia art, performance art, shamanism, spirituality
Procedia PDF Downloads 100
26419 Machine Learning Data Architecture
Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap
Abstract:
Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and vending of model output into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists' bandwidth. We propose a system architecture, created using AWS services, that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists' work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.
Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning
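The AWS-specific services and infrastructure-as-code templates are not reproduced in the abstract; purely as a language-neutral illustration of the staged hand-off such an architecture describes (sourcing → feature engineering → training → vending), here is a toy pipeline runner. The class and stage names are hypothetical, not the authors' implementation.

```python
class BatchPipeline:
    """Minimal sketch: each stage's output feeds the next stage."""

    def __init__(self):
        self.stages = []  # list of (name, callable) pairs

    def stage(self, name, fn):
        # register a stage; returns self so stages can be chained
        self.stages.append((name, fn))
        return self

    def run(self, payload):
        # execute the stages in registration order
        for name, fn in self.stages:
            payload = fn(payload)
        return payload
```

In a production setting each stage would be a separately deployed, monitored component (e.g., a scheduled job writing to a data store) rather than an in-process function.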
Procedia PDF Downloads 64
26418 DISGAN: Efficient Generative Adversarial Network-Based Method for Cyber-Intrusion Detection
Authors: Hongyu Chen, Li Jiang
Abstract:
Ubiquitous anomalies constantly endanger the security of our systems. They may cause irreversible damage to a system and leakage of privacy, so it is of vital importance to detect these anomalies promptly. Traditional supervised methods such as Decision Trees and Support Vector Machines (SVM) are used to classify normality and abnormality. However, in some cases abnormal states are far rarer than normal ones, which biases the decisions of these methods. Generative adversarial networks (GAN) have been proposed to handle such cases: with their strong generative ability, they only need to learn the distribution of normal states, and they identify abnormal states through the gap between a sample and the learned distribution. Nevertheless, existing GAN-based models are not suitable for processing data with discrete values, leading to immense degradation of detection performance. To cope with discrete features, in this paper we propose an efficient GAN-based model with a specifically designed loss function. Experimental results show that our model outperforms state-of-the-art models on discrete datasets and remarkably reduces the overhead.
Keywords: GAN, discrete feature, Wasserstein distance, multiple intermediate layers
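The paper's model and loss function are not given in the abstract, but the Wasserstein distance named in the keywords has a simple closed form for one-dimensional empirical distributions of equal size: the mean absolute difference of the sorted samples. A minimal sketch (hypothetical function names) of using it to score how far a batch deviates from a learned "normal" reference:

```python
def wasserstein_1d(xs, ys):
    # Wasserstein-1 distance between two equal-size 1-D samples:
    # in 1-D, optimal transport pairs the sorted values
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

def anomaly_score(batch, normal_ref):
    # larger score = batch is farther from the learned normal distribution
    return wasserstein_1d(batch, normal_ref)
```

A GAN-based detector replaces the fixed reference sample with a learned generator/critic, but the intuition (distance from normality as anomaly score) is the same.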
Procedia PDF Downloads 129
26417 Teacher Professional Development in Saudi Arabia through the Implementation of Universal Design for Learning
Authors: Majed A. Alsalem
Abstract:
Universal Design for Learning (UDL) is a common theme in education across the US and an influential model and framework that enables students in general, and particularly students who are deaf and hard of hearing (DHH), to access the general education curriculum. UDL helps teachers determine how information will be presented to students and how to keep students engaged; it also helps students express their understanding and knowledge to others. UDL relies on technology to promote students' interaction with content and their communication of knowledge. This study included 120 DHH students who received daily instruction based on UDL principles; it presents the results and discusses their implications for the integration of UDL in day-to-day practice as well as in the country's education policy. UDL is a Western concept that began and grew in the US and has only just begun to transfer to other countries such as Saudi Arabia, so it will be very important for researchers, practitioners, and educators to see how UDL is implemented in a new place with a different culture. UDL is a framework built to provide the multiple means of engagement, representation, and action and expression that should be part of curricula and lessons for all students. The purpose of this study is to investigate the variables associated with the implementation of UDL in Saudi Arabian schools and to identify the barriers that could prevent its implementation. The study therefore used a mixed-methods design employing both quantitative and qualitative methods: more insight is gained by combining the two than by using a single method, and combining methods with different concepts and approaches enriches the database. Data were collected in two stages to ensure that they came from multiple sources, mitigating validity threats and establishing trustworthiness in the findings.
The rationale and significance of this study is that it is the first known research targeting UDL in Saudi Arabia; furthermore, it deals with UDL in depth to set the path for further studies in the Middle East. In terms of content, the study considers teachers' knowledge, skills, and concerns regarding implementation. It deals with effective instructional designs that have not been presented in any conferences, workshops, or teacher preparation and professional development programs in Saudi Arabia. Specifically, Saudi Arabian schools are challenged to design inclusive schools and practices and to support all students' academic skills development. In total, 336 teachers of DHH students participated in stage one. The results of the intervention indicated significant differences in teachers' understanding and levels of concern before and after the training sessions. Teachers indicated interest in knowing more about UDL and adopting it in their practice; they reported that UDL has benefits that will enhance their performance in supporting student learning.
Keywords: deaf and hard of hearing, professional development, Saudi Arabia, universal design for learning
Procedia PDF Downloads 432
26416 A Comparison of Image Data Representations for Local Stereo Matching
Authors: André Smith, Amr Abdel-Dayem
Abstract:
The stereo matching problem, though it has been studied for several decades, continues to be an active area of research. Its goal is to find correspondences between elements in a pair of stereoscopic images; with these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function used to evaluate the likelihood of a particular match between points in each image. While at its core the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of common image data representations at reducing the cost of the correct correspondence relative to other possible matches.
Keywords: colour data, local stereo matching, stereo correspondence, disparity map
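As a concrete example of the kind of cost function discussed, here is a minimal sum-of-absolute-differences (SAD) matching cost over single-channel intensity rows; the paper compares richer colour representations, which would replace the scalar pixel difference with a per-channel one. Function names are illustrative.

```python
def sad_cost(left_row, right_row, x, d, window=1):
    # sum of absolute differences between a small patch around pixel x in
    # the left scanline and the patch shifted left by disparity d in the right
    cost = 0
    for k in range(-window, window + 1):
        xl, xr = x + k, x - d + k
        if 0 <= xl < len(left_row) and 0 <= xr < len(right_row):
            cost += abs(left_row[xl] - right_row[xr])
    return cost

def best_disparity(left_row, right_row, x, max_d):
    # winner-takes-all: pick the disparity with the lowest matching cost
    return min(range(max_d + 1), key=lambda d: sad_cost(left_row, right_row, x, d))
```

A good data representation is one for which the cost of the correct disparity is well separated from the costs of all competing candidates, which is exactly the criterion the paper evaluates.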
Procedia PDF Downloads 370
26415 Effects of Transit Fare Discount Programs on Passenger Volumes and Transferring Behaviors
Authors: Guan-Ying Chen, Han-Tsung Liou, Shou-Ren Hu
Abstract:
To address traffic congestion problems and encourage the use of public transportation systems in the Taipei metropolitan area, the Taipei City Government and the New Taipei City Government implemented a monthly ticket policy on April 16, 2018. This policy offers unlimited rides on the Taipei MRT, Taipei City Bus, New Taipei City Bus, Danhai Light Rail, and Public Bike (YouBike) on a monthly basis. Additionally, both city governments replaced the smart card discount policy with a new frequent flyer discount program (referred to as the loyal customer program) on February 1, 2020, introducing a differential pricing policy. Specifically, the more frequently the Taipei MRT system is used, the greater the discounts users receive. To analyze the impact of the Taipei public transport monthly ticket policy and the frequent user discount program on the passenger volume of the Taipei MRT system and the transferring behaviors of MRT users, this study conducts a trip-chain analysis using transaction data from Taipei MRT smart cards between September 2017 and December 2020. To achieve these objectives, the study employs four indicators: 1) number of passengers, 2) average number of rides, 3) average trip distance, and 4) instances of multiple consecutive rides. The study applies the t-test and Mann-Kendall trend test to investigate whether the proposed indicators have changed over time due to the implementation of the discount policy. Furthermore, the study examines the travel behaviors of passengers who use monthly tickets. The empirical results of the study indicate that the implementation of the Taipei public transport monthly ticket policy has led to an increase in the average number of passengers and a reduction in the average trip distance. Moreover, there has been a significant increase in instances of multiple consecutive rides, attributable to the unlimited rides offered by the monthly tickets. 
The impact of the frequent user discount program on MRT passenger numbers is not as pronounced as that of the Taipei public transport monthly ticket policy, partly because the frequent user discount program applies only to the Taipei MRT system, and partly because passenger volume was greatly affected by the COVID-19 pandemic. The findings of this research can serve as a reference for Taipei MRT Corporation in formulating its fare strategy, and can also guide the Taipei and New Taipei City Governments in evaluating differential pricing policies for public transportation systems.
Keywords: frequent user discount program, mass rapid transit, monthly ticket, smart card
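The Mann-Kendall trend test applied to the study's indicators has a compact closed form; a minimal pure-Python sketch (without the tie correction to the variance) is:

```python
import math

def mann_kendall(x):
    # Mann-Kendall trend test: S statistic over all ordered pairs,
    # plus the normal-approximation z score (no tie correction here)
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

A |z| above 1.96 indicates a monotonic trend at the 5% level, which is the kind of before/after trend judgment the study makes for each indicator.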
Procedia PDF Downloads 83
26414 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design
Authors: Qing K. Zhu
Abstract:
Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates into one chip, following Moore's law. Designers encounter numerous report files during design iterations with timing and noise analysis tools. This paper presents our work using data mining techniques, combined with HTML tables, to extract and represent critical timing and noise data. Running speed is important when this data-mining tool is applied to real designs, so the software employs table look-up techniques, which performance testing showed give reasonable running speed. We added several advanced features for the tool's application to one industrial chip design.
Keywords: VLSI design, data mining, big data, HTML forms, web, VLSI, EDA, timing, noise
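The report formats of commercial timing tools are proprietary and not shown in the abstract; purely as an illustration of the extract-and-render idea, the sketch below pulls path/slack pairs out of a hypothetical plain-text report and emits them as an HTML table. The report layout and function name are assumptions.

```python
import html
import re

def report_to_html(report_text):
    # extract "path  slack" lines from a (hypothetical) timing report and
    # render them as an HTML table for quick review in a browser
    rows = re.findall(r"^(\S+)\s+(-?\d+\.\d+)\s*$", report_text, re.M)
    cells = "".join(
        f"<tr><td>{html.escape(p)}</td><td>{s}</td></tr>" for p, s in rows
    )
    return f"<table><tr><th>Path</th><th>Slack (ns)</th></tr>{cells}</table>"
```

In the tool described, a precomputed look-up over such extracted records is what keeps report navigation fast across design iterations.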
Procedia PDF Downloads 254
26413 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations
Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe
Abstract:
In its simplest form, data quality can be defined as 'fitness for use'; it is a multi-dimensional concept. Emergency Departments (EDs) require information to treat patients and, at the same time, are the primary source of information regarding accidents, injuries, emergencies, etc., as well as the starting point of various patient registries, databases, and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL), the premier trauma care centre in Sri Lanka, by introducing an e-health solution. The study consisted of three components. A research study was conducted to assess the quality of data in relation to five selected dimensions of data quality: accuracy, completeness, timeliness, legibility, and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). Post-intervention assessment confirmed that all five dimensions of data quality had improved, with the most significant improvements in the accuracy and timeliness dimensions.
Keywords: electronic health records, electronic emergency department information system, emergency department, data quality
Procedia PDF Downloads 275
26412 The Mental Workload of Intensive Care Unit Nurses in Performing Human-Machine Tasks: A Cross-Sectional Survey
Authors: Yan Yan, Erhong Sun, Lin Peng, Xuchun Ye
Abstract:
Aims: The present study aimed to explore Intensive Care Unit (ICU) nurses' mental workload (MWL) and the factors associated with it when performing human-machine tasks. Background: A wide range of emerging technologies has penetrated the field of health care, and ICU nurses are facing a dramatic increase in human-machine nursing tasks. However, there is still a paucity of literature on the general MWL of ICU nurses performing human-machine tasks and on the associated influencing factors. Methods: A cross-sectional survey was employed. Data were collected from January to February 2021 from 9 tertiary hospitals in 6 provinces (Shanghai, Gansu, Guangdong, Liaoning, Shandong, and Hubei). Two-stage sampling was used to recruit eligible ICU nurses (n = 427). Data were collected with an electronic questionnaire comprising sociodemographic characteristics and measures of MWL, self-efficacy, system usability, and task difficulty. Univariate analysis, two-way analysis of variance (ANOVA), and a linear mixed model were used for data analysis. Results: Overall, the mental workload of ICU nurses performing human-machine tasks was medium (score 52.04 on a 0-100 scale). Among the typical human-machine nursing tasks selected, the MWL of ICU nurses completing first aid and life support tasks ('using a defibrillator to defibrillate' and 'use of a ventilator') was significantly higher than for other tasks (p < .001). ICU nurses' MWL in performing human-machine tasks was also associated with age (p = .001), professional title (p = .002), years of working in the ICU (p < .001), willingness to actively study emerging technology (p = .006), task difficulty (p < .001), and system usability (p < .001). Conclusion: The MWL of ICU nurses is at a moderate level in the context of a rapid increase in human-machine nursing tasks. However, there are significant differences in MWL when performing different types of human-machine tasks, and MWL can be influenced by a combination of factors; nursing managers need to develop intervention strategies in multiple ways. Implications for practice: Multidimensional approaches are required to perform human-machine tasks better, including enhancing nurses' willingness to learn emerging technologies actively, developing training strategies that vary with tasks, and identifying obstacles in the process of human-machine system interaction.
Keywords: mental workload, nurse, ICU, human-machine, tasks, cross-sectional study, linear mixed model, China
Procedia PDF Downloads 69
26411 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset
Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba
Abstract:
We present a descriptive analysis of lane-changing events on multi-lane roads. The data come from the Highway Drone Dataset (HighD), which consists of microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance; using the HighD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R to process these data, analysing the involvement and relationships of the variables describing the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the maneuver undertaken (overtaking or falling back).
Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process
Procedia PDF Downloads 261
26410 Determinants of Food Insecurity Among Smallholder Farming Households in Southwest Area of Nigeria
Authors: Adesomoju O. A., E. A. Onemolease, G. O. Igene
Abstract:
The study analyzed the determinants of food insecurity among smallholder farming households in southwestern Nigeria, focusing on Ondo and Osun States. A multi-stage sampling procedure was employed to gather data from 389 farming households (194 from Ondo State and 195 from Osun State) spread across 4 agricultural zones, 8 local government areas, and 24 communities. The data were analyzed using descriptive statistics, ordinal regression, and the Friedman test. Results revealed that the average age of the respondents was 47 years, with the majority being male (63.75%) and married (82.26%), and an average household size of 6. Most household heads were educated (94.09%), had engaged in farming for about 19 years, and did not belong to cooperatives (73.26%). Respondents derived income from both farming and non-farm activities, with average farm income of N216,066.8 per annum and non-farm income of about N360,000 per annum. Multiple technologies were adopted by respondents, such as the application of herbicides (77.63%), pesticides (73.26%), and fertilizers (66.58%). Using the FANTA Cornell model, food insecurity was found to be prevalent in the study area: the majority (61.44%) of households were severely food insecure and 35.73% were moderately food insecure, while only 1.80% were food secure and 1.03% mildly food insecure. The most significant constraints to food security among the farming households were the inability to access credit (mean rank = 8.78), poor storage infrastructure (8.57), inadequate capital (8.56), and the high cost of farm chemicals (8.35). Factors significantly related to food insecurity were age (b = -0.059), education (b = -0.376), family size (b = 0.197), adoption of technology (b = -0.198), farm income (b = -0.335), association membership (b = -0.999), engagement in non-farm activities (b = -1.538), and access to credit (b = -0.853). Linking farmers' groups to credit institutions and input suppliers was proposed.
Keywords: food insecurity, FANTA Cornell, Ondo, Osun, Nigeria, Southwest, livelihood
Procedia PDF Downloads 30
26409 Impact of Infrastructural Development on Socio-Economic Growth: An Empirical Investigation in India
Authors: Jonardan Koner
Abstract:
The study attempts to find the impact of infrastructural investment on state economic growth in India, and to determine the magnitude of its impact on an economic indicator, per-capita income (PCI), in Indian states. The study uses panel regression to measure this impact; the technique incorporates both the cross-section and time-series aspects of the dataset. To analyze differences in the impact of the explanatory variables on the explained variable across states, the study uses a fixed-effect panel regression model. We analyze annual time-series data from 1991 to 2010. The study reveals that infrastructural investment significantly explains the variation in the economic indicator, and concludes that such investment has a desirable impact on economic development, with the impact differing across Indian states.
Keywords: infrastructural investment, multiple regression, panel regression techniques, economic development, fixed effect dummy variable model
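The fixed-effect (within) estimator the study relies on can be sketched for a single regressor: demean the dependent and explanatory variables within each state, then run pooled OLS on the demeaned data, which is algebraically equivalent to including a dummy variable per state (the LSDV model named in the keywords). The function name and toy data are hypothetical.

```python
from collections import defaultdict

def fixed_effect_slope(groups, x, y):
    # one-regressor fixed-effect (within) estimator: subtract each
    # group's mean from x and y, then OLS slope on the demeaned data
    gx, gy, gn = defaultdict(float), defaultdict(float), defaultdict(int)
    for g, xi, yi in zip(groups, x, y):
        gx[g] += xi; gy[g] += yi; gn[g] += 1
    num = den = 0.0
    for g, xi, yi in zip(groups, x, y):
        xd = xi - gx[g] / gn[g]
        yd = yi - gy[g] / gn[g]
        num += xd * yd
        den += xd * xd
    return num / den
```

The point of the within transformation is that each state's fixed intercept drops out, so the slope is identified even when states differ in level.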
Procedia PDF Downloads 371
26408 Partnering with Stakeholders to Secure Digitization of Water
Authors: Sindhu Govardhan, Kenneth G. Crowther
Abstract:
Modernisation of the water sector is leading to increased connectivity and to the integration of emerging technologies with traditional ones, which creates new security risks. The convergence of Information Technology (IT) with Operational Technology (OT) results in solutions that are spread across larger geographic areas, increasingly consist of interconnected Industrial Internet of Things (IIoT) devices and software, rely on the integration of legacy and modern technologies, and use complex supply chain components, leading to complex architectures and communication paths. The result is that multiple parties collectively own and operate these emergent technologies, threat actors find new paths to exploit, and traditional cybersecurity controls are inadequate. Our approach is to explicitly identify and draw the data flows that cross trust boundaries between the owners and operators of the various aspects of these emerging and interconnected technologies. On these data flows we layer potential attack vectors to create a frame of reference for evaluating possible risks to connected technologies. Finally, we identify where existing controls, mitigations, and other remediations exist across industry partners (e.g., suppliers, product vendors, integrators, water utilities, and regulators). From these, we are able to understand potential gaps in security, the roles in the supply chain most likely to effectively remediate those gaps, and test cases to evaluate and strengthen security across these partners. This informs a 'shared responsibility' solution that recognises that security is multi-layered and requires collaboration to be successful. This shared-responsibility security framework improves visibility, understanding, and control across the entire supply chain, particularly for water utilities that are accountable for safe and continuous operations.
Keywords: cyber security, shared responsibility, IIOT, threat modelling
Procedia PDF Downloads 77
26407 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photon Beams in Varian Linear Accelerator
Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain
Abstract:
Objective: The main purpose of this study is to compare the percent depth dose (PDD) curves and the in-plane and cross-plane profiles of the Varian golden beam data with measured data for 6 and 18 MV photons, for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires an extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering PDDs, profiles, and dose-rate tables for open and wedged fields into the treatment planning system, enabling calculation of monitor units (MUs) and dose distributions. Varian offers a generic set of beam data as reference data, which is, however, not recommended for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate whether the generic data are reliable enough for clinical purposes. Methods and Materials: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a Semiflex ionization chamber and MEPHYSTO software. The online available Varian golden beam data were compared with the measured data to evaluate their accuracy for commissioning the Eclipse treatment planning system. Results: The deviation between measured and golden beam data was at most 2%. For PDDs, the deviation increases at deeper depths; similarly, for profiles, the deviation increases at larger field sizes and greater depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance, and the golden beam data can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
Keywords: percent depth dose, flatness, symmetry, golden beam data
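The 2% comparison criterion reduces to a point-by-point percent deviation against the reference curve; a minimal sketch (hypothetical function names, deviation taken relative to the reference value) is:

```python
def percent_deviation(measured, reference):
    # point-by-point percent deviation of a measured curve (e.g. a PDD)
    # from reference (golden beam) data, relative to the reference value
    return [100.0 * (m - r) / r for m, r in zip(measured, reference)]

def within_tolerance(measured, reference, tol=2.0):
    # True if every sampled point agrees with the reference within tol percent
    return all(abs(d) <= tol for d in percent_deviation(measured, reference))
```

Clinical comparison protocols typically use richer criteria (e.g. distance-to-agreement near steep gradients), so this captures only the flat-region check described in the abstract.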
Procedia PDF Downloads 489
26406 Robust Barcode Detection with Synthetic-to-Real Data Augmentation
Authors: Xiaoyan Dai, Hsieh Yisan
Abstract:
Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes deep-learning-based barcode detection using synthetic-to-real data augmentation. We first augment the barcodes themselves; we then augment the images containing the barcodes, generating a large variety of data close to actual shooting environments. Comparisons with previous works and evaluations on our original data show that this approach achieves state-of-the-art performance on various real images. In addition, the system uses hybrid resolution for the barcode 'scan' and is applicable to real-time applications.
Keywords: barcode detection, data augmentation, deep learning, image-based processing
Procedia PDF Downloads 169
26405 Impact of Flooding on Food Calorie Intake and Health Outcomes among Small Holder Farm Households in Koton Karfe Local Government Area of Kogi State, Nigeria
Authors: Cornelius Michael Ekenta, Aderonke Bashirat Mohammed, Sefi Ahmed
Abstract:
The research examined the impact of flooding on food calorie intake and health challenges among smallholder farm households in Koton Karfe Local Government Area of Kogi State, Nigeria. Purposive and random sampling techniques were used to select 130 farm households in selected villages in the area. Primary data were generated through the administration of a well-structured questionnaire. Data were analyzed with descriptive statistics, a Double Difference Estimator (DDE), a calorie intake estimation function, t-tests, and multiple regression. The results show that farm households lost an average of 132,950 kg of selected crops, amounting to about N20m ($56,542) of lost income. Daily food calorie intake showed an average loss of 715.18 kcal, indicating a significant difference in calorie intake before and after flooding (t = 2.0629) at the 5% probability level. Furthermore, the health challenges most prevalent during flooding were malaria fever, typhoid fever, cholera, and dysentery. The determinants of daily calorie intake were age, household size, income level, flooding, health challenges, and food price. The study concluded that flooding had negative impacts on crop output and income, daily food calorie intake, and the health of farm households in the study area. It was recommended that the State Government make adequate and proper arrangements to relocate residents of the area upon warnings of possible flooding by the National Meteorological Centre and, through the State Emergency Management Agency (SEMA), provide relief items to residents to cushion the effects of the flooding.
Keywords: calorie, cholera, flooding, health challenges, impact
Procedia PDF Downloads 145
26404 The Use of Social Media in the Recruitment Process as HR Strategy
Authors: Seema Sant
Abstract:
In the 21st century, where four generations are in the workforce together, it is crucial for organizations to build a talent management strategy, as tech-savvy Gen Y has entered the workforce. They are more connected to each other than ever through internet-enabled social media networks, and social media has become important in today's world: the number of users of social media sites has multiplied. From sharing opinions about a brand or product, to researching a company before going for an interview, to forming an impression of a company's culture, to following a company's updates out of sheer interest or for job vacancies, today's workforce is constantly in touch with social networks. The corporate world has rightly realized their potential for business purposes. Companies now use social media for marketing, advertising, consumer surveys, etc. HR professionals use it for networking and connecting to the talent pool through talent communities. Social recruiting is the process of sourcing or hiring candidates through the use of social sites such as LinkedIn, Facebook, and Twitter, which provide an array of information about potential employees. This study is an exploratory investigation of the role of social networking sites in recruitment. The primary aim is to analyze the factors that can enhance the recruitment channel used by recruiters, with specific reference to IT organizations in Mumbai, India. In particular, the aim is to identify how and why companies use social media to attract and screen applicants during their recruitment processes. The study also examines the advantages and limitations of recruitment through social media for employers; this is done by literature review. Further, the paper examines the recruiter's impact and the various opportunities created by technology; to analyze and examine these factors, both primary and secondary data were collected.
The primary data were gathered from five HR managers working in five top IT organizations in Mumbai and from 100 HR consultants, i.e., recruiters. The data were collected by conducting a survey using a closed-ended questionnaire. A comprehensive analysis of the study is depicted through graphs and figures. From the analysis, it was observed that there exists a positive relationship between the level of employees recruited through social media and their organizational commitment. Finally, the findings show that companies, i.e., recruiters, are currently using social media in recruitment, but perhaps not as effectively as they could. The paper gives recommendations and conditions for success that can help employers make the most of social media in recruitment.
Keywords: recruitment, social media, social sites, workforce
Procedia PDF Downloads 179
26403 Intentional Relationship Building: Stem Faculty Perceptions of Culturally Responsive Mentoring
Authors: Niesha Douglas, Lisa Merriweather, Cathy Howell, Anna Sancyzk
Abstract:
Many studies explain that mentoring in an academic setting contributes to student success and retention. However, in the United States, where the population is diverse and comprises multiple ethnic groups, mentoring has become too generalized and fails to offer a unique, individualized experience for underrepresented minorities (URMs). The purpose of this paper is to describe the findings of an ongoing qualitative study that investigates the relationships between STEM doctoral faculty and URM students. Several faculty members from three different predominantly white institutions (PWIs) in the Southeastern region of the United States were interviewed and engaged in open dialogue about their experiences with mentoring. Data collection included semi-structured interviews that took place in the classroom (pre-COVID-19) as well as virtually. The theoretical framework draws on Critical Race Theory and how cultural and social constructs interfere with effective mentoring for URM doctoral STEM students. The findings of this study suggest that, even though the faculty had several years of experience mentoring students, there were gaps in understanding the needs of URM students and in recognizing that mentoring is a unique relationship that should be tailored to each student rather than fit into one mold.
Keywords: culture, critical race theory, mentoring, STEM
Procedia PDF Downloads 198
26402 Analysis of Delivery of Quad Play Services
Authors: Rahul Malhotra, Anurag Sharma
Abstract:
Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the technologies that has emerged in recent years is the Passive Optical Network. This paper demonstrates the simultaneous delivery of triple play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.
Keywords: FTTH, quad play, play service, access networks, data rate
Procedia PDF Downloads 415
26401 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network
Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson
Abstract:
The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysis of the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come with a cost. Little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification to take manufacturing digital data from various sources to determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision making on which dataset should be processed at the ‘edge’ and what to send to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0
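A fast-and-frugal heuristic of the kind mentioned above checks one cue at a time and stops at the first decisive one. The sketch below is a hypothetical example with invented cues and thresholds, not the framework's actual classification rules.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    size_mb: float
    latency_critical: bool
    needs_heavy_compute: bool

def route(d: Dataset) -> str:
    """A fast-and-frugal tree: each cue is examined once, in a fixed order,
    and the first decisive cue fixes the destination. All cues and the
    50 MB threshold are illustrative assumptions."""
    if d.latency_critical:        # cue 1: time-critical -> keep at the edge
        return "edge"
    if d.needs_heavy_compute:     # cue 2: heavy analytics -> send to cloud
        return "cloud"
    if d.size_mb > 50:            # cue 3: large payloads are costly to ship
        return "edge"
    return "cloud"                # default: process/archive remotely

print(route(Dataset(5, True, False)))    # edge
print(route(Dataset(500, False, True)))  # cloud
```

The appeal of such trees is that routing each dataset costs at most a handful of comparisons, which matters when the decision itself must run on constrained edge hardware.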
Procedia PDF Downloads 182
26400 Denoising Transient Electromagnetic Data
Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen
Abstract:
Transient electromagnetic (TEM) data play a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize the wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves data quality, notably suppressing severe noise at late times. The resulting denoised data exhibit a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform
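The two denoising steps described (signal averaging across repeated acquisitions, then wavelet thresholding) can be sketched on a synthetic transient as below. The single-level Haar transform and the threshold value are simplifying assumptions, since the abstract does not specify the wavelet basis or decomposition depth.

```python
import numpy as np

rng = np.random.default_rng(1)

def haar_denoise(x, thresh):
    """Single-level Haar wavelet soft-thresholding — a minimal stand-in
    for the study's wavelet-transform step."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)                 # approximation coeffs
    d = (x[0::2] - x[1::2]) / np.sqrt(2)                 # detail coeffs
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0)   # soft threshold
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)                       # inverse transform
    y[1::2] = (a - d) / np.sqrt(2)
    return y

# Synthetic decaying TEM-like transient, acquired 64 times with noise
t = np.linspace(0, 1, 256)
clean = np.exp(-5 * t)
stack = clean + rng.normal(0, 0.2, (64, t.size))

averaged = stack.mean(axis=0)            # step 1: signal averaging
denoised = haar_denoise(averaged, 0.05)  # step 2: wavelet thresholding

# Averaging plus thresholding should land closer to the clean transient
err_raw = np.abs(stack[0] - clean).mean()
err_out = np.abs(denoised - clean).mean()
print(err_out < err_raw)
```

In a real workflow, multi-level decompositions and per-mode thresholds would be tuned to the late-time noise characteristics of each acquisition mode.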
Procedia PDF Downloads 85
26399 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization
Authors: Hironori Karachi, Haruka Yamashita
Abstract:
Recently, quick response (QR) code systems have become popular. Many companies have introduced QR code payment services, and these services compete with each other to increase their number of users. To increase the number of users, we should grasp the differences in demographic information, usage information, and user value between services. In this study, we analyze real-world data provided by Nomura Research Institute, including demographic data on users and information on users' usage of two services: LINE Pay and PayPay. Non-negative Matrix Factorization (NMF) is widely used for analyzing such data and interpreting its features; however, the target data contain missing values. We therefore use EM-algorithm NMF (EMNMF), which completes unknown values, to understand the features of the given data presented in matrix form. Moreover, for comparing the NMF results of two matrices, Discriminant NMF (DNMF) shows the differences in user features between the two matrices. In this study, we combine EMNMF and DNMF to analyze the target data. As an interpretation, we show the differences in user features between LINE Pay and PayPay.
Keywords: data science, non-negative matrix factorization, missing data, quality of services
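A minimal sketch of NMF in the presence of missing entries, using mask-weighted multiplicative updates as a simplified stand-in for the EMNMF procedure; the toy matrix, rank, and iteration count are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def masked_nmf(V, M, rank, iters=300):
    """Multiplicative-update NMF that ignores missing entries via the
    mask M (1 = observed, 0 = missing)."""
    n, m = V.shape
    W = rng.random((n, rank)) + 1e-3
    H = rng.random((rank, m)) + 1e-3
    Vm = V * M
    for _ in range(iters):
        H *= (W.T @ Vm) / (W.T @ (M * (W @ H)) + 1e-9)
        W *= (Vm @ H.T) / ((M * (W @ H)) @ H.T + 1e-9)
    return W, H

# Toy users-by-features matrix of exact rank 2, with ~20% entries missing
true_W = rng.random((30, 2))
true_H = rng.random((2, 8))
V = true_W @ true_H
M = (rng.random(V.shape) > 0.2).astype(float)

W, H = masked_nmf(V, M, rank=2)
err = np.abs((W @ H - V) * M).mean()  # reconstruction error on observed cells
```

The learned factors W (user loadings) and H (feature basis) would then be compared across services, which is where the discriminant variant (DNMF) comes in.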
Procedia PDF Downloads 131
26398 Supply Chain Optimisation through Geographical Network Modeling
Authors: Cyrillus Prabandana
Abstract:
Supply chain optimisation must take multiple factors into account as considerations or constraints. These factors include, but are not limited to, demand forecasting, raw material fulfilment, production capacity, inventory level, facility locations, transportation means, and manpower availability. By knowing all the manageable factors involved and modelling uncertainty with pre-defined percentage factors, an integrated supply chain model can be developed to manage various business scenarios. This paper analyses the use of a geographical point of view to develop an integrated supply chain network model that optimises the distribution of finished products according to forecasted demand and available supply. The supply chain optimisation model shows that a small change in one supply chain constraint can have a large impact on other constraints, and the new information from the model should be able to support the decision-making process. The model focused on three areas: raw material fulfilment, production capacity, and finished product transportation. To validate the model's suitability, it was implemented in a project aimed at optimising the concrete supply chain at a mining location. The high level of operational complexity and the involvement of multiple stakeholders in the concrete supply chain are believed to be sufficient to illustrate the larger scope. The implementation of this geographical supply chain network modelling resulted in an optimised concrete supply chain from raw material fulfilment to the distribution of finished products to each customer, as indicated by a lower percentage of missed concrete order fulfilments.
Keywords: decision making, geographical supply chain modeling, supply chain optimisation, supply chain
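A toy geographic sketch of the idea: compute great-circle distances between facilities and customers, then assign each customer's demand to the nearest plant with remaining capacity. This greedy heuristic and all coordinates, capacities, and demands are invented for illustration; the paper's integrated model would optimise across many more constraints simultaneously.

```python
import math

# Hypothetical plants (lat, lon, capacity m3) and customers (lat, lon, demand m3)
plants = {"plant_A": (-2.10, 115.30, 400), "plant_B": (-2.45, 115.90, 300)}
customers = {"site_1": (-2.20, 115.40, 250), "site_2": (-2.40, 115.70, 200),
             "site_3": (-2.15, 115.85, 150)}

def km(p, q):
    """Great-circle (haversine) distance between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(a))

# Greedy assignment: serve each demand from the nearest plant with capacity left
capacity = {name: cap for name, (_, _, cap) in plants.items()}
plan = {}
for cname, (clat, clon, demand) in customers.items():
    for pname in sorted(plants, key=lambda p: km(plants[p][:2], (clat, clon))):
        take = min(demand, capacity[pname])
        if take > 0:
            plan.setdefault(cname, []).append((pname, take))
            capacity[pname] -= take
            demand -= take
        if demand == 0:
            break

print(plan)
```

Missed order fulfilment in this sketch corresponds to any demand left unserved after the loop, which is the metric the project used to judge the optimised network.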
Procedia PDF Downloads 346
26397 Developing Guidelines for Public Health Nurse Data Management and Use in Public Health Emergencies
Authors: Margaret S. Wright
Abstract:
Background/Significance: During many recent public health emergencies and disasters, public health nursing data have been missing or delayed, potentially impacting decision-making and response. Data used as evidence for decision-making in response, planning, and mitigation have been erratic and slow to arrive, decreasing the ability to respond. Methodology: Applying best practices in data management and data use in public health settings, guided by the concepts outlined in 'Disaster Standards of Care' models, leads to the development of recommendations for a model of best practices in data management and use by public health nurses in public health disasters and emergencies. As the 'patient' in public health disasters and emergencies is the community (local, regional, or national), guidelines for patient documentation are incorporated into the recommendations. Findings: Using this model, public health nurses could better plan how to prepare for, respond to, and mitigate disasters in their communities, and better participate in decision-making in all three phases, bringing public health nursing data to the discussion as part of the evidence base for decision-making.
Keywords: data management, decision making, disaster planning documentation, public health nursing
Procedia PDF Downloads 222
26396 Determinants of Success of University Industry Collaboration in the Science Academic Units at Makerere University
Authors: Mukisa Simon Peter Turker, Etomaru Irene
Abstract:
This study examined factors determining the success of University-Industry Collaboration (UIC) in the science academic units (SAUs) at Makerere University. It was prompted by concerns about weak linkages between industry and the academic units at Makerere University. The study examined institutional, relational, output, and framework factors determining the success of UIC in the SAUs. The study adopted a predictive cross-sectional survey design. Data were collected using a questionnaire survey of 172 academic staff from the six SAUs at Makerere University. Stratified, proportionate, and simple random sampling techniques were used to select the samples. The study used descriptive statistics and linear multiple regression analysis to analyze the data. The findings reveal a coefficient of determination (R-square) of 0.403 at a significance level of 0.000, indicating that the four factors explained 40.3% of the variance in UIC success, with a standard error of estimate of 0.60188. The strength of association between institutional factors, relational factors, output factors, and framework factors, taking into consideration all interactions among the study variables, was 64% (R = 0.635). Institutional, relational, output, and framework factors accounted for 34% of the variance in the level of UIC success (adjusted R2 = 0.338); the remaining 66% of the variance is explained by other factors. The standardized coefficient statistics revealed that relational factors (β = 0.454, t = 5.247, p = 0.000) and framework factors (β = 0.311, t = 3.770, p = 0.000) are the only statistically significant determinants of the success of UIC in the SAUs at Makerere University. Output factors (β = 0.082, t = 1.096, p = 0.275) and institutional factors (β = 0.023, t = 0.292, p = 0.771) turned out to be statistically insignificant.
The study concludes that relational factors and framework factors positively and significantly determine the success of UIC, while output factors and institutional factors are not statistically significant determinants of UIC in the SAUs at Makerere University. The study recommends strategies to consolidate relational and framework factors to enhance UIC at Makerere University, as well as further research on the effects of institutional and output factors on the success of UIC in universities.
Keywords: university-industry collaboration, output factors, relational factors, framework factors, institutional factors
Procedia PDF Downloads 61
26395 An Embarrassingly Simple Semi-supervised Approach to Increase Recall in Online Shopping Domain to Match Structured Data with Unstructured Data
Authors: Sachin Nagargoje
Abstract:
Completely labeled data is often difficult to obtain in practical scenarios, and even when one manages to obtain it, its quality is always in question. In the shopping vertical, the input data are offers, supplied by advertisers with or without good-quality information. In this paper, the author investigated the possibility of using a very simple semi-supervised learning approach to increase the recall of unhealthy offers (those with badly written offer titles or partial product details) in the shopping vertical domain. The semi-supervised learning method improved recall in the smartphone category by 30% in A/B testing on 10% of traffic and increased the year-over-year (YoY) number of impressions per month by 33% in production. This also produced a significant increase in revenue, which cannot be publicly disclosed.
Keywords: semi-supervised learning, clustering, recall, coverage
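One very simple semi-supervised scheme of the kind described is self-training with pseudo-labels: fit on the few labeled examples, confidently label some unlabeled ones, and refit. The sketch below uses a toy two-cluster dataset and a nearest-centroid classifier as stand-ins for the paper's (unspecified) offer features and model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for offer quality: two Gaussian clusters (0=healthy, 1=unhealthy)
X = np.vstack([rng.normal(0.0, 0.5, (200, 2)), rng.normal(2.0, 0.5, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

# Only ~5% of the data is labeled, mimicking scarce offer annotations
labeled = rng.random(400) < 0.05
Xl, yl = X[labeled], y[labeled]
Xu = X[~labeled]

def centroids(Xs, ys):
    return np.array([Xs[ys == c].mean(axis=0) for c in (0, 1)])

def predict(C, Xs):
    return np.argmin(((Xs[:, None, :] - C[None]) ** 2).sum(-1), axis=1)

# Self-training loop: pseudo-label confident unlabeled points, then refit
C = centroids(Xl, yl)
for _ in range(3):
    dists = np.sqrt(((Xu[:, None, :] - C[None]) ** 2).sum(-1))
    pseudo = np.argmin(dists, axis=1)
    confident = np.abs(dists[:, 0] - dists[:, 1]) > 1.0  # margin threshold
    C = centroids(np.vstack([Xl, Xu[confident]]),
                  np.concatenate([yl, pseudo[confident]]))

acc = float((predict(C, X) == y).mean())
```

The margin threshold controls the classic self-training trade-off: looser thresholds raise recall of the minority class faster but risk reinforcing early mistakes.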
Procedia PDF Downloads 122
26394 Genodata: The Human Genome Variation Using BigData
Authors: Surabhi Maiti, Prajakta Tamhankar, Prachi Uttam Mehta
Abstract:
Since the completion of the Human Genome Project, there has been an unparalleled escalation in the sequencing of genomic data. The project was a first major leap in the field of medical research, especially in genomics, and it won accolades by using the concept of Big Data, which had earlier been used extensively to generate business value. Big Data makes use of data sets that generally take the form of files of terabytes, petabytes, or exabytes in size; such data sets were traditionally managed using spreadsheets and RDBMSs. The voluminous data made the process tedious and time-consuming, and hence a stronger framework called Hadoop was introduced in the field of genetic sciences to make data processing faster and more efficient. This paper focuses on using SPARK, which is gaining momentum with the advancement of Big Data technologies. Cloud storage is an effective medium for storing the large data sets generated by genetic research and the result sets produced by SPARK analysis.
Keywords: human genome project, Bigdata, genomic data, SPARK, cloud storage, Hadoop
Procedia PDF Downloads 259