Search results for: Errors and Mistakes
261 Bayesian System and Copula for Event Detection and Summarization of Soccer Videos
Authors: Dhanuja S. Patil, Sanjay B. Waykar
Abstract:
Event detection is one of the most fundamental components for many kinds of applications built on video data, and it has recently gained extensive interest from practitioners and academics in various fields. While video event detection has been the subject of broad research efforts, considerably fewer existing approaches have considered multi-modal data and efficiency-related issues. During soccer matches, many doubtful situations arise that cannot easily be judged by the referee committee. A system that objectively checks image sequences would prevent wrong interpretations caused by human error or by the high velocity of events. Bayesian networks provide a framework for dealing with this uncertainty using an intuitive graphical structure together with probability calculus. We propose an efficient framework for the analysis and summarization of soccer videos using object-based features. The proposed work utilizes the t-cherry junction tree, a very recent development in probabilistic graphical models, to create a compact representation and a good approximation of an otherwise intractable model. This approach has several advantages: first, the t-cherry tree gives the best approximation within the class of junction trees; second, the construction of a t-cherry junction tree can be largely parallelized; and finally, inference can be performed using distributed computation. Experimental results demonstrate the effectiveness, efficiency, and robustness of the proposed work over a comprehensive data set consisting of several soccer videos captured at different venues.
Keywords: summarization, detection, Bayesian network, t-cherry tree
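The Bayesian reasoning the abstract describes can be illustrated with a deliberately tiny sketch: inferring a "goal" event from two noisy observations by enumeration. All variables and probabilities here are invented for illustration; the paper's actual model uses object-based features and a t-cherry junction tree rather than this naive two-observation network.

```python
# Toy illustration of Bayesian event inference (hypothetical variables and
# probabilities; NOT the paper's model). Two noisy cues, assumed conditionally
# independent given the event, update a prior on "goal".

P_goal = 0.05                          # prior P(goal)
P_noise = {True: 0.9, False: 0.2}      # P(crowd_noise | goal)
P_replay = {True: 0.8, False: 0.1}     # P(replay_shown | goal)

def posterior_goal(noise: bool, replay: bool) -> float:
    """P(goal | observations), computed by full enumeration."""
    def joint(goal: bool) -> float:
        prior = P_goal if goal else 1 - P_goal
        pn = P_noise[goal] if noise else 1 - P_noise[goal]
        pr = P_replay[goal] if replay else 1 - P_replay[goal]
        return prior * pn * pr
    num = joint(True)
    return num / (num + joint(False))
```

With both cues observed, the posterior rises well above the 5% prior; with neither, it drops below it. Junction-tree methods such as the t-cherry tree exist precisely because this brute-force enumeration does not scale to many variables.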
Procedia PDF Downloads 326
260 Numerical Modeling of Air Shock Wave Generated by Explosive Detonation and Dynamic Response of Structures
Authors: Michał Lidner, Zbigniew Szcześniak
Abstract:
The ability to estimate blast load overpressure properly plays an important role in the safety design of buildings. The issue of blast loading on structural elements has been explored for many years. However, in many literature reports the shock wave overpressure is approximated with a simplified triangular or exponential distribution in time, which introduces errors when comparing the real and numerical reactions of elements. Nonetheless, it is possible to approximate the real blast load overpressure as a function of time more closely. The paper presents a method for numerical analysis of the phenomenon of air shock wave propagation. It uses the Finite Volume Method and takes into account energy losses due to heat transfer, with respect to the adiabatic process rule. A system of three equations (conservation of mass, momentum, and energy) describes the flow of a volume of gaseous medium in an area remote from building compartments, which can inhibit the movement of gas. For validation, three cases of shock wave flow were analyzed: a free-field explosion, an explosion inside a rigid steel tube (the 1D case), and an explosion inside a rigid cube (the 3D case). The results of the numerical analysis were compared with literature reports; values of impulse, pressure, and its duration were studied. Overall, good convergence of numerical results with experiments was achieved, and the most important parameters were well reflected. Additionally, analyses of the dynamic response of one of the considered structural elements were made.
Keywords: adiabatic process, air shock wave, explosive, finite volume method
Procedia PDF Downloads 192
259 COVID-19’s Impact on the Use of Media, Educational Performance, and Learning in Children and Adolescents with ADHD Who Engaged in Virtual Learning
Authors: Christina Largent, Tazley Hobbs
Abstract:
Objective: A literature review was performed to examine the existing research on the COVID-19 lockdown as it relates to child and adolescent individuals with ADHD, media use, and impact on educational performance and learning. It was surmised that with the COVID-19 shutdown and transition to remote learning, a less structured learning environment and increased screen time, in addition to potential difficulty accessing school resources, would impair ADHD individuals' performance and learning, and that a resulting increase in the number of youths diagnosed and treated for ADHD would be expected. As of yet, there has been little to no published data on the incidence of ADHD as it relates to COVID-19, outside of reports from several nonprofit agencies such as CHADD (Children and Adults with Attention-Deficit/Hyperactivity Disorder), which reported an increased number of calls to its helpline; the New York-based Child Mind Institute, which reported an increased number of appointments to discuss medications; and research released by Athenahealth showing an increase in the number of patients receiving new diagnoses of ADHD and new prescriptions for ADHD medications. Methods: A literature search for articles published between 2020 and 2021 was performed in PubMed, Google Scholar, and PsycINFO. Search phrases and keywords included "covid, adhd, child, impact, remote learning, media, screen". Results: Studies primarily utilized parental reports, with very few from the perspective of the ADHD individuals themselves. Most findings thus far show that with the COVID-19 quarantine and transition to online learning, ADHD individuals experienced a decreased ability to keep focused or adhere to a daily routine, as well as increased inattention-related problems, such as careless mistakes or incomplete homework, which in turn translated into overall more difficulty with remote learning.
Compounding the problem, one study (evaluating just two different sites within the US) showed that school-based services for these individuals decreased with the shift to online learning. Increased screen time, television, social media, and gaming were noted amongst ADHD individuals. One study further differentiated the degree of digital media use, identifying individuals with "problematic" or "non-problematic" use. ADHD children with problematic digital media use suffered from more severe core symptoms of ADHD, negative emotions, executive function deficits, damage to the family environment, pressure from life events, and a lower motivation to learn. Conclusions and Future Considerations: Studies found that online learning was not only difficult for ADHD individuals but, together with greater use of digital media, was associated with worsening ADHD symptoms impairing schoolwork, in addition to secondary findings of worsening mood and behavior. Currently, data on the number of new ADHD cases, as well as on the prescription and usage of stimulants during COVID-19, has not been well documented or studied; such study would be well warranted out of concern for over-diagnosing or over-prescribing our youth. It would also be worth studying how reversible or long-lasting these negative impacts may be.
Keywords: COVID-19, remote learning, media use, ADHD, child, adolescent
Procedia PDF Downloads 124
258 Correction Factors for Soil-Structure Interaction Predicted by Simplified Models: Axisymmetric 3D Model versus Fully 3D Model
Authors: Fu Jia
Abstract:
The effects of soil-structure interaction (SSI) are often studied using axisymmetric three-dimensional (3D) models to avoid the high computational cost of the more realistic, fully 3D models, which require 2-3 orders of magnitude more computer time and storage. This paper analyzes the error and presents correction factors for system frequency, system damping, and peak amplitude of structural response computed by axisymmetric models embedded in a uniform or layered half-space. The results are compared with those for fully 3D rectangular foundations of different aspect ratios. Correction factors are presented for a range of model parameters, such as fixed-base frequency, structure mass, height and length-to-width ratio, foundation embedment, and soil-layer stiffness and thickness. It is shown that the errors are larger for stiffer, taller, and heavier structures, deeper foundations, and deeper soil layers. For example, for a stiff structure like the Millikan Library (NS response; length-to-width ratio 1), the error is 6.5% in system frequency, 49% in system damping, and 180% in peak amplitude. Analysis of a case study shows that the NEHRP-2015 provisions for reduction of base shear force due to SSI effects may be unsafe for some structures and need revision. The presented correction factor diagrams can be used in practical design and other applications.
Keywords: 3D soil-structure interaction, correction factors for axisymmetric models, length-to-width ratio, NEHRP-2015 provisions for reduction of base shear force, rectangular embedded foundations, SSI system frequency, SSI system damping
Procedia PDF Downloads 266
257 Forecasting Lake Malawi Water Level Fluctuations Using Stochastic Models
Authors: M. Mulumpwa, W. W. L. Jere, M. Lazaro, A. H. N. Mtethiwa
Abstract:
The study considered Seasonal Autoregressive Integrated Moving Average (SARIMA) processes to select an appropriate stochastic model for forecasting monthly Lake Malawi water levels for the period 1986 through 2015. The appropriate model was chosen based on SARIMA (p, d, q)(P, D, Q)S. The autocorrelation function (ACF), partial autocorrelation function (PACF), Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), Box-Ljung statistics, correlogram, and distribution of residual errors were estimated. SARIMA (1, 1, 0)(1, 1, 1)12 was selected to forecast the monthly Lake Malawi water levels from August 2015 to December 2021. The plotted time series showed that Lake Malawi water levels have been decreasing since 2010, though not as sharply as in 1995 through 1997. The forecast of Lake Malawi water levels through 2021 showed a mean of 474.47 m, ranging from 473.93 to 475.02 m at 80% and 90% confidence intervals, against registered means of 473.398 m in 1997 and 475.475 m in 1989, which were respectively the lowest and highest water levels in the lake since 1986. The forecast also showed that the water level of Lake Malawi will drop by 0.57 m compared with the mean water levels recorded in previous years. These results suggest that the Lake Malawi water level is not likely to go lower than that recorded in 1997. Therefore, utilisation and management of water-related activities and programs on the lake should provide room for such scenarios. The findings suggest a need to manage Lake Malawi jointly and prudently with other stakeholders, starting from the catchment area. This will reduce the impacts of anthropogenic activities on the lake's water quality, water level, and aquatic and adjacent terrestrial ecosystems, thereby ensuring its resilience to climate change impacts.
Keywords: forecasting, Lake Malawi, water levels, water level fluctuation, climate change, anthropogenic activities
Procedia PDF Downloads 230
256 The Quality Assessment of Seismic Reflection Survey Data Using Statistical Analysis: A Case Study of Fort Abbas Area, Cholistan Desert, Pakistan
Authors: U. Waqas, M. F. Ahmed, A. Mehmood, M. A. Rashid
Abstract:
In geophysical exploration surveys, the quality of acquired data holds significant importance before executing the data processing and interpretation phases. In this study, 2D seismic reflection survey data from the Fort Abbas area, Cholistan Desert, Pakistan, was taken as a test case in order to assess its quality on a statistical basis using the normalized root mean square error (NRMSE), Cronbach's alpha test (α), and null hypothesis tests (t-test and F-test). The analysis challenged the quality of the acquired data and highlighted significant errors in the acquired database. The study area is known to be flat, tectonically little affected, and rich in oil and gas reserves. However, subsurface 3D modeling and contouring using the acquired database revealed high degrees of structural complexity and intense folding. The NRMSE showed a high percentage of residuals between the estimated and predicted cases. The outcomes of hypothesis testing also demonstrated the bias and erratic nature of the acquired database, and a low estimated value of alpha (α) in Cronbach's alpha test confirmed its poor reliability. Such a low-quality database needs extensive static correction or, in some cases, reacquisition of the data, which is most of the time not feasible on economic grounds. The outcomes of this study could be used to assess the quality of large databases and could further serve as a guideline for establishing database quality assessment models to make much more informed decisions in the hydrocarbon exploration field.
Keywords: data quality, null hypothesis, seismic lines, seismic reflection survey
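The two main statistics named above can be sketched in a few lines. Note that NRMSE normalization conventions vary (by range, mean, or standard deviation); the range convention is assumed here, and the toy arrays are illustrative, not the survey data.

```python
import numpy as np

def nrmse(observed, predicted):
    """RMSE normalized by the observed range (one common convention)."""
    rmse = np.sqrt(np.mean((observed - predicted) ** 2))
    return rmse / (observed.max() - observed.min())

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_observations x n_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the total score
    return k / (k - 1) * (1 - item_vars / total_var)
```

A large NRMSE or a low alpha (commonly read against a rule-of-thumb threshold around 0.7) would flag the database as unreliable, which is the kind of screening the study performed before processing.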
Procedia PDF Downloads 165
255 Content Analysis of Video Translations: Examining the Linguistic and Thematic Approach by Translator Abdullah Khrief on the X Platform
Authors: Easa Almustanyir
Abstract:
This study investigates the linguistic and thematic approach of translator Abdullah Khrief in the context of video translations on the X platform. The sample comprises 15 videos from Khrief's account, covering diverse content categories like science, religion, social issues, personal experiences, lifestyle, and culture. The analysis focuses on two aspects: language usage and thematic representation. Regarding language, the study examines the prevalence of English while considering the inclusion of French and German content, highlighting Khrief's multilingual versatility and ability to navigate cultural nuances. Thematically, the study explores the diverse range of topics covered, encompassing scientific, religious, social, and personal narratives, underscoring Khrief's broad subject matter expertise and commitment to knowledge dissemination. The study employs a mixed-methods approach, combining quantitative data analysis with qualitative content analysis. Statistical data on video languages, presenter genders, and content categories are analyzed, and a thorough content analysis assesses translation accuracy, cultural appropriateness, and overall quality. Preliminary findings indicate a high level of professionalism and expertise in Khrief's translations. The absence of errors across the diverse range of videos establishes his credibility and trustworthiness. Furthermore, the accurate representation of cultural nuances and sensitive topics highlights Khrief's cultural sensitivity and commitment to preserving intended meanings and emotional resonance.
Keywords: audiovisual translation, linguistic versatility, thematic diversity, cultural sensitivity, content analysis, mixed-methods approach
Procedia PDF Downloads 172
254 Assessment of the Impact of the Application of Kinesiology Taping on Joint Position Sense in Knee Joint
Authors: Anna Słupik, Patryk Wąsowski, Anna Mosiołek, Dariusz Białoszewski
Abstract:
Introduction: Kinesiology Taping is one of the most popular techniques used for treatment and for supporting physiological processes in sports medicine and physiotherapy. It is often used by athletes to support the sensorimotor skills of the lower limbs. The aim of the study was to determine the effect of a muscle Kinesiology Taping application on joint position sense during active knee motion. Material and methods: The study involved 50 healthy people between 18 and 30 years of age, 30 men and 20 women (mean age 23.24 years). The participants were divided into two groups. The study group received a Kinesiology Taping application (muscle application, type Y, for the quadriceps femoris muscle), while the remaining people received an application made of plaster (placebo group). Testing was performed prior to applying the tape, with the application in place (after 30 minutes), 24 hours after wearing, and after removing the tape. At each stage, joint position sense was evaluated using the Error of Active Reproduction of Joint Position. Results: The study revealed no significant differences in measurement between the study group and the placebo group (p > 0.05). There were no significant differences over time across all four measurements in the Kinesiology Taping group, as confirmed by pairwise comparisons (p > 0.05). The placebo group likewise showed no significant differences over time (p > 0.05). There was no significant difference between the errors committed in the direction of flexion and extension. Conclusions: 1. The muscle Kinesiology Taping application had no significant effect on knee joint proprioception; its use to improve sensorimotor performance therefore seems unjustified. 2. The lack of differences between the Kinesiology Taping and placebo applications indicates that the clinical effect of tape stretch is minimal or absent. 3. The results provide a basis for the continuation of prospective, randomized trials with larger study groups.
Keywords: joint position sense, kinesiology taping, knee joint, proprioception
Procedia PDF Downloads 403
253 Digital Health During a Pandemic: Critical Analysis of the COVID-19 Contact Tracing Apps
Authors: Mohanad Elemary, Imose Itua, Rajeswari B. Matam
Abstract:
Virologists and public health experts have been predicting potential pandemics from coronaviruses for decades. The viruses which caused the SARS and MERS epidemics and the Nipah virus led to many lost lives, but still, the COVID-19 pandemic caused by the SARS-CoV-2 virus surprised many scientific communities, experts, and governments with its ease of transmission and its pathogenicity. Governments of various countries reacted by locking down entire populations to their homes to combat the devastation caused by the virus, which led to a loss of livelihood and economic hardship for many individuals and organizations. To revive national economies and support their citizens in resuming their lives, governments focused on the development and use of contact tracing apps as a digital way to track and trace exposure. Google and Apple introduced the Exposure Notification System (ENS) framework. Independent organizations and countries also developed different frameworks for contact tracing apps. The efficiency, popularity, and adoption rates of these various apps have differed across countries. In this paper, we present a critical analysis of the different contact tracing apps with respect to their efficiency, adoption rate, and general perception, and of the governmental strategies and policies which led to the development of the applications. Among European countries, each followed an individualistic approach to the same problem, resulting in different realizations of a similarly functioning application with differing results of use and acceptance. The study conducted an extensive review of existing literature, policies, and reports across multiple disciplines, from which a framework was developed and then validated through interviews with six key stakeholders in the field, including founders and executives in digital health startups and corporates as well as experts from international organizations like the World Health Organization.
A framework of best practices and tactics is the result of this research. The framework addresses three main questions regarding contact tracing apps: how to develop them, how to deploy them, and how to regulate them. The findings are based on the best practices applied by governments across multiple countries, the mistakes they made, and the best practices applied in similar situations in the business world. The findings include multiple strategies for the development milestone, regarding establishing frameworks for cooperation with the private sector and how to design the features and user experience of the app so that it is transparent, effective, and rapidly adaptable. For the deployment section, several tactics are discussed regarding communication messages, marketing campaigns, persuasive psychology, and initial deployment scale strategies. The paper also discusses the data privacy dilemma and how to build a more sustainable system of health-related data processing and utilization. This is done through principles-based regulations specific to health data that allow their use for the public good. This framework offers insights into strategies and tactics that could be implemented as protocols for future public health crises and emergencies, whether global or regional.
Keywords: contact tracing apps, COVID-19, digital health applications, exposure notification system
Procedia PDF Downloads 135
252 Development of E-Tendering Models for Nigerian Public Procuring Entities
Authors: Bello Abdullahi, Kabir Bala, Yahaya M. Ibrahim, Ahmed D. Ibrahim
Abstract:
Public sector tendering has traditionally been conducted using manual paper-based processes which are known to be inefficient, less transparent, and more prone to manipulation and error. However, the advent of the Internet and its associated technologies has led to the development of numerous e-Tendering systems that address many of the problems associated with the manual paper-based tendering system. Currently, in Nigeria, public tendering processes are largely conducted using a manual paper-based system that is bedevilled by a number of problems, such as inordinate delays, inefficiencies, manipulation of the tender evaluation process, corruption, and lack of transparency and competition, among others. These problems can be addressed through the adoption of existing web-based e-Tendering systems, which are known to address most of them. However, the existing e-Tendering systems that have been developed are not based on the Nigerian legal procurement processes, and as such their suitability for local application is very limited. This paper is part of a larger study that attempts to address this problem through the development of an e-Tendering system that is based on the requirements of Nigerian public procuring entities. In this paper, the identified tendering processes commonly used by Nigerian public procuring entities in the selection of construction sources are presented. A multi-method research approach was used to identify those tendering processes. Specifically, 19 existing business use cases used by Nigerian public procuring entities were identified, and 61 system use cases were prescribed based on the identified business use cases. The use cases were used as the basis for the development of domain and software conceptual models. The models were successfully used to guide the development of an e-Tendering system called NPS-eTender.
Ripple and Unified Process were adopted as the software development methodologies.
Keywords: e-tendering, e-procurement, requirement model, conceptual model, public sector tendering, public procurement
Procedia PDF Downloads 195
251 The Measurement of City Brand Effectiveness as Methodological and Strategic Challenge: Insights from Individual Interviews with International Experts
Authors: A. Augustyn, M. Florek, M. Herezniak
Abstract:
Since public authorities are constantly pressured by public opinion to showcase the tangible and measurable results of their efforts, the evaluation of place brand-related activities becomes a necessity. Given the political and social character of the place branding process, the legitimization of branding efforts requires compliance of the objectives set out in the city brand strategy with the actual needs, expectations, and aspirations of various internal stakeholders. To deliver on these diverse promises, city authorities and brand managers need to translate them into measurable indicators against which the brand strategy's effectiveness will be evaluated. In concert with these observations are findings from the branding and marketing literature, with a widespread consensus that places should adopt a more systematic and holistic approach in order to ensure the performance of their brands. However, the measurement of the effectiveness of place branding remains insufficiently explored in theory, even though it is considered a significant step in the process of place brand management. Therefore, the aim of the research presented in the current paper was to collect insights on the nature of effectiveness measurement of city brand strategies and to juxtapose these findings with the theoretical assumptions formed on the basis of a state-of-the-art literature review. To this end, 15 international academic experts (out of 18 initially selected), with affiliations in ten countries on five continents, were individually interviewed. The same standardized set of 19 open-ended questions was used for all the interviewees, who had been selected based on their expertise and reputation in the fields of place branding and marketing.
Findings were categorized into four modules: (i) conceptualizations of city brand effectiveness, (ii) methodological issues of city brand effectiveness measurement, (iii) the nature of the measurement process, and (iv) articulation of key performance indicators (KPIs). Within each module, the interviewees offered diverse insights into the subject based on their academic expertise and professional activity as consultants. They proposed a twofold understanding of effectiveness: a narrow one, in which it is conceived as the aptitude to achieve specific goals, and a broad one, in which city brand effectiveness is seen as an improvement in the social and economic reality of a place, which in turn poses diverse challenges for measurement concepts and processes. Moreover, the respondents offered a variety of insights into methodological issues, particularly the need for customization and flexibility of measurement systems, for the employment of an interdisciplinary approach to measurement, and the implications resulting therefrom. Considerable emphasis was put on the inward approach to measurement, namely the necessity to monitor residents' evaluations of brand-related activities instead of benchmarking cities against a competitive set. Other findings encompass the issues of developing appropriate KPIs for the city brand, managing the measurement process, and including diverse stakeholders to produce a sound measurement system. Furthermore, the interviewees enumerated the most frequent mistakes in measurement, mainly resulting from a misunderstanding of the nature of city brands. This research was financed by the National Science Centre, Poland, research project no. 2015/19/B/HS4/00380 "Towards the categorization of place brand strategy effectiveness indicators – findings from strategic documents of Polish district cities – theoretical and empirical approach".
Keywords: city branding, effectiveness, experts' insights, measurement
Procedia PDF Downloads 145
250 Optimization of Assembly and Welding of Complex 3D Structures on the Base of Modeling with Use of Finite Elements Method
Authors: M. N. Zelenin, V. S. Mikhailov, R. P. Zhivotovsky
Abstract:
It is known that residual welding deformations negatively affect the processability and operational quality of welded structures, complicating their assembly and reducing their strength. Therefore, the selection of an optimal technology ensuring minimum welding deformations is one of the main goals in developing a technology for the manufacturing of welded structures. Over the years, JSC SSTC has been developing a theory for the estimation of welding deformations and practical methods for reducing and compensating such deformations during the welding process. For a long time, a methodology based on analytic dependences was used. This methodology allowed defining the volumetric changes of metal due to welding heating and subsequent cooling. However, the dependences for determining structural deformations arising from volumetric changes of metal in the weld area allowed calculations only for simple structures, such as units, flat sections, and sections with small curvature. In the case of complex 3D structures, estimations based on analytic dependences gave significant errors. To eliminate this shortcoming, it was suggested to use the finite elements method for solving the deformation problem. Here, one first calculates the longitudinal and transverse shortenings of the welded joints using the method of analytic dependences and then, with the obtained shortenings, calculates forces whose action is equivalent to the action of the active welding stresses. Next, a finite-element model of the structure is developed and the equivalent forces are added to this model. Based on the results of the calculations, an optimal sequence of assembly and welding is selected, and special measures to reduce and compensate welding deformations are developed and taken.
Keywords: residual welding deformations, longitudinal and transverse shortenings of welding joints, method of analytic dependences, finite elements method
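The equivalent-force step described above can be sketched as a simple elastic-bar calculation: the analytically estimated weld shortening is converted into an axial force that would produce the same contraction when applied to the finite-element model. All numeric values below are illustrative assumptions, not data from the paper.

```python
# Hedged sketch: converting an analytically estimated weld shortening into an
# equivalent axial force for a finite-element model. Values are illustrative.

E = 2.1e11      # Young's modulus of steel, Pa
A = 1.2e-3      # cross-sectional area engaged by the weld, m^2 (assumed)
L = 2.0         # length of the welded joint, m (assumed)
dL = 0.8e-3     # analytically estimated longitudinal shortening, m (assumed)

# Elastic bar relation: delta = F * L / (E * A)  =>  F = E * A * dL / L
F_equiv = E * A * dL / L
print(f"Equivalent shrinkage force: {F_equiv / 1e3:.1f} kN")
```

In the workflow the abstract describes, forces of this kind (longitudinal and transverse) are applied as nodal loads in the FE model, and the resulting global deformation pattern is then used to rank candidate assembly and welding sequences.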
Procedia PDF Downloads 409
249 Energy Consumption Estimation for Hybrid Marine Power Systems: Comparing Modeling Methodologies
Authors: Kamyar Maleki Bagherabadi, Torstein Aarseth Bø, Truls Flatberg, Olve Mo
Abstract:
Hydrogen fuel cells and batteries are among the promising solutions aligned with carbon emission reduction goals for the marine sector. However, the higher installation and operation costs of hydrogen-based systems compared to conventional diesel gensets raise questions about the appropriate hydrogen tank size and about energy and fuel consumption estimates. Ship designers need methodologies and tools to calculate energy and fuel consumption for different component sizes to facilitate decision-making regarding feasibility and performance for retrofit and design cases. The aim of this work is to compare three alternative modeling approaches for the estimation of energy and fuel consumption with various hydrogen tank sizes, battery capacities, and load-sharing strategies. A fishery vessel is selected as an example, using logged load demand data over a year of operations. The modeled power system consists of a PEM fuel cell, a diesel genset, and a battery. The methodologies used are: first, an energy-based model; second, a model considering load variations in the time domain with a rule-based power management system (PMS); and third, a load-variation model with a dynamic PMS strategy based on optimization with perfect foresight. The errors and potential of each method are discussed, and design sensitivity studies for this case are conducted. The results show that the energy-based method can estimate fuel and energy consumption with acceptable accuracy. However, models that consider the time variation of the load provide more realistic estimations of energy and fuel consumption with regard to hydrogen tank and battery size, while still requiring little computational time.
Keywords: fuel cell, battery, hydrogen, hybrid power system, power management system
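The second methodology, a time-domain simulation with a rule-based PMS, can be sketched as follows. The component ratings, dispatch rules, and sinusoidal load profile are illustrative assumptions, not the vessel data or the PMS used in the study.

```python
import numpy as np

# Toy rule-based PMS over a 24 h load profile at 1-minute resolution:
# rule 1: fuel cell covers base load up to its rating;
# rule 2: battery peak-shaves the remainder down to a 20% SOC floor;
# rule 3: the diesel genset takes whatever is left.
dt_h = 1 / 60                                         # timestep, hours
load_kw = 300 + 150 * np.sin(np.linspace(0, 8 * np.pi, 24 * 60))

FC_MAX = 350.0                                        # fuel cell rating, kW (assumed)
BATT_KWH = 500.0                                      # battery capacity, kWh (assumed)
SOC_MIN = 0.2 * BATT_KWH                              # 20% SOC floor
soc = 0.5 * BATT_KWH                                  # start at 50% SOC

genset_kwh = 0.0
h2_kwh = 0.0                                          # energy delivered by the fuel cell

for p in load_kw:
    fc = min(p, FC_MAX)                               # rule 1
    rest = p - fc
    batt = min(rest, max(0.0, (soc - SOC_MIN) / dt_h))  # rule 2 (power limited by SOC)
    soc -= batt * dt_h
    rest -= batt
    genset_kwh += rest * dt_h                         # rule 3
    h2_kwh += fc * dt_h

print(f"Fuel-cell energy: {h2_kwh:.0f} kWh, genset energy: {genset_kwh:.0f} kWh")
```

An energy-based model would instead integrate the load duration curve directly; comparing its totals against a simulation like this one is essentially the methodological comparison the abstract describes.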
Procedia PDF Downloads 36
248 The Effect of Foundation on the Earth Fill Dam Settlement
Authors: Masoud Ghaemi, Mohammadjafar Hedayati, Faezeh Yousefzadeh, Hoseinali Heydarzadeh
Abstract:
Careful monitoring of earth dams to measure deformation caused by settlement and movement has always been a concern for engineers in the field. In order to measure the settlement and deformation of earth dams, precision instruments combining a settlement set and an inclinometer, commonly referred to as IS instruments, are usually used. In some dams, because the alluvium is thick and its removal is not possible (technically, economically, and in terms of performance), there is no possibility of placing the end of the IS instrument in the rock foundation. Inevitably, one has to accept installing the pipes in the weak and deformable alluvial foundation, which leads to errors in the calculation of the actual (absolute) settlement in different parts of the dam body. The purpose of this paper is to present new and refined criteria for predicting settlement and deformation in earth dams. The study is based on conditions in three dams with quite deformable alluvial foundations (Agh Chai, Narmashir, and Gilan-e Gharb) in order to provide settlement criteria affected by the alluvial foundation. To achieve this goal, the settlement of the dams was simulated using the finite difference method with the FLAC3D software, and the modeling results were compared with the IS instrument readings. Finally, to calibrate the model and validate the results, regression analysis techniques were used to scrutinize the modeling parameters against real situations, and then, using MATLAB and its Curve Fitting Toolbox, new settlement criteria based on the elasticity modulus, cohesion, friction angle, and density of the earth dam and the alluvial foundation were obtained.
The results of these studies show that, by using the new criteria measures, the amount of settlement and deformation for the dams with alluvial foundation can be corrected after instrument readings, and the error rate in reading IS instrument can be greatly reduced.Keywords: earth-fill dam, foundation, settlement, finite difference, MATLAB, curve fitting
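The regression step described above can be illustrated with a minimal sketch: fit a correction relation between IS instrument readings and modeled settlements. The data points and the linear form are purely illustrative; the study's actual criteria involve elasticity modulus, cohesion, friction angle, and density.

```python
# A minimal sketch of the curve-fitting step: regress modeled
# (FLAC3D-style) settlements against instrument readings to obtain a
# correction relation. The data points below are synthetic, not the
# dams' actual measurements.
import numpy as np

instrument_mm = np.array([10.0, 22.0, 35.0, 48.0, 60.0])  # IS readings
modeled_mm    = np.array([12.0, 26.0, 40.0, 55.0, 69.0])  # model settlements

# Fit a linear correction: corrected = a * reading + b
a, b = np.polyfit(instrument_mm, modeled_mm, deg=1)

def corrected_settlement(reading_mm):
    """Apply the fitted correction to a raw instrument reading."""
    return a * reading_mm + b
```

MATLAB's Curve Fitting Toolbox would play the role of `np.polyfit` here, with candidate functional forms beyond the linear one.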
Procedia PDF Downloads 195247 An Analysis of Prefabricated Construction Waste: A Case Study Approach
Authors: H. Hakim, C. Kibert, C. Fabre, S. Monadizadeh
Abstract:
The construction industry is saddled with chronic problems of high waste generation. Waste management, which ensures that materials are utilized efficiently, would make a major contribution to mitigating the negative environmental impacts of construction waste, including the depletion of finite resources and the growth of occupied landfill area, to name a few. Furthermore, 'material resource efficiency' has been found to be an economically smart approach, especially when considered during the design phase. One effective strategy is to utilize off-site construction processes, which include a series of prefabricated systems such as mobile, modular, and HUD (Department of Housing and Urban Development manufactured) buildings. These types of buildings are by nature material- and resource-efficient. Unlike conventional construction, which is exposed to adverse weather conditions, a manufactured-construction production line creates repetitive units in a factory-controlled environment. A factory can run several parallel projects at high speed and in a timely manner, which simplifies storing excess materials and reallocating them to subsequent projects. The literature reports that prefabricated construction significantly helps reduce errors, site theft, rework, and delays, and can ultimately lead to a considerable waste reduction. However, there is not sufficient data to quantify this reduction when it comes to a regular modular house in the U.S. Therefore, this manuscript aims to provide an analysis of waste originating from a manufacturing factory. The analysis was made possible through several visits and data collection at Homes of Merits, a Florida manufactured and modular homebuilder. The results quantify and verify a noticeable construction waste reduction.Keywords: construction waste, modular construction, prefabricated buildings, waste management
Procedia PDF Downloads 267246 Category-Base Theory of the Optimum Signal Approximation Clarifying the Importance of Parallel Worlds in the Recognition of Human and Application to Secure Signal Communication with Feedback
Authors: Takuro Kida, Yuichi Kida
Abstract:
We mathematically present the basis of a new class of algorithms that treats a historical cause of continuing discrimination in the world, as well as its solution, by introducing the new concept of a parallel world that includes an invisible set of errors as its companion. Given a matrix operator filter bank in which the matrix operator analysis filter bank H and the matrix operator sampling filter bank S are specified, we first introduce a detailed algorithm to derive the optimum matrix operator synthesis filter bank Z that simultaneously minimizes all worst-case measures of the matrix operator error signals E(ω) = F(ω) − Y(ω) between the matrix operator input signals F(ω) and the matrix operator output signals Y(ω) of the filter bank. Feedback is then introduced into this approximation theory, and it is shown that introducing conversations with feedback is not automatically superior to the accumulation of existing knowledge for signal prediction. Secondly, the mathematical concept of a category is applied to the above optimum signal approximation, and it is shown that the category-based approximation theory applies to a set-theoretic treatment of the recognition of humans. Based on this discussion, it is shown why the narrow perception that tends to create isolation shows an apparent advantage in the short term, and why such narrow thinking often becomes intimate with discriminatory action in a human group. Throughout these considerations, it is argued that, in order to abolish easy and intimate discriminatory behavior, it is important to create a parallel world of conception in which we share the set of invisible error signals, including the words and the consciousness of both worlds.Keywords: signal prediction, pseudo inverse matrix, artificial intelligence, conditional optimization
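The role of the pseudo-inverse named in the keywords can be illustrated with a toy sketch: for a finite-dimensional stand-in for the analysis operator H, the synthesis operator Z = pinv(H) gives the least-squares-optimal reconstruction from the sampled signal. The dimensions and matrices here are arbitrary illustrations, not the paper's operator filter banks.

```python
# Toy sketch of the pseudo-inverse idea behind an optimum synthesis
# operator: given an analysis/sampling operator H, Z = pinv(H)
# minimizes the reconstruction error ||F - Z H F|| in the least-squares
# sense. Shapes and values are arbitrary illustrations.
import numpy as np

rng = np.random.default_rng(0)
H = rng.standard_normal((3, 5))   # wide operator: 5-dim signal -> 3 samples
Z = np.linalg.pinv(H)             # least-squares-optimal synthesis

F = rng.standard_normal(5)        # an input signal
Y = Z @ (H @ F)                   # reconstruction from the sampled signal

# Y is the orthogonal projection of F onto the row space of H, so
# re-sampling the reconstruction reproduces the original samples.
```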
Procedia PDF Downloads 156245 Impact Analysis of Quality Control Practices in Veterinary Diagnostic Labs in Lahore, Pakistan
Authors: Faiza Marrium, Masood Rabbani, Ali Ahmad Sheikh, Muhammad Yasin Tipu Javed Muhammad, Sohail Raza
Abstract:
More than 75% of the diseases that have spread in the human population globally over the past 10 years are linked to the veterinary sector. Veterinary diagnostic labs are a powerful ally for the diagnosis, prevention, and monitoring of animal diseases in any country. To avoid the detrimental effects of errors in disease diagnosis and biorisk management, there is a dire need to establish quality control systems. In the current study, 3 private and 6 public-sector veterinary diagnostic labs were selected for a survey. A questionnaire based on the biorisk management guidelines of CWA 15793 was designed to find quality control breaches in lab design, personnel, equipment and consumables, quality control measures adopted in the lab, waste management, environmental monitoring, and customer care. The data were analyzed statistically by frequency distribution using SPSS version 18.0. A non-significant difference was found in all parameters of lab design, personnel, equipment and consumables, quality control measures adopted in the lab, waste management, environmental monitoring, and customer care, with average percentages of 46.6, 57.77, 52.7, 55.5, 54.44, 48.88, and 60, respectively. A non-significant difference among all nine labs was found, with the highest average compliance percentages across all parameters being Lab 2 (78.13), Lab 3 (70.56), Lab 5 (57.51), Lab 6 (56.37), Lab 4 (55.02), Lab 9 (49.58), Lab 7 (47.76), Lab 1 (41.01), and Lab 8 (36.09). This study shows that veterinary diagnostic labs in the Lahore district are not giving proper attention to the quality of their systems and that there is no significant difference between the setups of private and public-sector laboratories. These results show that most parameters lie between 50 and 80 percent, which needs further work and improvement as per WHO criteria.Keywords: veterinary lab, quality management system, accreditation, regulatory body, disease identification
Procedia PDF Downloads 146244 Scenario-Based Learning Using Virtual Optometrist Applications
Authors: J. S. M. Yang, G. E. T. Chua
Abstract:
The Diploma in Optometry (OPT) course is a three-year program offered by Ngee Ann Polytechnic (NP) to train students to provide primary eye care. Students are equipped with foundational conceptual knowledge and practical skills in the first three semesters before clinical modules in the fourth to sixth semesters. In the clinical modules, students typically have difficulties in integrating the knowledge and skills acquired in past semesters to perform general eye examinations on public patients at NP Optometry Centre (NPOC). To help the students overcome this challenge, a web-based game, Virtual Optometrist (VO), was developed to help students apply their skills and knowledge through scenario-based learning. It consisted of two interfaces, Optical Practice Counter (OPC) and Optometric Consultation Room (OCR), to provide two simulated settings for authentic learning experiences. In OPC, students would recommend and provide appropriate frame and lens selection based on the virtual patient's case history. In OCR, students would diagnose and manage virtual patients with common ocular conditions. Simulated scenarios provided real-world clinical situations that required contextual application of integrated knowledge from relevant modules. The stages in OPC and OCR are of increasing complexity to align with students' expected clinical competency as they progress to more senior semesters. This prevented gameplay fatigue, as VO was used over the semesters to achieve different learning outcomes. Numerous feedback opportunities were provided to students based on their decisions to allow individualized learning to take place. The game-based learning element in VO was achieved through the scoreboard and leaderboard to enhance students' motivation to perform. Scores were based on the speed and accuracy of students' responses to the questions posed in the simulated scenarios, preparing the students to perform accurately and effectively under time pressure in a realistic optometric environment.
Learning analytics were generated in VO's back-end office based on students' responses, offering real-time data on distinctive, observable learner behavior to monitor students' engagement and learning progress. The back-end office also allowed the versatility to add, edit, and delete scenarios for different intended learning outcomes. A Likert scale was used to measure the learning experience with VO of OPT Year 2 and Year 3 students. The survey results highlighted the learning benefits of implementing VO in the different modules: enhancing recall and reinforcement of clinical knowledge for contextual application and the development of higher-order thinking skills, increasing efficiency in clinical decision-making, facilitating learning through immediate feedback and second attempts, providing exposure to common and significant ocular conditions, and training effective communication skills. VO thus allowed students to learn from their mistakes and exposed them, through simulated real-world clinical scenarios, to diverse ocular conditions that may otherwise not be encountered in NPOC.Keywords: authentic learning, game-based learning, scenario-based learning, simulated clinical scenarios
Procedia PDF Downloads 117243 Simulation-Based Validation of Safe Human-Robot-Collaboration
Authors: Titanilla Komenda
Abstract:
Human-machine collaboration is defined as direct interaction between humans and machines to fulfil specific tasks. These so-called collaborative machines are used without fencing and interact with humans in predefined workspaces. Even though human-machine collaboration enables flexible adaptation to variable degrees of freedom, industrial applications are rarely found. The reason for this is not a lack of technical progress but rather limitations in planning processes that must ensure operator safety. Until now, humans and machines were mainly considered separately in the planning process, focusing on ergonomics and system performance, respectively. Within human-machine collaboration, these aspects must not be viewed in isolation from each other but rather need to be analysed in interaction. Furthermore, a simulation model is needed that can validate system performance and ensure operator safety at any given time. Following on from this, a holistic simulation model is presented, enabling a simulative representation of collaborative tasks, including both humans and machines. The presented model includes not only a geometry and motion model of interacting humans and machines but also a numerical behaviour model of humans as well as a Boolean probabilistic sensor model. With this, error scenarios can be simulated, validating system behaviour in unplanned situations. As these models can be defined on the basis of a Failure Mode and Effects Analysis as well as error probabilities, their implementation in a collaborative model is discussed and evaluated with regard to limitations and simulation times. The functionality of the model is demonstrated on industrial applications by comparing simulation results with video data. The analysis shows the impact of considering human factors in the planning process, in contrast to meeting system performance targets alone.
In this sense, an optimisation function is presented that addresses the trade-off between human and machine factors and aids in the successful and safe realisation of collaborative scenarios.Keywords: human-machine-system, human-robot-collaboration, safety, simulation
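The Boolean probabilistic sensor model mentioned above can be sketched as a small Monte Carlo simulation: each of k redundant sensors detects the operator independently with some probability, and we estimate how often the system misses entirely. The detection probability, redundancy level, and trial count are illustrative assumptions, not values from the paper.

```python
# Minimal Monte Carlo sketch of a Boolean probabilistic sensor model:
# each of k redundant sensors independently detects the operator with
# probability p_detect; we estimate the rate of total misses.
import random

def miss_rate(p_detect=0.95, k=2, trials=100_000, seed=42):
    """Fraction of trials in which no sensor fires (an unsafe miss)."""
    rng = random.Random(seed)
    misses = 0
    for _ in range(trials):
        detected = any(rng.random() < p_detect for _ in range(k))
        misses += not detected
    return misses / trials

# Analytically, the miss probability is (1 - p_detect) ** k,
# i.e. 0.0025 for p_detect = 0.95 and k = 2.
```

Error probabilities taken from an FMEA would replace the assumed `p_detect` in a real planning model.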
Procedia PDF Downloads 361242 The Beauty of Islamic Etiquette: How an Elegant Muslim Woman Represents Her Culture in a Multicultural Society
Authors: Julia A. Ermakova
Abstract:
As a member of a multicultural society, it is imperative that individuals demonstrate the highest level of decorum in order to exemplify the beauty of their culture. Adab, the practice of praiseworthy words and deeds, as well as possessing good manners and pursuing that which is considered good, is a fundamental concept that guards against all types of mistakes. In Islam, etiquette for every situation in life is taught, and it constitutes the way of life for a Muslim. In light of this, the personality of an elegant Muslim woman can be described as one who embodies the following qualities: Firstly, cultural speech and erudition are essential components. Improving one's intellect, learning new things, reading diverse literature, expanding one's vocabulary, working on articulation, and avoiding obscene speech and verbosity are crucial. Additionally, listening more than speaking and being willing to discuss one's culture when asked are commendable qualities. Conversely, it is important to avoid discussing foolish matters with foolish people and to be able to respond appropriately and change the subject if someone attempts to hurt or manipulate. Secondly, the style of speech is also of paramount importance. It is recommended to speak in a measured tone with a quiet voice and deep breathing. Avoiding rushing and shortness of breath is also recommended. Thirdly, awareness of how to greet others is essential. Combining Shariah and small talk etiquette, such as making a gesture of respect by putting one's hand to the chest and smiling slightly when a man offers a handshake, is recommended. Understanding the rules of small talk, taboo topics, and self-presentation is also important. Fourthly, knowing how to give and receive compliments without devaluing them is imperative. Knowledge of the rules of good manners and etiquette, both secular and Shariah, is also essential. 
Fifthly, avoiding arguments and responding elegantly to rudeness and tactlessness is a sign of an elegant Muslim woman. Treating everyone with respect and avoiding prejudices, taboo topics, inappropriate questions, and bad habits are all aspects of politeness. Sixthly, a neat appearance appropriate to Shariah and the local community, as well as a well-put-together outfit with a touch of elegance and style, are crucial. Posture, graceful movement, and a pleasant gaze are also important. Finally, good spirits and inner calm are key to projecting a harmonious image, which encourages people to listen attentively. Giving thanks to Allah in every situation in life is the key to maintaining good spirits. In conclusion, an elegant Muslim woman in a multicultural society is characterized by her high moral qualities and adherence to Islamic etiquette. These qualities, such as cultural speech and erudition, style of speech, awareness of how to greet, knowledge of good manners and etiquette, avoiding arguments, politeness, a neat appearance, and good spirits, all contribute to projecting an image of elegance and respectability. By exemplifying these qualities, Muslim women can serve as positive ambassadors for their culture and religion in diverse societies.Keywords: adab, elegance, muslim woman, multicultural societies, good manners, etiquette
Procedia PDF Downloads 69241 Decision Making in Medicine and Treatment Strategies
Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi
Abstract:
Three reasons argue for the use of decision theory in medicine: 1. The growth and complexity of medical knowledge make it difficult to use treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying treatment opportunities in large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, these variations raise doubts about the relevance of the choices made by physicians. The differences are generally attributed to differing estimates of the probability that a treatment will succeed and to differing assessments of the consequences of success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making. For this, the decision process should be explained and broken down. A decision problem consists of selecting the best option from a set of choices. The problem is what is meant by "best option", that is, which criteria should guide the choice. The purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand differences in medical practice and facilitates the search for consensus. In this regard, there are three types of situations: certain situations, risky situations, and uncertain situations. 1. In certain situations, the consequence of each decision is certain. 2. In risky situations, each decision can have several consequences, and the probability of each consequence is known. 3. In uncertain situations, each decision can have several consequences, but their probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians.
Decision theory can make decisions more transparent: first, by systematically clarifying the data relevant to the problem, and second, by asking which basic principles should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, thus assisting the patient and doctor in their choices.Keywords: decision making, medicine, treatment strategies, patient
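The risky-situation case above reduces to expected-utility maximization, which can be sketched directly. The treatment options, probabilities, and utilities below are purely hypothetical illustrations.

```python
# Minimal sketch of choosing a treatment under risk: each option has
# outcomes with known probabilities and utilities; pick the option with
# the highest expected utility. All numbers are illustrative.

treatments = {
    # option: list of (probability, utility) pairs for its outcomes
    "surgery":    [(0.90, 1.0), (0.10, 0.0)],   # high payoff, some risk
    "medication": [(0.70, 0.9), (0.30, 0.4)],   # lower risk, lower payoff
}

def expected_utility(outcomes):
    """Probability-weighted average utility of an option's outcomes."""
    return sum(p * u for p, u in outcomes)

best = max(treatments, key=lambda t: expected_utility(treatments[t]))
```

Under uncertainty (unknown probabilities), criteria such as maximin over utilities would replace the probability weighting.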
Procedia PDF Downloads 579240 Probability Sampling in Matched Case-Control Study in Drug Abuse
Authors: Surya R. Niraula, Devendra B Chhetry, Girish K. Singh, S. Nagesh, Frederick A. Connell
Abstract:
Background: Although random sampling is generally considered to be the gold standard for population-based research, the majority of drug abuse research is based on non-random sampling, despite the well-known limitations of this kind of sampling. Method: We compared the statistical properties of two surveys of drug abuse in the same community: one using snowball sampling of drug users who then identified “friend controls” and the other using a random sample of non-drug users (controls) who then identified “friend cases.” Models to predict drug abuse based on risk factors were developed for each data set using conditional logistic regression. We compared the precision of each model using the bootstrapping method and the predictive properties of each model using receiver operating characteristic (ROC) curves. Results: Analysis of 100 random bootstrap samples drawn from the snowball-sample data set showed a wide variation in the standard errors of the beta coefficients of the predictive model, none of which achieved statistical significance. On the other hand, bootstrap analysis of the random-sample data set showed less variation and did not change the significance of the predictors at the 5% level when compared to the non-bootstrap analysis. The area under the ROC curve for the model derived from the random-sample data set was similar when fitted to either data set (0.93 for the random-sample data vs. 0.91 for the snowball-sample data, p=0.35); however, when the model derived from the snowball-sample data set was fitted to each of the data sets, the areas under the curve were significantly different (0.98 vs. 0.83, p < .001). Conclusion: The proposed method of random sampling of controls appears to be superior from a statistical perspective to snowball sampling and may represent a viable alternative to snowball sampling.Keywords: drug abuse, matched case-control study, non-probability sampling, probability sampling
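The bootstrap comparison described above can be sketched in miniature: resample a data set with replacement and examine the spread of an estimate across resamples. For simplicity, the sketch bootstraps the mean of synthetic data rather than conditional-logistic-regression coefficients; the values are made up.

```python
# Minimal bootstrap sketch: the spread of an estimate across resamples
# (its bootstrap standard error) mirrors how the study compared
# coefficient variability between the snowball and random samples.
import random
import statistics

def bootstrap_se(data, n_boot=2000, seed=1):
    """Bootstrap standard error of the sample mean."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in data]  # sample with replacement
        means.append(statistics.mean(resample))
    return statistics.stdev(means)

data = [2.1, 3.4, 2.9, 4.0, 3.3, 2.7, 3.8, 3.1]  # synthetic observations
se = bootstrap_se(data)
```

A wider spread of the bootstrapped estimates, as seen in the snowball sample, signals less precise (less trustworthy) coefficients.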
Procedia PDF Downloads 493239 Chronolgy and Developments in Inventory Control Best Practices for FMCG Sector
Authors: Roopa Singh, Anurag Singh, Ajay
Abstract:
Agriculture contributes a major share of the national economy of India. A major portion of the Indian economy (about 70%) depends upon agriculture, as it forms the main source of income. About 43% of India's geographical area is used for agricultural activity, which involves 65-75% of the total population of India. The given work deals with fast-moving consumer goods (FMCG) industries and their inventories, which use agricultural produce as raw material or input for their final products. Since the beginning of inventory practice, many developments have taken place, which can be categorized into three phases based on a review of various works. The first phase is related to the development and utilization of the Economic Order Quantity (EOQ) model and methods for optimizing costs and profits. The second phase deals with inventory optimization methods aimed at balancing capital investment constraints and service-level goals. The third and most recent phase has merged inventory control with control theory from electrical engineering. Holding inventory is generally considered negative, as a large amount of capital is blocked, especially in the mechanical and electrical industries. The case is different for food processing and agro-based industries and their inventories, because the cost of their raw materials varies cyclically, which is the reason these industries were selected for the present work. The application of control theory to inventory control makes decision-making highly instantaneous for FMCG industries without the loss of proposed profits that occurred during the first and second phases, mainly due to the late implementation of decisions. The work also replaces various inventory and work-in-progress (WIP) related errors with their monetary values, so that decision-making is fully target-oriented.Keywords: control theory, inventory control, manufacturing sector, EOQ, feedback, FMCG sector
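The EOQ model central to the first phase can be stated compactly: the order quantity minimizing the sum of annual ordering and holding costs is Q* = sqrt(2DS/H), where D is annual demand, S the cost per order, and H the holding cost per unit per year. The demand and cost figures below are illustrative.

```python
# The classic EOQ formula and its cost function, sketched directly.
import math

def eoq(annual_demand, order_cost, holding_cost_per_unit):
    """Economic Order Quantity: Q* = sqrt(2DS/H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost_per_unit)

def total_cost(q, annual_demand, order_cost, holding_cost_per_unit):
    """Annual ordering cost (D/q * S) plus annual holding cost (q/2 * H)."""
    return annual_demand / q * order_cost + q / 2 * holding_cost_per_unit

# Example with illustrative figures: D = 10,000 units/yr, S = 50, H = 2
q_star = eoq(10_000, 50, 2)   # sqrt(2*10000*50/2) = sqrt(500000) ≈ 707.1
```

The later, control-theoretic phase effectively replaces this static optimum with a feedback law that adjusts orders continuously as demand and costs vary.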
Procedia PDF Downloads 353238 An Experiment Research on the Effect of Brain-Break in the Classroom on Elementary School Students’ Selective Attention
Authors: Hui Liu, Xiaozan Wang, Jiarong Zhong, Ziming Shao
Abstract:
Introduction: Related research shows that students do not fully concentrate on the teacher in the classroom. The d2 attention test is a time-limited test of selective attention and can be used to evaluate individual selective attention. Purpose: To use the d2 attention test to measure the difference in attention levels between the experimental class and the control class before and after Brain-Breaks, and to explore the effect of Brain-Breaks in the classroom on students' selective attention. Methods: According to the principle of no difference in pre-test data, two fourth-grade classes at Shenzhen Longhua Central Primary School were selected. After 20 minutes of the third class in the morning and of the third class in the afternoon, an approximately 3-minute Brain-Break intervention was performed in the experimental class for 10 weeks; the control class received normal classes with no intervention. Before and after the experiment, the d2 attention test was used to measure the attention levels of both classes. Paired-sample and independent-sample t-tests in SPSS 23.0 were used to test changes in the attention levels of the two classes over the 10 weeks. This article presents only results with significant differences. Results: The independent-sample t-test showed that after ten weeks of Brain-Breaks, omission errors (E1, t = -2.165, p = 0.042), concentration performance (CP, t = 1.866, p = 0.05), and the omission rate (Epercent, t = -2.375, p = 0.029) in the experimental class differed significantly from the control class: the students' error level decreased and their concentration increased. Conclusions: Adding Brain-Break interventions in the classroom can, to a certain extent, effectively improve the attention levels of fourth-grade primary school students, in particular improving concentration and decreasing error rates in the tasks.
This new sports learning model is worth promoting.Keywords: cultural class, micromotor, attention, d2 test
Procedia PDF Downloads 132237 Behavioral and EEG Reactions in Children during Recognition of Emotionally Colored Sentences That Describe the Choice Situation
Authors: Tuiana A. Aiusheeva, Sergey S. Tamozhnikov, Alexander E. Saprygin, Arina A. Antonenko, Valentina V. Stepanova, Natalia N. Tolstykh, Alexander N. Savostyanov
Abstract:
The situation of choice is an important condition for the formation of essential character qualities in a child, such as initiative, responsibility, and diligence. We studied the behavioral and EEG reactions of Russian schoolchildren during the recognition of syntactic errors in emotionally colored sentences that describe a choice situation. Twenty healthy children (mean age 9.0±0.3 years, 12 boys, 8 girls) were examined. Forty sentences were selected for the experiment, half of which contained a syntactic error. The experiment additionally included a hidden condition: 50% of the sentences described the children's own choice and were emotionally colored (positively or negatively); the other 50% described a forced-choice situation, also with positive or negative coloring. EEG was recorded during execution of the error-recognition task. Reaction time and the quality of syntactic error detection were chosen as behavioral measures. Event-related spectral perturbation (ERSP) was applied to characterize the oscillatory brain activity of the children. Two time-frequency intervals appeared in the EEG reactions: (1) 500-800 ms in the 3-7 Hz frequency range (theta synchronization) and (2) 500-1000 ms in the 8-12 Hz range (alpha desynchronization). We found that the behavioral and brain reactions during recognition of positive and negative sentences describing the forced-choice situation did not differ significantly. Theta synchronization and alpha desynchronization were stronger during recognition of sentences involving the children's own choice, especially those with negative coloring; the quality and execution time of the task were also higher for this type of sentence. The results of our study will be useful for improving teaching methods and for the diagnostics of affective disorders in children.Keywords: choice situation, electroencephalogram (EEG), emotionally colored sentences, schoolchildren
Procedia PDF Downloads 269236 Modelling of Heat Generation in a 18650 Lithium-Ion Battery Cell under Varying Discharge Rates
Authors: Foo Shen Hwang, Thomas Confrey, Stephen Scully, Barry Flannery
Abstract:
Thermal characterization plays an important role in battery pack design. Lithium-ion batteries have to be maintained between 15-35 °C to operate optimally. Heat (Q) is generated internally within the batteries during both the charging and discharging phases. This can be quantified using several standard methods. The most common method of calculating a battery's heat generation is through the addition of both the joule heating effects and the entropic changes across the battery. Such values can be derived by identifying the open-circuit voltage (OCV), nominal voltage (V), operating current (I), battery temperature (T), and the rate of change of the open-circuit voltage with respect to temperature (dOCV/dT). This paper focuses on experimental characterization and comparative modelling of the heat generation rate (Q) across several current discharge rates (0.5C, 1C, and 1.5C) of a 18650 cell. The analysis is conducted utilizing several non-linear mathematical functions, including polynomial, exponential, and power models. Parameter fitting is carried out over the respective function orders: polynomial (n = 3~7), exponential (n = 2), and the power function. The fitted functions are then used as heat source functions in a 3-D computational fluid dynamics (CFD) solver under natural convection conditions. The generated temperature profiles are analyzed for errors against experimental discharge tests conducted at standard room temperature (25°C). Initial results display low deviation between the experimental and CFD temperature plots. As such, the heat generation function formulated could be more easily utilized for larger battery applications than other available methods.Keywords: computational fluid dynamics, curve fitting, lithium-ion battery, voltage drop
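The heat-generation relation described above, Q = I(OCV − V) + I·T·(dOCV/dT), combining joule/overpotential heating with the entropic term, can be sketched directly. The operating values below are illustrative, not the paper's measured data.

```python
# Standard cell heat-generation estimate:
# Q = I*(OCV - V) + I*T*(dOCV/dT)  [watts]
# First term: irreversible (joule/overpotential) heating.
# Second term: reversible entropic heating.

def heat_generation(current_a, ocv_v, terminal_v, temp_k, docv_dt_v_per_k):
    joule = current_a * (ocv_v - terminal_v)
    entropic = current_a * temp_k * docv_dt_v_per_k
    return joule + entropic

# Example: an 18650-class cell at roughly 1C discharge (~2.5 A);
# all numbers below are assumed, not measured.
q = heat_generation(current_a=2.5, ocv_v=3.7, terminal_v=3.55,
                    temp_k=298.15, docv_dt_v_per_k=-0.1e-3)
```

In the paper's workflow, Q values like this, fitted as polynomial/exponential/power functions of time or state of charge, become the heat-source term in the CFD solver.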
Procedia PDF Downloads 95235 Investment and Economic Growth: An Empirical Analysis for Tanzania
Authors: Manamba Epaphra
Abstract:
This paper analyzes the causal effect between domestic private investment, public investment, foreign direct investment and economic growth in Tanzania during the 1970-2014 period. A modified neo-classical growth model that includes control variables such as trade liberalization, life expectancy and macroeconomic stability, proxied by inflation, is used to estimate the impact of investment on economic growth. Also, economic growth models based on Phetsavong and Ichihashi (2012) and Le and Suruga (2005) are used to estimate the crowding-out effect of public investment on private domestic investment on the one hand and on foreign direct investment on the other. A correlation test is applied to check the correlation among independent variables, and the results show very low correlation, suggesting that multicollinearity is not a serious problem. Moreover, the diagnostic tests, including the RESET regression errors specification test, the Breusch-Godfrey serial correlation LM test, the Jarque-Bera normality test, and White's heteroskedasticity test, reveal that the model shows no signs of misspecification and that the residuals are serially uncorrelated, normally distributed, and homoskedastic. Generally, the empirical results show that domestic private investment plays an important role in economic growth in Tanzania. FDI also tends to affect growth positively, while control variables such as high population growth and inflation appear to harm economic growth. Results also reveal that control variables such as trade openness and improvements in life expectancy tend to increase real GDP growth. Moreover, the revealed negative, albeit weak, association between public and private investment suggests that the positive effect of domestic private investment on economic growth diminishes when the public investment-to-GDP ratio exceeds 8-10 percent.
Thus, there is a great need for promoting domestic saving so as to encourage domestic investment for economic growth.Keywords: FDI, public investment, domestic private investment, crowding out effect, economic growth
Procedia PDF Downloads 290234 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing
Authors: Yehjune Heo
Abstract:
As biometric systems become widely deployed, identification systems can be easily attacked with various spoofing materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the choice of loss functions and optimizers. The CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including cross-entropy, center loss, cosine proximity, and hinge loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. Using a subset of the LivDet 2017 database, we validate our approach by comparing generalization power. It is important to note that we use the same subset of LivDet across all training and testing for each model. This way, we can compare the performance, in terms of generalization, on the unseen data across all the different models. The best CNN (AlexNet) with the appropriate loss function and optimizer yields more than a 3% performance gain over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the models' accuracy together with their parameter counts and mean average error rates, in order to find the model that consumes the least memory and computation time for training and testing. Although AlexNet is less complex than the other CNN models, it proves to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and should run very fast with high anti-spoofing performance.
For our deployed version on smartphones, additional processing steps, such as quantization and pruning, were applied to the final model.
Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer
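The abstract's central claim, that different loss functions lead to different errors on the same evaluation, can be illustrated without the paper's training code. Below is a minimal numpy sketch (the names, values, and label encodings are illustrative assumptions, not the paper's data) showing the same set of predictions scored under binary cross-entropy and under hinge loss:

```python
import numpy as np

def cross_entropy(y_true, y_prob, eps=1e-12):
    """Binary cross-entropy averaged over samples; probabilities clipped for stability."""
    y_prob = np.clip(y_prob, eps, 1 - eps)
    return float(-np.mean(y_true * np.log(y_prob) + (1 - y_true) * np.log(1 - y_prob)))

def hinge(y_true, y_score):
    """Hinge loss; {0,1} labels are mapped to {-1,+1} as the margin formulation expects."""
    y_signed = 2 * y_true - 1
    return float(np.mean(np.maximum(0.0, 1.0 - y_signed * y_score)))

# Identical predictions, two different loss surfaces (hypothetical toy values)
y_true = np.array([1, 0, 1, 1, 0])
y_prob = np.array([0.9, 0.2, 0.6, 0.8, 0.4])   # probabilities for cross-entropy
y_score = 2 * y_prob - 1                        # rescaled to [-1, 1] for hinge

ce = cross_entropy(y_true, y_prob)
hg = hinge(y_true, y_score)
```

Because the two losses penalize the same mistakes differently, gradient descent follows different trajectories under each, which is why pairing each architecture with the right loss/optimizer combination matters.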
Procedia PDF Downloads 136
233 The Use of Unmanned Aerial System (UAS) in Improving the Measurement System on the Example of Textile Heaps
Authors: Arkadiusz Zurek
Abstract:
The potential of drones is visible in many areas of logistics, especially for monitoring and controlling processes. Technologies implemented in the last decade open new possibilities for companies that until now had not even considered them, such as warehouse inventories. Unmanned aerial vehicles are no longer seen as a revolutionary tool for Industry 4.0, but rather as tools in the daily work of factories and logistics operators. The research problem is to develop a method for measuring, by drone, the weight of goods in a selected link of the clothing supply chain. The purpose of this article is to analyse the causes of errors in traditional measurements and then to identify adverse events related to the use of drones for the inventory of a heap of textiles intended for production. On this basis, guidelines can be developed to eliminate the causes of these events in a drone-based measurement process. Work was carried out in a real environment to determine the volume and weight of the textiles, including weighing a textile sample to determine the average density of the assortment, establishing a local geodetic network, terrestrial laser scanning, and a photogrammetric flight with an unmanned aerial vehicle. Analysis of the measurement data obtained at the facility yielded the volume and weight of the assortment and the accuracy of their determination.
This article presents how such heaps are currently measured, describes the adverse events that occur, reviews the photogrammetric techniques of this type so far applied by outdoor drones to the inventory of wind farms or construction sites, and compares them with the measurement system for the aforementioned textile heap inside a large-format facility.
Keywords: drones, unmanned aerial system, UAS, indoor system, security, process automation, cost optimization, photogrammetry, risk elimination, industry 4.0
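The measurement chain described above, weighing a textile sample to obtain an average density and multiplying by the scanned heap volume, reduces to a simple calculation. A minimal sketch (all figures are hypothetical; the paper's actual measurements are not reproduced here):

```python
def heap_weight_kg(volume_m3: float, sample_mass_kg: float, sample_volume_m3: float) -> float:
    """Estimate heap weight as the photogrammetrically scanned volume
    times the average density obtained from a weighed textile sample."""
    density_kg_m3 = sample_mass_kg / sample_volume_m3  # kg/m^3 from the sample
    return volume_m3 * density_kg_m3

# Hypothetical figures: a 12.5 m^3 heap, with a 4.2 kg sample occupying 0.02 m^3
weight = heap_weight_kg(volume_m3=12.5, sample_mass_kg=4.2, sample_volume_m3=0.02)
```

The accuracy of the final weight therefore depends on both the volume determination (laser scanning/photogrammetry) and how representative the density sample is, which is where the adverse events the article catalogues enter the error budget.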
Procedia PDF Downloads 86
232 Enhancing Warehousing Operation in Cold Supply Chain through the Use of IoT and LiFi Technologies
Authors: Sarah El-Gamal, Passent Hossam, Ahmed Abd El Aziz, Rojina Mahmoud, Ahmed Hassan, Dalia Hilal, Eman Ayman, Hana Haytham, Omar Khamis
Abstract:
Several concerns fall upon the supply chain, especially the cold supply chain. According to the literature, the main challenges in the cold supply chain lie in the distribution and storage phases. In this research, the researchers focused on the storage area, which contains several activities, such as picking, that face many obstacles and challenges. The implementation of IoT solutions enables businesses to monitor the temperature of food items, which is perhaps the most critical parameter in cold chains. The researchers therefore proposed a practical solution to eliminate the problems of ineffective picking, especially for fish and seafood products, by using IoT technology, most notably LiFi technology, thus guaranteeing sufficient picking, reducing waste, and consequently lowering costs. A prototype was specially designed and examined. This research is a single case study; two methods of data collection were used: observation and semi-structured interviews. A total of three semi-structured interviews were conducted with managers and decision makers at Carrefour Alexandria to validate the problem and the proposed practical solution using IoT and LiFi technology. As a result, a SWOT analysis was carried out to highlight the strengths and weaknesses of using the recommended LiFi solution in the picking process. According to the investigations, the use of IoT and LiFi technology was found to be cost-effective and efficient: it reduces human errors and minimizes the percentage of product waste, thus saving money, increasing customer satisfaction, and raising profits.
Keywords: cold supply chain, picking process, temperature control, IoT, warehousing, LiFi
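The temperature monitoring the abstract identifies as the most critical cold-chain parameter amounts to checking sensor readings against a storage limit and flagging breaches. A minimal sketch, assuming hypothetical sensor names and a hypothetical 2 °C limit (the abstract does not specify thresholds or an API):

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    temp_c: float

# Hypothetical upper limit for fish/seafood cold storage; actual limits
# depend on the product and regulation, which the study does not detail.
MAX_TEMP_C = 2.0

def out_of_range(readings):
    """Return the ids of sensors whose temperature breaches the cold-chain limit."""
    return [r.sensor_id for r in readings if r.temp_c > MAX_TEMP_C]

alerts = out_of_range([
    Reading("shelf-A1", 1.4),
    Reading("shelf-A2", 3.1),  # breach, e.g. a door left open during picking
    Reading("shelf-B1", 0.9),
])
```

In the proposed setup, such alerts would be relayed over the warehouse's LiFi links to guide pickers away from compromised stock, which is how the solution ties temperature control to the picking process.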
Procedia PDF Downloads 192