Search results for: complexity measurement
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4164

3744 Corrosion Inhibition of Copper in 1M HNO3 Solution by Oleic Acid

Authors: S. Nigri, R. Oumeddour, F. Djazi

Abstract:

The inhibition of copper corrosion in 1 M HNO3 solution by oleic acid was investigated by weight loss measurements, potentiodynamic polarization, and scanning electron microscopy (SEM). The experimental results showed that this compound provides good corrosion inhibition, with the inhibition efficiency increasing with inhibitor concentration to reach 98%. The adsorption of the inhibitor molecules onto the metal surface was found to obey the Langmuir adsorption isotherm. The effect of temperature on the corrosion behavior of copper in 1 M HNO3, without and with inhibitor at different concentrations, was studied over the range 303 to 333 K, and the kinetic activation parameters Ea, ∆Ha, and ∆Sa were evaluated. Tafel plot analysis revealed that oleic acid acts as a mixed-type inhibitor. SEM analysis substantiated the formation of a protective layer over the copper surface.
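
For concreteness, the data treatment behind such results can be sketched in a few lines of Python: inhibition efficiency is computed from weight-loss rates, and the Langmuir isotherm is checked by fitting C/θ against C, which should be linear with near-unit slope if the isotherm holds. The weight-loss values and concentrations below are hypothetical placeholders, not the authors' data.

```python
import numpy as np

# Hypothetical weight-loss data (mg cm^-2 h^-1); not from the paper.
conc = np.array([1e-4, 5e-4, 1e-3, 5e-3, 1e-2])   # inhibitor concentration (M)
w_blank = 2.50                                     # corrosion rate without inhibitor
w_inh = np.array([1.10, 0.62, 0.35, 0.12, 0.05])   # corrosion rate with inhibitor

# Inhibition efficiency (%) and surface coverage theta from weight loss.
ie = (w_blank - w_inh) / w_blank * 100.0
theta = ie / 100.0

# Langmuir isotherm test: C/theta = 1/K_ads + C should be linear in C.
slope, intercept = np.polyfit(conc, conc / theta, 1)
k_ads = 1.0 / intercept  # adsorption equilibrium constant

print(f"IE%: {np.round(ie, 1)}")
print(f"Langmuir slope ~1 expected: {slope:.3f}, K_ads = {k_ads:.1f} M^-1")
```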

Keywords: oleic acid, weight loss, electrochemical measurement, SEM analysis

Procedia PDF Downloads 372
3743 Modular Robotics and Terrain Detection Using Inertial Measurement Unit Sensor

Authors: Shubhakar Gupta, Dhruv Prakash, Apoorv Mehta

Abstract:

In this project, we design a modular robot capable of using and switching between multiple methods of propulsion and of classifying terrain based on input from an Inertial Measurement Unit (IMU). We wanted to make a robot that is not only intelligent in its functioning but also versatile in its physical design. The advantage of a modular robot is that it can be designed to hold several movement apparatuses, such as wheels, legs for a hexapod or quadruped setup, propellers for underwater locomotion, or any other solution that may be needed. The robot takes roughness input from the gyroscope and accelerometer in the IMU and, based on the terrain classification produced by an artificial neural network, decides which method of propulsion would best optimize its movement. This gives the robot adaptability across a range of terrains, allowing it to optimize its locomotion according to surface roughness. Such a feature would be a great asset in autonomous exploration or research drones.
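
A minimal sketch of this kind of pipeline is given below; the window size, roughness features, network architecture, and labels are all illustrative assumptions, since the paper does not publish them. It extracts roughness statistics from IMU windows and feeds them to a small neural-network classifier.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def imu_features(accel, gyro):
    """Roughness features from one window of IMU samples (Nx3 arrays)."""
    return np.hstack([accel.std(axis=0), gyro.std(axis=0),
                      np.abs(np.diff(accel, axis=0)).mean(axis=0)])

# Hypothetical training data: windows of IMU samples with terrain labels.
rng = np.random.default_rng(0)
X = np.array([imu_features(rng.normal(0, s, (100, 3)),
                           rng.normal(0, s, (100, 3)))
              for s in rng.uniform(0.05, 2.0, 300)])
y = (X[:, 0] > 0.8).astype(int)  # stand-in labels: 0 = smooth, 1 = rough

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
# At runtime: pick wheels on smooth terrain, legs on rough terrain.
print("predicted terrain class:", clf.predict(X[:3]))
```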

Keywords: modular robotics, terrain detection, terrain classification, neural network

Procedia PDF Downloads 119
3742 Location Detection of Vehicular Accident Using Global Navigation Satellite Systems/Inertial Measurement Units Navigator

Authors: Neda Navidi, Rene Jr. Landry

Abstract:

Vehicle tracking and accident recognition are of interest to many industries, such as insurance and vehicle rental companies. The main goal of this paper is to detect the location of a car accident by combining different methods. The methods considered in this paper are Global Navigation Satellite Systems/Inertial Measurement Units (GNSS/IMU)-based navigation and vehicle accident detection algorithms. They operate on a set of raw measurements obtained from a purpose-built integrated black box using GNSS and inertial sensors. A further contribution of this paper is the definition of an accident detection algorithm based on vehicle jerk, used to identify the position of the accident. The results showed that, even in GNSS blockage areas, the position of the accident could be detected by GNSS/INS integration with a 50% improvement compared to stand-alone GNSS.
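
The jerk-based detection idea can be illustrated with a short sketch: jerk is the time derivative of acceleration, and a sample whose jerk magnitude exceeds a threshold flags a candidate crash, whose index can then be mapped to a GNSS/INS position fix. The sampling rate and threshold below are illustrative assumptions, not values from the paper.

```python
import numpy as np

FS = 100.0              # IMU sampling rate (Hz), illustrative assumption
JERK_THRESHOLD = 400.0  # m/s^3, illustrative crash threshold

def detect_accident(accel):
    """accel: (N, 3) accelerometer samples in m/s^2.
    Returns indices where the jerk magnitude exceeds the threshold."""
    jerk = np.diff(accel, axis=0) * FS          # finite-difference derivative
    jerk_mag = np.linalg.norm(jerk, axis=1)
    return np.flatnonzero(jerk_mag > JERK_THRESHOLD)

# Synthetic trace: smooth driving with an impact spike at sample 500.
accel = np.random.normal(0.0, 0.3, size=(1000, 3))
accel[500] += np.array([80.0, 0.0, 0.0])        # sudden deceleration
hits = detect_accident(accel)
print("accident flagged at samples:", hits)  # map to a GNSS/INS position fix
```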

Keywords: driver behavior monitoring, integration, IMU, GNSS, monitoring, tracking

Procedia PDF Downloads 200
3741 Customer Adoption and Attitudes in Mobile Banking in Sri Lanka

Authors: Prasansha Kumari

Abstract:

This paper intends to identify and analyze customer adoption of, and attitudes towards, mobile banking facilities. The study uses six perceived characteristics of an innovation that can form a favorable or unfavorable attitude toward it: relative advantage, compatibility, complexity, trialability, risk, and observability. The collected data were analyzed using the Pearson chi-square test. The results showed that mobile banking users were predominantly male, and that there is a growing trend among young, educated customers towards converting to mobile banking in Sri Lanka. The research outcomes suggested that all six factors are statistically highly significant in influencing mobile banking adoption and attitude formation towards mobile banking in Sri Lanka. The major reasons for adopting mobile banking services are the accessibility and availability of services regardless of time and place. Over 75 percent of the respondents mentioned that the savings in time and effort and the low financial costs of conducting mobile banking were advantageous. The issue of security was found to be the most important factor motivating consumer adoption and attitude formation towards mobile banking. The main barriers to mobile banking were the lack of technological skills, the traditional cash-carry banking culture, and the lack of awareness and insufficient guidance in using mobile banking.
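
The analysis described, testing whether adoption is associated with a categorical factor, is the standard Pearson chi-square test of independence; a minimal sketch with made-up counts (not the study's data) looks like this:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: perceived complexity vs. adoption.
#                 adopters  non-adopters
table = np.array([[120,      40],    # low perceived complexity
                  [ 55,      85]])   # high perceived complexity

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A small p-value indicates the factor and adoption are not independent.
```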

Keywords: compatibility, complexity, mobile banking, observability, risk

Procedia PDF Downloads 174
3740 Framework Development of Carbon Management Software Tool in Sustainable Supply Chain Management of Indian Industry

Authors: Sarbjit Singh

Abstract:

This framework development explored the status of green supply chain management (GSCM) in manufacturing SMEs and concluded that there was a significant gap with respect to carbon emissions measurement in supply chain activities. The measurement of carbon emissions within supply chains is an important green initiative toward their reduction. The majority of the SMEs faced problems in quantifying the greenhouse gas (GHG) emissions in their supply chains and in turning them into low-carbon supply chains, i.e., GSCM. Thus, carbon management initiatives were amalgamated with the supply chain activities in order to measure and reduce carbon emissions, conforming to the GHG Protocol scopes. This work therefore covers the development of a carbon management software (CMS) tool to quantify carbon emissions for effective carbon management. The tool is inexpensive and easy for industry to use in managing carbon emissions within the supply chain.
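
The core calculation in any such tool is activity data multiplied by an emission factor, summed per GHG Protocol scope; a minimal sketch with hypothetical activity data and emission factors (real tools would use published factors) is shown below.

```python
# Hypothetical activity data and emission factors (kg CO2e per unit);
# real tools would use published factors (e.g., national inventories).
EMISSION_FACTORS = {
    ("diesel_litre", 1): 2.68,       # scope 1: owned vehicles/boilers
    ("grid_kwh", 2): 0.82,           # scope 2: purchased electricity
    ("freight_tonne_km", 3): 0.11,   # scope 3: outsourced transport
}

activities = {"diesel_litre": 1500.0, "grid_kwh": 42000.0,
              "freight_tonne_km": 8000.0}

totals = {1: 0.0, 2: 0.0, 3: 0.0}
for (activity, scope), factor in EMISSION_FACTORS.items():
    totals[scope] += activities.get(activity, 0.0) * factor

for scope, kg in totals.items():
    print(f"Scope {scope}: {kg / 1000:.2f} t CO2e")
print(f"Total supply-chain footprint: {sum(totals.values()) / 1000:.2f} t CO2e")
```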

Keywords: carbon emissions, carbon management software, supply chain management, Indian industry

Procedia PDF Downloads 437
3739 The Methods of Customer Satisfaction Measurement and Its Statistical Analysis towards Sales and Logistic Activities in Food Sector

Authors: Seher Arslankaya, Bahar Uludağ

Abstract:

Meeting the needs and demands of customers and pleasing them are important requirements for companies in the food sector, where the growth of competition is significantly unpredictable. Customer satisfaction is also one of the key concepts, driven mainly by the wide range of customer preferences and expectations regarding the products and services introduced and delivered to them. In order to meet customer demands, companies in the food sector are expected to have a well-managed Total Quality Management (TQM) system, which sets out to improve the quality of products and services, reduce costs, and increase customer satisfaction by restructuring traditional management practices and by meeting customer expectations and requirements. Achievement is determined with the help of customer satisfaction surveys, which are conducted to obtain immediate feedback and provide quick responses. In addition, the surveys assist strategic planning, helping to anticipate customers' future needs and expectations. Periodic measurement of customer satisfaction is a must, because with a better understanding of customer perceptions from the surveys (administered as questionnaires), companies can clearly identify their own strengths and weaknesses, which helps them keep their loyal customers, stand comparison with their competitors, and map out their future progress and improvement. In this study, we propose a survey-based customer satisfaction measurement method and its statistical analysis for the sales and logistics activities of food firms. Customer satisfaction is discussed in detail. Furthermore, after analyzing the data derived from the questionnaire administered to customers using the SPSS software, the various results obtained are presented. By also applying the ANOVA test, the study analyzes whether meaningful differences exist between customer demographic groups and their perceptions. A further purpose of this study is to find the requirements that help remove the effects that decrease customer satisfaction and that produce loyal customers in the food industry. For this purpose, customer complaints are collected, and comments and suggestions are made according to the survey results, which should be useful for strategic planning in the food industry.
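
The demographic comparison described corresponds to a one-way ANOVA on satisfaction scores grouped by a demographic variable; the study itself uses SPSS, but the equivalent test can be sketched with fabricated scores as follows.

```python
from scipy.stats import f_oneway

# Hypothetical satisfaction scores (1-5 Likert) grouped by age band.
under_30 = [4, 5, 4, 3, 4, 5, 4]
age_30_50 = [3, 4, 3, 3, 4, 3, 2]
over_50 = [2, 3, 2, 3, 2, 3, 3]

f_stat, p_value = f_oneway(under_30, age_30_50, over_50)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# p < 0.05 suggests mean satisfaction differs across demographic groups.
```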

Keywords: customer satisfaction measurement and analysis, food industry, SPSS, TQM

Procedia PDF Downloads 228
3738 Prediction of Bubbly Plume Characteristics Using the Self-Similarity Model

Authors: Li Chen, Alex Skvortsov, Chris Norwood

Abstract:

Gas released into water occurs in many industrial situations. This process results in the formation of bubbles and in acoustic emission that depends on the bubble characteristics. If the bubble creation rate (bubble volume flow rate) is of interest, an inverse method has to be used based on the measurement of acoustic emission. However, there will be sound attenuation through the bubbly plume, which will influence the measurement and should be taken into consideration in the model. The sound transmission through the bubbly plume depends on the characteristics of the plume, such as its shape and the bubble distributions. In this study, the bubbly plume shape is modelled using a self-similarity model, which is normally applied to single-phase buoyant plumes. The prediction is compared with experimental data. It was found that the model can be applied to a buoyant plume of gas-liquid mixture. The influence of the gas flow rate and discharge nozzle size is also studied.
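
For context, in classical self-similarity models for buoyant plumes, to which the abstract's approach is related, the plume half-width grows linearly with height and the radial velocity profiles collapse onto a Gaussian. The sketch below illustrates that structure with illustrative constants; it is not the paper's closure or data.

```python
import numpy as np

BETA = 0.11  # entrainment-based spreading rate, illustrative value

def plume_profile(z, r, w_source=1.0, z0=0.05):
    """Self-similar Gaussian profile for a buoyant plume.
    b(z) grows linearly with height; the centerline velocity decays as
    z^(-1/3) for a pure plume. Values are illustrative, not fitted."""
    b = BETA * z                                 # plume half-width
    w_c = w_source * (z / z0) ** (-1.0 / 3.0)    # centerline velocity decay
    return w_c * np.exp(-(r / b) ** 2)           # w(r, z)

z = 1.0                     # height above the nozzle (m)
r = np.linspace(0, 0.3, 4)  # radial positions (m)
print(np.round(plume_profile(z, r), 4))
```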

Keywords: bubbly plume, buoyant plume, bubble acoustics, self-similarity model

Procedia PDF Downloads 264
3737 Designing Presentational Writing Assessments for the Advanced Placement World Language and Culture Exams

Authors: Mette Pedersen

Abstract:

This paper outlines the criteria that assessment specialists use when they design the 'Persuasive Essay' task for the four Advanced Placement World Language and Culture Exams (AP French, German, Italian, and Spanish). The 'Persuasive Essay' is a free-response, source-based, standardized measure of presentational writing. Each 'Persuasive Essay' item consists of three sources (an article, a chart, and an audio) and a prompt, which is a statement of the topic phrased as an interrogative sentence. Owing to the richness of its source materials and the amount of time that test takers are given to prepare and write their responses (a total of 55 minutes), the 'Persuasive Essay' is the free-response task on the AP World Language and Culture Exams that goes to the greatest lengths to unleash the test takers' proficiency potential. The author focuses on the work that goes into designing the 'Persuasive Essay' task, outlining best practices for the selection of topics and sources, the interplay that needs to be present among the sources, and the thinking behind the articulation of prompts for the task. Using released 'Persuasive Essay' items from the AP World Language and Culture Exams and accompanying data on test taker performance, the author shows how different passages, and features of passages, have succeeded (and sometimes not succeeded) in eliciting writing proficiency among test takers over time. Data from approximately 215,000 test takers per year from 2014 to 2017 and approximately 35,000 test takers per year from 2012 to 2013 form the basis of this analysis. The conclusion of the study is that test taker performance improves significantly when the sources that test takers are presented with express directly opposing viewpoints. Test taker performance also improves when the interrogative prompt that the test takers respond to is phrased as a yes/no question. Finally, an analysis of the linguistic difficulty and complexity levels of the printed sources reveals that test taker performance does not decrease when the complexity level of the article of the 'Persuasive Essay' increases. This last text complexity analysis is performed with the help of the 'ETS TextEvaluator' tool and the 'Complexity Scale for Information Texts (Scale)', two tools that, in combination, provide a rubric and a fully-automated technology for evaluating nonfiction and informational texts in English translation.

Keywords: advanced placement world language and culture exams, designing presentational writing assessments, large-scale standardized assessments of written language proficiency, source-based language testing

Procedia PDF Downloads 118
3736 Component Interface Formalization in Robotic Systems

Authors: Anton Hristozov, Eric Matson, Eric Dietz, Marcus Rogers

Abstract:

Components are heavily used in many software systems, including robotic systems. The growing sophistication and diversity of new capabilities for robotic systems present new challenges to their architectures. Their complexity is growing exponentially with the advent of AI, smart sensors, and the complex tasks they have to accomplish. Such complexity requires a more rigorous approach to the creation, use, and interoperability of software components. The issue is exacerbated because robotic systems are becoming more and more reliant on third-party components for certain functions. In order to achieve this kind of interoperability, including dynamic component replacement, we need a way to standardize component interfaces. A formal approach is needed to specify what the interface of a robotic software component should contain. This study analyzes the issue and presents a universal and generic approach to standardizing component interfaces for robotic systems. Our approach is inspired by well-established robotic architectures such as ROS, PX4, and ArduPilot. The study is also applicable to other software systems that share similar characteristics with robotic systems. We consider the use of JSON or Domain-Specific Language (DSL) development with tools such as ANTLR, together with automatic code and configuration file generation for frameworks such as ROS and PX4. A case study with ROS 2 is presented as a proof of concept for the proposed methodology.
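
To illustrate the general idea (this is a hypothetical schema, not the authors' specification), a component interface can be declared as a machine-readable JSON document listing published and subscribed topics, their types, and parameters, from which ROS 2 boilerplate could then be generated:

```python
import json

# Hypothetical interface declaration for a robotic component.
interface = {
    "component": "obstacle_detector",
    "version": "1.0",
    "publishes": [
        {"topic": "/obstacles", "type": "vision_msgs/msg/Detection3DArray",
         "rate_hz": 20}
    ],
    "subscribes": [
        {"topic": "/camera/points", "type": "sensor_msgs/msg/PointCloud2"}
    ],
    "parameters": [
        {"name": "min_range_m", "type": "float", "default": 0.3}
    ],
}

def validate(spec):
    """Minimal structural check a code generator might run first."""
    required = {"component", "version", "publishes", "subscribes", "parameters"}
    missing = required - spec.keys()
    if missing:
        raise ValueError(f"interface missing fields: {missing}")

validate(interface)
print(json.dumps(interface, indent=2))
```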

Keywords: CPS, robots, software architecture, interface, ROS, autopilot

Procedia PDF Downloads 66
3735 Surfactant Improved Heavy Oil Recovery in Sandstone Reservoirs by Wettability Alteration

Authors: Rabia Hunky, Hayat Kalifa, Bai

Abstract:

The wettability of carbonate reservoirs has been widely recognized as an important parameter in oil recovery by flooding technology, and many surfactants have been studied for this application. However, the importance of wettability alteration in sandstone reservoirs by surfactants has been poorly studied. In this paper, our recent study of the relationship between rock surface wettability and cumulative oil recovery for sandstone cores is reported. In our research, good agreement was found between wettability and oil recovery. The nonionic surfactants Tomadol® 25-12 and Tomadol® 45-13 are very effective in altering the wettability of sandstone core surfaces from highly oil-wet to water-wet conditions. In spontaneous imbibition tests and interfacial tension and contact angle measurements, these two surfactants exhibited the highest recovery of the synthetic oil made with heavy oil. Based on these experimental results, we can further conclude that contact angle measurements and imbibition tests can be used as rapid screening tools to identify better EOR surfactants for increasing heavy oil recovery from sandstone reservoirs.

Keywords: EOR, oil gas, IOR, WC, IF, oil and gas

Procedia PDF Downloads 75
3734 A Configurational Approach to Understand the Effect of Organizational Structure on Absorptive Capacity: Results from PLS and fsQCA

Authors: Murad Ali, Anderson Konan Seny Kan, Khalid A. Maimani

Abstract:

Based on the theory of organizational design and the theory of knowledge, this study uses complexity theory to explain and better understand the causal impacts of various patterns of organizational structural factors stimulating absorptive capacity (ACAP). Organizational structure can be thought of as a heterogeneous configuration in which various components are often intertwined. This study argues that the impact of the traditional variables that define a firm's organizational structure (centralization, formalization, complexity, and integration) on ACAP is better understood in terms of set-theoretic relations rather than correlations. The study uses a sample of 347 observations from multiple industrial sectors in South Korea. The results from PLS-SEM support all the hypothesized relationships among the variables. The fsQCA results, in turn, suggest the possible configurations of centralization, formalization, complexity, integration, age, size, industry, and revenue factors that contribute to a high level of ACAP, and demonstrate the usefulness of configurational approaches in helping to understand equifinality in the field of knowledge management. A recent fsQCA procedure based on a modeling subsample and a holdout subsample is used in this study to assess the predictive validity of the model under investigation; the same type of predictive analysis is also carried out with PLS-SEM. These analyses reveal the good predictive relevance of the causal solutions leading to a high level of ACAP. Overall, the results obtained from combining PLS-SEM and fsQCA are very insightful. In particular, they could help managers link internal organizational structure with ACAP; in other words, managers may understand in fine detail how different components of organizational structure can increase the level of ACAP. The configurational approach may trigger new insights that could help managers prioritize selection criteria and understand the interactions between organizational structure and ACAP. The paper also discusses the theoretical and managerial implications arising from these findings.

Keywords: absorptive capacity, organizational structure, PLS-SEM, fsQCA, predictive analysis, modeling subsample, holdout subsample

Procedia PDF Downloads 308
3733 Simulation Model for Optimizing Energy in Supply Chain Management

Authors: Nazli Akhlaghinia, Ali Rajabzadeh Ghatari

Abstract:

In today's world, with increasing environmental awareness, firms are facing severe pressure from various stakeholders, including the government and customers, to reduce their harmful effects on the environment. Over the past few decades, the increasing effects of global warming, climate change, waste, and air pollution have drawn growing expert attention worldwide to the green supply chain and to finding optimal solutions for greening it. Green supply chain management (GSCM) plays an important role in motivating the sustainability of the organization. Given these environmental concerns, the main objective of this research is to use systems-thinking methodology and the Vensim software to design a system dynamics model of a green supply chain and observe its behavior. Using this methodology, we look for the effects of a green supply chain structure on the behavioral dynamics of output variables. We simulate the complexity of GSCM over a period of 30 months and observe the behavior of variables including sustainability, the provision of green products, and energy consumption, and consequently the reduction of pollution.
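
System dynamics models of this kind reduce to stocks updated by flows over discrete time steps; a toy Euler-integration sketch in that spirit (the structure and coefficients are illustrative, not the paper's Vensim model) follows.

```python
# Toy system-dynamics (stock-and-flow) simulation over 30 months.
# Structure and coefficients are illustrative only.
DT = 1.0                    # time step: one month
energy_use = 1000.0         # stock: monthly energy consumption (MWh)
green_share = 0.10          # stock: fraction of products that are green

history = []
for month in range(30):
    # Flows: green investment raises the green share; efficiency programs
    # cut energy use in proportion to the current green share.
    green_growth = 0.03 * (1.0 - green_share)
    energy_savings = 0.02 * green_share * energy_use

    green_share += DT * green_growth
    energy_use -= DT * energy_savings
    history.append((month + 1, round(green_share, 3), round(energy_use, 1)))

print("month, green_share, energy_use (MWh)")
for row in history[::6]:
    print(row)
```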

Keywords: supply chain management, green supply chain management, system dynamics, energy consumption

Procedia PDF Downloads 116
3732 Improvement of Camera Calibration Based on the Relationship between Focal Length and Aberration Coefficient

Authors: Guorong Sui, Xingwei Jia, Chenhui Yin, Xiumin Gao

Abstract:

In camera-based high-precision, non-contact measurement, geometric-optical aberration inevitably disturbs the measuring system. Moreover, the aberration differs at different focal lengths, which increases the difficulty of calibrating the system. Understanding the relationship between focal length and aberration properties is therefore a very important issue for the calibration of such measuring systems. In this study, we propose a new mathematical model based on Zhang Zhengyou's plane calibration method and establish a relationship between the focal length and the aberration coefficient. By using this mathematical model and carefully modified compensation templates, the calibration precision of the system can be dramatically improved. The experimental results show that the relative error is less than 1%. This is important for optoelectronic imaging systems that measure, track, and position targets by changing the camera's focal length.
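
For context, plane calibration commonly uses the radial distortion model x_d = x_u(1 + k1 r² + k2 r⁴); the sketch below inverts it with made-up coefficients, and in a focal-length-aware calibration k1 and k2 would become functions of the focal length. This is a generic illustration, not the paper's model.

```python
import numpy as np

def undistort_points(pts, k1, k2, center):
    """Invert the radial distortion x_d = x_u * (1 + k1 r^2 + k2 r^4)
    by a few fixed-point iterations. Coefficients are illustrative."""
    p = pts - center
    undist = p.copy()
    for _ in range(5):                       # fixed-point refinement
        r2 = (undist ** 2).sum(axis=1, keepdims=True)
        undist = p / (1.0 + k1 * r2 + k2 * r2 ** 2)
    return undist + center

# In a focal-length-aware calibration, k1 and k2 would be functions of f,
# e.g. k1(f), k2(f) fitted from calibration runs at several zoom settings.
pts = np.array([[420.0, 310.0], [600.0, 455.0]])
print(undistort_points(pts, k1=1e-7, k2=1e-13, center=np.array([320.0, 240.0])))
```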

Keywords: camera calibration, aberration coefficient, vision measurement, focal length, mathematical model

Procedia PDF Downloads 336
3731 Modeling of Particle Reduction and Volatile Compounds Profile during Chocolate Conching by Electronic Nose and Genetic Programming (GP) Based System

Authors: Juzhong Tan, William Kerr

Abstract:

Conching is a critical procedure in chocolate processing, during which characteristic flavors develop and the smooth mouthfeel of the chocolate is achieved through particle size reduction of the cocoa mass and other additives. Determination of the particle size and the volatile compound profile of the cocoa is therefore important for chocolate manufacturers to ensure the quality of their products. Currently, precise particle size measurement is usually done by laser scattering, which is expensive and inaccessible to small and medium-size chocolate manufacturers, while alternatives such as micrometers and microscopy cannot provide good measurements and yield little information. Volatile compound analysis of cocoa during conching has similar problems due to its high cost and limited accessibility. In this study, a self-made electronic nose system consisting of gas sensors (TGS 800 and 2000 series) was installed in a conching machine and used to monitor the volatile compound profile of the chocolate during conching. A genetic programming model was established that correlates the volatile compound profiles, together with factors including cocoa content, sugar content, and conching temperature, with the particle size of the chocolate particles. The model was used to predict the particle size reduction of chocolates with different cocoa mass to sugar ratios (1:2, 1:1, 1.5:1, 2:1) at eight conching times (15 min, 30 min, 1 h, 1.5 h, 2 h, 4 h, 8 h, and 24 h), and the predictions were compared to laser scattering measurements of the same chocolate samples. 91.3% of the predictions were within ±5% of the laser scattering measurement, and 99.3% were within ±10%.

Keywords: cocoa bean, conching, electronic nose, genetic programming

Procedia PDF Downloads 228
3730 Use of In-line Data Analytics and Empirical Model for Early Fault Detection

Authors: Hyun-Woo Cho

Abstract:

Automatic process monitoring schemes are designed to give early warnings of unusual process events or abnormalities as soon as possible. To this end, various techniques have been developed and utilized in many industrial processes, including multivariate statistical methods, representations in reduced spaces, kernel-based nonlinear techniques, etc. This work presents a nonlinear empirical monitoring scheme for batch-type production processes with incomplete process measurement data. While normal operation data are easy to obtain, fault data occur infrequently and are thus difficult to collect. In this work, noise filtering steps are added in order to enhance monitoring performance by eliminating irrelevant information from the data. The performance of the monitoring scheme was demonstrated using batch process data. The results showed that the monitoring performance was improved significantly in terms of the detection success rate for process faults.
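
As a generic illustration of this family of monitoring schemes, the sketch below builds a linear PCA model of normal data and flags new samples with a Hotelling T² chart; the paper itself uses a kernel-based nonlinear empirical model whose details are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Normal operating data: 200 batches x 10 process measurements.
X_normal = rng.normal(0, 1, (200, 10))

pca = PCA(n_components=3).fit(X_normal)
scores = pca.transform(X_normal)
score_var = scores.var(axis=0)

def t2(x):
    """Hotelling T^2 statistic of one new sample in the PCA subspace."""
    s = pca.transform(x.reshape(1, -1))[0]
    return float(((s ** 2) / score_var).sum())

limit = np.percentile([t2(x) for x in X_normal], 99)  # empirical 99% limit

x_fault = rng.normal(0, 1, 10) + 4.0   # drifted batch measurement
print(f"T2 = {t2(x_fault):.1f}, limit = {limit:.1f}, "
      f"fault = {t2(x_fault) > limit}")
```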

Keywords: batch process, monitoring, measurement, kernel method

Procedia PDF Downloads 303
3729 Scalable Systolic Multiplier over Binary Extension Fields Based on Two-Level Karatsuba Decomposition

Authors: Chiou-Yng Lee, Wen-Yo Lee, Chieh-Tsai Wu, Cheng-Chen Yang

Abstract:

Shifted polynomial basis (SPB) is a variation of the polynomial basis representation. SPB has potential for efficient bit-level and digit-level implementations of multiplication over binary extension fields with subquadratic space complexity. For efficient implementation of pairing computation with large finite fields, this paper presents a new SPB multiplication algorithm based on Karatsuba schemes and uses it to derive a novel scalable multiplier architecture. Analytical results show that the proposed multiplier provides a trade-off between space and time complexities. Our proposed multiplier is modular, regular, and suitable for very-large-scale integration (VLSI) implementations. It involves less area complexity than multipliers based on traditional decomposition methods. It is therefore more suitable for efficient hardware implementation of pairing-based cryptography and elliptic curve cryptography (ECC) in constraint-driven applications.
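
The core trick can be illustrated on GF(2) polynomials packed into Python integers (carry-less arithmetic, where addition is XOR): Karatsuba replaces four half-size products with three, which is the source of the subquadratic complexity mentioned above. This is a generic one-level-recursive sketch, not the paper's two-level SPB decomposition.

```python
def clmul(a, b):
    """Schoolbook carry-less multiply of GF(2) polynomials in integers."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def karatsuba_gf2(a, b, bits):
    """Karatsuba for GF(2)[x]: 3 half-size multiplies instead of 4."""
    if bits <= 32:
        return clmul(a, b)
    half = bits // 2
    mask = (1 << half) - 1
    a0, a1 = a & mask, a >> half
    b0, b1 = b & mask, b >> half
    lo = karatsuba_gf2(a0, b0, half)
    hi = karatsuba_gf2(a1, b1, half)
    mid = karatsuba_gf2(a0 ^ a1, b0 ^ b1, half) ^ lo ^ hi  # cross terms
    return lo ^ (mid << half) ^ (hi << (2 * half))

a, b = 0x1B3F9C2D5E7A8146, 0xF0E1D2C3B4A59687
assert karatsuba_gf2(a, b, 64) == clmul(a, b)
print(hex(karatsuba_gf2(a, b, 64)))
```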

Keywords: digit-serial systolic multiplier, elliptic curve cryptography (ECC), Karatsuba algorithm (KA), shifted polynomial basis (SPB), pairing computation

Procedia PDF Downloads 338
3728 An Application of Extreme Value Theory as a Risk Measurement Approach in Frontier Markets

Authors: Dany Ng Cheong Vee, Preethee Nunkoo Gonpot, Noor Sookia

Abstract:

In this paper, we consider the application of Extreme Value Theory (EVT) as a risk measurement tool. The Value at Risk (VaR) for a set of indices from six frontier-market stock exchanges is calculated using the Peaks over Threshold method, and the performance of the model is evaluated index by index using coverage tests and loss functions. Our results show that 'fat-tailedness' of the data alone is not enough to justify the use of EVT as a VaR approach; the structure of the returns dynamics is also a determining factor. The approach works well in markets that have experienced extremes in the past, making the model capable of coping with extremes yet to come (the Colombo, Tunisia, and Zagreb stock exchanges). On the other hand, we find that indices with lower past than present volatility fail to deal adequately with future extremes (Mauritius and Kazakhstan). We also conclude that using EVT alone produces quite static VaR figures that do not reflect the actual dynamics of the data.
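
A minimal sketch of the Peaks-over-Threshold VaR calculation follows, using simulated fat-tailed returns and an arbitrary 95th-percentile threshold; the paper's own data and threshold choices are not reproduced.

```python
import numpy as np
from scipy.stats import genpareto, t

rng = np.random.default_rng(7)
losses = -t.rvs(df=3, size=2000, random_state=rng) * 0.01  # fat-tailed losses

u = np.quantile(losses, 0.95)          # threshold: 95th percentile of losses
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)  # fit GPD to the excesses

# POT formula: VaR_q = u + (beta/xi) * [((n/N_u) * (1-q))^(-xi) - 1]
n, n_u, q = len(losses), len(excesses), 0.99
var_q = u + (beta / xi) * ((n / n_u * (1 - q)) ** (-xi) - 1)
print(f"threshold u = {u:.4f}, xi = {xi:.3f}, 99% VaR = {var_q:.4f}")
```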

Keywords: extreme value theory, financial crisis 2008, value at risk, frontier markets

Procedia PDF Downloads 260
3727 An Experimental Study on the Measurement of Fuel to Air Ratio Using Flame Chemiluminescence

Authors: Sewon Kim, Chang Yeop Lee, Minjun Kwon

Abstract:

This study aims to establish the relationship between the optical signal of a flame and its equivalence ratio. In the experiment, the flame optical signal in a furnace is measured using a photodiode. The combustion system is composed of a metal fiber burner and a vertical furnace, and flame chemiluminescence is measured under various experimental conditions. In this study, the chemiluminescence of a laminar premixed flame is measured using a commercially available photodiode, and the relationship between the equivalence ratio and the photodiode signal is investigated experimentally. In addition, a combustion control strategy using the optical signal and the fuel pressure is proposed. The results showed that a definite relationship exists between the photodiode signal and the equivalence ratio, which enables the successful application of this system for instantaneous measurement of the equivalence ratio of the combustion system.

Keywords: flame chemiluminescence, photodiode, equivalence ratio, combustion control

Procedia PDF Downloads 379
3726 Fracture Crack Monitoring Using Digital Image Correlation Technique

Authors: B. G. Patel, A. K. Desai, S. G. Shah

Abstract:

The main objective of this paper is to develop a new measurement technique that does not touch the object. Digital Image Correlation (DIC) is an advanced measurement technique used to measure the displacement of particles with very high accuracy; this powerful, innovative technique correlates two image segments to determine the similarity between them. For this study, nine geometrically similar beam specimens of different sizes, with fibers (steel and glass) and without fibers, were tested under three-point bending in a closed-loop servo-controlled machine under crack mouth opening displacement (CMOD) control at an opening rate of 0.0005 mm/sec. Digital images were captured before loading (undeformed state) and at different instances of loading, and were analyzed using correlation techniques to compute the surface displacements, crack opening and sliding displacements, load-point displacement, crack length, and crack tip location. It was seen that the CMOD and vertical load-point displacement computed using DIC analysis match well with those measured experimentally.
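
At its core, DIC tracks a subset of the reference image by maximizing a normalized cross-correlation score over candidate displacements; a bare-bones integer-pixel sketch on synthetic speckle images (no subpixel refinement, which production DIC codes add) follows.

```python
import numpy as np

def ncc(a, b):
    """Zero-normalized cross-correlation between two equal-size subsets."""
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def track_subset(ref, cur, y, x, size=15, search=5):
    """Find the integer-pixel displacement of the subset at (y, x)."""
    tpl = ref[y:y + size, x:x + size]
    best, best_uv = -2.0, (0, 0)
    for v in range(-search, search + 1):
        for u in range(-search, search + 1):
            cand = cur[y + v:y + v + size, x + u:x + u + size]
            score = ncc(tpl, cand)
            if score > best:
                best, best_uv = score, (v, u)
    return best_uv, best

rng = np.random.default_rng(3)
ref = rng.random((100, 100))                   # speckle pattern (undeformed)
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))  # rigid shift: v=2, u=3
print(track_subset(ref, cur, 40, 40))          # expect ((2, 3), ~1.0)
```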

Keywords: Digital Image Correlation, fibres, self-compacting concrete, size effect

Procedia PDF Downloads 366
3725 Assessing Perinatal Mental Illness during the COVID-19 Pandemic: A Review of Measurement Tools

Authors: Mya Achike

Abstract:

Background and Significance: Perinatal mental illness covers a wide range of conditions and has a huge influence on maternal-child health. Issues and challenges with perinatal mental health have been associated with poor pregnancy, birth, and postpartum outcomes. It is estimated that one out of five new and expectant mothers experiences some degree of perinatal mental illness, which makes this a hugely significant health outcome. Certain factors increase the maternal risk for mental illness: challenges related to poverty, migration, extreme stress, exposure to violence, emergency and conflict situations, natural disasters, and pandemics can exacerbate mental health disorders. It is widely expected that perinatal mental health is being negatively affected during the present COVID-19 pandemic. Methods: A review of studies that reported a measurement tool to assess perinatal mental health outcomes during the COVID-19 pandemic was conducted following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. PubMed, CINAHL, and Google Scholar were used to search for peer-reviewed studies published after late 2019, consistent with the emergence of the virus. The search resulted in the inclusion of ten studies. Approach to measuring the health outcome: The main approach to measuring perinatal mental illness is the use of self-administered, validated questionnaires, usually in the clinical setting. Summary: Widespread use of these tools has afforded the clinical and research communities the ability to identify and support women who may be suffering from mental illness during a pandemic. More research is needed to validate tools in other vulnerable perinatal populations.

Keywords: mental health during covid, perinatal mental health, perinatal mental health measurement tools, perinatal mental health tools

Procedia PDF Downloads 109
3724 Dynamic Measurement System Modeling with Machine Learning Algorithms

Authors: Changqiao Wu, Guoqing Ding, Xin Chen

Abstract:

In this paper, ways of modeling dynamic measurement systems are discussed. Specifically, a linear system with a single input and single output can be modeled with a shallow neural network, with gradient-based optimization algorithms used to search for the proper coefficients. In addition, methods based on the normal equation and on second-order gradient descent are proposed to accelerate the modeling process, and ways of obtaining better gradient estimates are discussed. It is shown that the mathematical essence of the learning objective is maximum likelihood under Gaussian noise. For conventional gradient descent, mini-batch learning and gradient with momentum contribute to faster convergence and enhance model capability. Lastly, experimental results proved the effectiveness of the second-order gradient descent algorithm and indicated that optimization with the normal equation is the most suitable for linear dynamic models.
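
For the linear single-input single-output case, the normal-equation solution is closed-form; the sketch below fits a finite-impulse-response model of a measurement system to synthetic data (the system, noise level, and model order are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic linear SISO measurement system: y = h * u + Gaussian noise.
h_true = np.array([0.5, 0.3, 0.15, 0.05])        # true impulse response
u = rng.normal(0, 1, 500)                        # input signal
y = np.convolve(u, h_true)[:500] + rng.normal(0, 0.01, 500)

# Build the regression matrix of lagged inputs (FIR model, order 4).
order = 4
X = np.column_stack([np.r_[np.zeros(k), u[:500 - k]] for k in range(order)])

# Normal equation: theta = (X^T X)^-1 X^T y  (least squares = Gaussian ML).
theta = np.linalg.solve(X.T @ X, X.T @ y)
print("estimated impulse response:", np.round(theta, 3))
```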

Keywords: dynamic system modeling, neural network, normal equation, second order gradient descent

Procedia PDF Downloads 102
3723 Hardware Implementation of Local Binary Pattern Based Two-Bit Transform Motion Estimation

Authors: Seda Yavuz, Anıl Çelebi, Aysun Taşyapı Çelebi, Oğuzhan Urhan

Abstract:

Nowadays, the demand for devices capable of real-time video transmission is ever-increasing, and high-resolution video has made efficient video compression techniques an essential component of capturing and transmitting video data. Motion estimation plays a critical role in encoding raw video, and various motion estimation methods have been introduced to compress video efficiently. Motion estimation methods based on low bit-depth representations simplify the computation of the matching criterion and thus have a small hardware footprint. In this paper, a hardware implementation of a two-bit-transform-based low-complexity motion estimation method using a local binary pattern approach is proposed. Image frames are represented at two-bit depth instead of full depth by using the local binary pattern as the binarization approach, and the binarization part of the hardware architecture is explained in detail. Experimental results demonstrate the difference between the proposed hardware architecture and the architectures of well-known low-complexity motion estimation methods in terms of important aspects such as resource utilization and energy and power consumption.
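
The payoff of low bit-depth motion estimation is that block matching reduces to XOR-plus-popcount (the number of non-matching points) instead of sums of absolute differences. The sketch below uses a simple above-the-mean binarization purely for illustration; the paper's method derives two bit planes via a local binary pattern transform.

```python
import numpy as np

def binarize(block):
    """1-bit representation: pixel above the block mean -> 1. Illustrative;
    the paper derives two bit planes via a local-binary-pattern transform."""
    return (block > block.mean()).astype(np.uint8)

def nnmp(a, b):
    """Number of non-matching points: XOR + popcount replaces SAD."""
    return int(np.count_nonzero(a ^ b))

def full_search(ref, cur, y, x, bs=16, sr=4):
    """Best motion vector for the current block via binary full search."""
    cur_b = binarize(cur[y:y + bs, x:x + bs])
    best, mv = bs * bs + 1, (0, 0)
    for v in range(-sr, sr + 1):
        for u in range(-sr, sr + 1):
            cand = binarize(ref[y + v:y + v + bs, x + u:x + u + bs])
            cost = nnmp(cur_b, cand)
            if cost < best:
                best, mv = cost, (v, u)
    return mv, best

rng = np.random.default_rng(2)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, shift=(1, -2), axis=(0, 1))  # content shifted by (1, -2)
print(full_search(ref, cur, 24, 24))            # expect ((-1, 2), 0)
```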

Keywords: binarization, hardware architecture, local binary pattern, motion estimation, two-bit transform

Procedia PDF Downloads 281
3722 Spaces of Interpretation: Personal Space

Authors: Yehuda Roth

Abstract:

In quantum theory, a system's time evolution is predictable unless an observer performs a measurement, since the measurement process can randomize the system. This randomness appears when the measuring device does not accurately describe the measured item, i.e., when the states characterizing the measuring device appear as a superposition of those being measured. When such a mismatch occurs, the measured data randomly collapse into a single eigenstate of the measuring device. This scenario resembles the interpretation process, in which the observer does not experience an objective reality but interprets it based on preliminary descriptions initially ingrained in his or her mind. This resemblance is the motivation for the present study, in which the collapse scenario is regarded as part of the observer's interpretation process. By adopting the formalism of quantum theory, we present a complete mathematical approach that describes the interpretation process. We demonstrate this process by applying the proposed interpretation formalism to the ambiguous image "My wife and mother-in-law" to identify whether the woman in the picture is young or old.
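
The collapse rule invoked here is the standard Born-rule projection; a small numerical sketch (with a two-state 'young/old' basis chosen purely for illustration) shows a superposed state collapsing into one eigenstate of the measuring basis with the corresponding probability.

```python
import numpy as np

rng = np.random.default_rng(11)

# Measuring-device basis: |young> and |old> (illustrative labels).
young = np.array([1.0, 0.0])
old = np.array([0.0, 1.0])

# The observed state is a superposition of the device's eigenstates.
psi = 0.8 * young + 0.6 * old          # already normalized: 0.64 + 0.36 = 1

probs = [abs(young @ psi) ** 2, abs(old @ psi) ** 2]   # Born rule
outcome = rng.choice(["young", "old"], p=probs)        # random collapse
print(f"P(young) = {probs[0]:.2f}, P(old) = {probs[1]:.2f} -> saw: {outcome}")
```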

Keywords: quantum-like interpretation, ambiguous image, determination, quantum-like collapse, classified representation

Procedia PDF Downloads 78
3721 Defence Ethics: A Performance Measurement Framework for the Defence Ethics Program

Authors: Allyson Dale, Max Hlywa

Abstract:

The Canadian public expects the highest moral standards from Canadian Armed Forces (CAF) members and Department of National Defence (DND) employees. The Chief, Professional Conduct and Culture (CPCC) stood up in April 2021 with the mission of ensuring that the defence culture and members' conduct are aligned with the ethical principles and values that the organization aspires towards. The Defence Ethics Program (DEP), which stood up in 1997, is a values-based ethics program for individuals and organizations within the DND/CAF and now falls under CPCC. The DEP is divided into five key functional areas: policy, communications, collaboration, training and education, and advice and guidance. The main focus of the DEP is to foster an ethical culture within defence so that members and organizations perform to the highest ethical standards. The measurement of organizational ethics is often complex and challenging. In order to monitor whether the DEP is achieving its intended outcomes, a performance measurement framework (PMF) was developed using the Director General Military Personnel Research and Analysis (DGMPRA) PMF development process, an evidence-based process built on subject-matter expertise from the defence team. The goal of this presentation is to describe each stage of the DGMPRA PMF development process and to present and discuss the products of the DEP PMF (e.g., the logic model). Specifically, first, a strategic framework was developed to provide a high-level overview of the strategic objectives, mission, and vision of the DEP. Next, Key Performance Questions were created based on the objectives in the strategic framework. A logic model detailing the activities, outputs (what is produced by the program activities), and intended outcomes of the program was developed to demonstrate how the program works. Finally, Key Performance Indicators were developed based on both the intended outcomes in the logic model and the Key Performance Questions in order to monitor program effectiveness. The Key Performance Indicators measure aspects of organizational ethics such as ethical conduct and decision-making, DEP collaborations, and knowledge and awareness of the Defence Ethics Code, while leveraging ethics-related items from multiple DGMPRA surveys where appropriate.

Keywords: defence ethics, ethical culture, organizational performance, performance measurement framework

Procedia PDF Downloads 80
3720 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry generates large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory, pharmacy, etc. Healthcare data are so big and complex that they cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data accumulate, and their variety: structured, semi-structured, and unstructured. Despite this complexity, if the trends and patterns that exist within big data are uncovered and analyzed, higher-quality healthcare can be provided at lower cost. Hadoop is an open source software framework for the distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System (HDFS), which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. This paper analyzes how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
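
The MapReduce model named above boils down to a map step emitting key-value pairs, a shuffle that groups them by key, and a reduce step aggregating each group. The pure-Python simulation below runs on made-up patient records; in production, this logic would run as a Hadoop job with HDFS supplying the records.

```python
from collections import defaultdict

# Made-up EPR-style records: (patient_id, diagnosis_code, charge).
records = [("p1", "E11", 120.0), ("p2", "I10", 80.0),
           ("p3", "E11", 200.0), ("p4", "I10", 95.0), ("p5", "E11", 60.0)]

def mapper(record):
    """Emit (diagnosis, charge) pairs, one per record."""
    _, diagnosis, charge = record
    yield diagnosis, charge

def reducer(diagnosis, charges):
    """Aggregate all charges observed for one diagnosis code."""
    return diagnosis, {"n": len(charges), "total": sum(charges)}

# Shuffle phase: group mapper output by key (Hadoop does this for us).
groups = defaultdict(list)
for rec in records:
    for key, value in mapper(rec):
        groups[key].append(value)

for diagnosis, charges in groups.items():
    print(reducer(diagnosis, charges))
```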

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 385
3719 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. Modeling option prices with Black-Scholes-type models with jumps guarantees that market movements are taken into account. However, such models can only be solved numerically, and not all numerical methods solve them efficiently, because the payoffs are nonsmooth or have discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solving the partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as the matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial-fraction form of the Padé schemes is used to overcome the complexity of inverting polynomials of matrices. These two tools guarantee efficient and accurate numerical solutions. We construct a parallel and easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is.
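
The FFT speedup is possible because the dense matrix arising from discretizing the jump integral is Toeplitz, and a Toeplitz matrix-vector product can be embedded in a circulant one and evaluated via FFT in O(M log M). The sketch below demonstrates the embedding on a generic Toeplitz matrix, not the paper's Merton/Kou kernel.

```python
import numpy as np

def toeplitz_matvec_fft(col, row, x):
    """Multiply a Toeplitz matrix (first column `col`, first row `row`)
    by x in O(M log M) by embedding it in a 2M-point circulant matrix."""
    m = len(x)
    # First column of the circulant embedding.
    c = np.concatenate([col, [0.0], row[:0:-1]])
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x, 2 * m))
    return y[:m].real

# Generic Toeplitz test (stands in for a discretized jump kernel).
m = 8
col = np.exp(-np.arange(m, dtype=float))      # first column
row = np.exp(-2 * np.arange(m, dtype=float))  # first row (row[0] == col[0])
A = np.array([[col[i - j] if i >= j else row[j - i] for j in range(m)]
              for i in range(m)])
x = np.arange(1.0, m + 1)
assert np.allclose(A @ x, toeplitz_matvec_fft(col, row, x))
print(toeplitz_matvec_fft(col, row, x))
```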

Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model

Procedia PDF Downloads 129
3718 Evidence of Scientific-Ness of Scriptures

Authors: Shyam Sunder Gupta

Abstract:

GOD is the infinite source of knowledge and, from time to time, as per the need of mankind, keeps revealing a portion of HIS knowledge as "Words" through chosen messengers. In the course of time, "Words" get converted into scripture. This process of conversion happens after a long gap of time and with the involvement of a large number of persons, and scientific and other types of errors unintentionally get into scriptures; otherwise, scriptures are, in reality, truly scientific. Descriptions of the chronology of life in the womb (fetal development), five types of rotations of celestial bodies, the speed of the Sun, the rotation of the universe, multiple verses, the spherical shape of the Earth, the measurement of time, the classification of species by nature of birth, the evolution process of non-living matter and living species, etc., most convincingly prove that scriptures are truly scientific. In fact, there are many facts for which science has not yet found answers but which are available in scriptures, such as the source of the singularity from which the Big Bang took place and the universe was created, the infinite number of universes, the most fundamental particle, Param-anu (the God particle), the fundamentals of the measurement of time, and many more.

Keywords: Big Bang, God Particle, Scientific, Scriptures, Singularity, Universe

Procedia PDF Downloads 16
3717 Performance Evaluation of the CareSTART S1 Analyzer for Quantitative Point-Of-Care Measurement of Glucose-6-Phosphate Dehydrogenase Activity

Authors: Haiyoung Jung, Mi Joung Leem, Sun Hwa Lee

Abstract:

Background & Objective: Glucose-6-phosphate dehydrogenase (G6PD) deficiency is a genetic abnormality that results in an inadequate amount of G6PD, leading to increased susceptibility of red blood cells to reactive oxygen species and hemolysis. The present study aimed to evaluate the careSTART™ S1 analyzer for measuring the ratio of G6PD activity to hemoglobin (Hb). Methods: Precision for G6PD activity and hemoglobin measurement was evaluated using control materials at two levels on five repeated runs per day for five days. The analytical performance of the careSTART™ S1 analyzer was compared with spectrophotometry in 40 patient samples. The reference ranges suggested by the manufacturer were validated in 20 healthy males and 20 healthy females. Results: The careSTART™ S1 analyzer demonstrated precision of 6.0% for the low-level control (14~45 U/dL) and 2.7% for the high-level control (60~90 U/dL) in G6PD activity, and 1.4% in hemoglobin (7.9~16.3 u/g Hb). A comparison of the G6PD-to-Hb ratio between the careSTART™ S1 analyzer and spectrophotometry showed an average difference of 29.1%, with a positive bias for the careSTART™ S1 analyzer. All samples from the healthy population were validated against the suggested reference range for males (≥2.19 U/g Hb) and females (≥5.83 U/g Hb). Conclusion: The careSTART™ S1 analyzer demonstrated good analytical performance and can replace the current spectrophotometric measurement of G6PD enzyme activity. From the standpoint of clinical laboratory management, it is a reasonable option as a point-of-care analyzer, requiring minimal handling of samples and reagents and automatically calculating the ratio of measured G6PD activity to Hb concentration, which minimizes the clerical errors involved in manual calculation.

Keywords: POCT, G6PD, performance evaluation, careSTART

Procedia PDF Downloads 48
3716 Investigating a Modern Accident Analysis Model for Textile Building Fires through Numerical Reconstruction

Authors: Mohsin Ali Shaikh, Weiguo Song, Rehmat Karim, Muhammad Kashan Surahio, Muhammad Usman Shahid

Abstract:

Fire investigations face challenges due to the complexity of fire development, and real-world accidents lack repeatability, which makes it difficult to apply standardized approaches. The unpredictable nature of fires and the unique conditions of each incident add to this complexity, requiring innovative methods and tools for effective analysis and reconstruction. This study proposes a modern accident analysis model based on numerical reconstruction for fire investigation in textile buildings. The method employs computer simulation to enhance the overall effectiveness of textile-building investigations: the materials and evidence collected from past incidents are used to reconstruct the fire's occurrence, progression, and catastrophic process. The approach is demonstrated through a case study involving a tragic textile factory fire in Karachi, Pakistan, which claimed 257 lives. The reconstruction method proves invaluable for determining fire origins, assessing losses, establishing accountability and, significantly, providing preventive insights for complex fire incidents.

Keywords: fire investigation, numerical simulation, fire safety, fire incident, textile building

Procedia PDF Downloads 44
3715 Mending Broken Fences Policing: Developing the Intelligence-Led/Community-Based Policing Model (IP-CP) and Quality/Quantity/Crime (QQC) Model

Authors: Anil Anand

Abstract:

Despite enormous strides made during the past decade, particularly with the adoption and expansion of community policing, there remains much that police leaders can do to improve police-public relations. The urgency is particularly evident in cities across the United States and Europe, where an increasing number of police interactions over the past few years have ignited large, sometimes even national, protests against police policy and strategy, highlighting a gap between what police leaders feel they have achieved in terms of public satisfaction, support, and legitimacy and the perception of bias among many marginalized communities. The decision on which policing strategy is chosen over another, how many resources are allocated, and how strenuously the policy is applied resides primarily with the police and the units and subunits tasked with its enforcement. The scope and opportunity that police officers have to shape social attitudes and social policy are important elements that cannot be overstated. How do police leaders, for instance, decide when to apply one strategy, say community-based policing, over another, like intelligence-led policing? How do police leaders measure performance and success? Should these measures be based on quantitative criteria rather than qualitative ones, or on some other criteria? And how do police leaders define, allow, and control discretionary decision-making? Mending Broken Fences Policing provides police and security services leaders with a model based on social cohesion that incorporates intelligence-led and community-based policing (IP-CP), supplemented by a quality/quantity/crime (QQC) framework, to provide a four-step process for the articulable application of police intervention, performance measurement, and the application of discretion.

Keywords: social cohesion, quantitative performance measurement, qualitative performance measurement, sustainable leadership

Procedia PDF Downloads 272