Search results for: explanations for the probable causes of the errors

684 Real-Time Recognition of the Terrain Configuration to Improve Driving Stability for Unmanned Robots

Authors: Bongsoo Jeon, Jayoung Kim, Jihong Lee

Abstract:

Methods for measuring or estimating ground shape with a laser range finder or a vision sensor (exteroceptive sensors) have a critical weakness: they require a prior database in order to classify the acquired data as a unique surface condition for driving. In addition, ground information from exteroceptive sensors does not reflect the deflection of the ground surface caused by the movement of unmanned ground vehicles (UGVs). Therefore, this paper proposes a method for recognizing exact and precise ground shape using an Inertial Measurement Unit (IMU) as a proprioceptive sensor. The method first recognizes the attitude of a robot in real time using the IMU and then compensates for angle errors in the attitude data through an analysis of vehicle dynamics. The method is verified by outdoor driving experiments with a real mobile robot.
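
As a hedged illustration of the real-time attitude step described above, the sketch below fuses gyroscope and accelerometer readings with a one-axis complementary filter; the filter choice, axis convention, and blending gain are assumptions for illustration, since the abstract does not specify the estimator or the vehicle-dynamics compensation.

```python
import numpy as np

def complementary_pitch(prev_pitch, gyro_rate, accel, dt, alpha=0.98):
    """One-axis attitude update (radians); alpha is an assumed blending gain."""
    # Propagate the previous estimate by integrating the gyroscope rate.
    gyro_pitch = prev_pitch + gyro_rate * dt
    # Pitch implied by the gravity vector seen by the accelerometer (ax, ay, az).
    accel_pitch = np.arctan2(-accel[0], np.hypot(accel[1], accel[2]))
    # Trust the gyro at high frequency and the accelerometer at low frequency.
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Example: 100 Hz IMU samples drive the estimate in real time.
pitch = 0.0
for gyro_rate, accel in [(0.01, np.array([0.0, 0.0, 9.81]))] * 100:
    pitch = complementary_pitch(pitch, gyro_rate, accel, dt=0.01)
```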

Keywords: inertial measurement unit, laser range finder, real-time recognition of the ground shape, proprioceptive sensor

Procedia PDF Downloads 288
683 Profit-Based Artificial Neural Network (ANN) Trained by Migrating Birds Optimization: A Case Study in Credit Card Fraud Detection

Authors: Ashkan Zakaryazad, Ekrem Duman

Abstract:

A typical classification technique ranks the instances in a data set according to their likelihood of belonging to one (positive) class; a credit card (CC) fraud detection model ranks transactions by their probability of being fraudulent. This approach is often criticized, however, because firms care not about the probability of fraud but about the profitability or cost of detecting a fraudulent transaction. The key contribution of this study is to focus on profit maximization in the model-building step: the artificial neural network proposed here is trained to maximize profit instead of minimizing prediction error. Moreover, some studies have shown that the back-propagation algorithm, like other gradient-based algorithms, usually gets trapped in local optima, and that swarm-based algorithms are more successful in this respect. We therefore train our profit-maximization ANN using Migrating Birds Optimization (MBO), which was recently introduced to the literature.
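
The following is a minimal sketch of the core idea, scoring candidate networks by profit rather than by the sum of squared errors. The single-layer model, the profit definition with a fixed review cost, and the random search standing in for MBO's swarm moves are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, X):
    """Single-layer sigmoid 'network' producing fraud probabilities."""
    return 1.0 / (1.0 + np.exp(-X @ w))

def profit(w, X, y, amounts, review_cost=5.0):
    """Catching a fraud recovers its amount; every review costs a fixed fee."""
    flagged = forward(w, X) > 0.5
    return np.sum(np.where(flagged, y * amounts - review_cost, 0.0))

# Synthetic transactions: features, fraud labels, and transaction amounts.
X = rng.normal(size=(200, 4))
y = rng.integers(0, 2, size=200)
amounts = rng.uniform(10, 500, size=200)

# Random-search stand-in for MBO: keep the most *profitable* weight vector,
# not the one with the smallest SSE.
best_w = max((rng.normal(size=4) for _ in range(300)),
             key=lambda w: profit(w, X, y, amounts))
```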

Keywords: neural network, profit-based neural network, sum of squared errors (SSE), MBO, gradient descent

Procedia PDF Downloads 475
682 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning

Authors: Arun Sanjel, Greg Speegle

Abstract:

Every second, massive amounts of data are generated, and Data Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing them. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can be time-consuming and prone to errors; automated conversion of a sequential program to a DISC program would therefore significantly improve productivity. However, synthesizing a user’s intended program from an input specification is a complex problem with several important applications, such as distributed program synthesis and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique that identifies sequential components and translates them into equivalent distributed operations, using reinforcement learning and unit testing as feedback mechanisms.
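
A sketch of the unit-testing feedback signal such an agent might use is shown below; treating a pass/fail test run as a scalar reward is our reading of the abstract, and the sandboxing, file layout, and reward shaping are assumptions.

```python
import subprocess
import tempfile

def unit_test_reward(candidate_code: str, test_code: str) -> float:
    """Reward for a candidate sequential-to-distributed rewrite:
    +1.0 if the translated fragment passes its unit tests, -1.0 otherwise."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(candidate_code + "\n\n" + test_code)
        path = f.name
    # Run the tests in a subprocess; a zero exit code means all asserts passed.
    result = subprocess.run(["python", path], capture_output=True, timeout=30)
    return 1.0 if result.returncode == 0 else -1.0

# Example: reward a (stand-in) rewrite of a sequential sum.
candidate = "def total(xs):\n    return sum(map(lambda x: x, xs))"
tests = "assert total([1, 2, 3]) == 6"
print(unit_test_reward(candidate, tests))   # -> 1.0
```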

Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC

Procedia PDF Downloads 110
681 Fluid Structure Interaction of Flow and Heat Transfer around a Microcantilever

Authors: Khalil Khanafer

Abstract:

This study analyzes the effect of flow conditions and of the geometric variation of the microcantilever’s bluff body on the microcantilever’s detection capabilities within a fluidic device, using a finite element fluid-structure interaction model. The parameters considered include inlet velocity, flow direction, and the height of the microcantilever’s supporting system within the fluidic cell. The transport equations are solved using a finite element formulation based on the Galerkin method of weighted residuals. For a flexible microcantilever, a fully coupled fluid-structure interaction (FSI) analysis is utilized, and the fluid domain is described by an Arbitrary Lagrangian–Eulerian (ALE) formulation fully coupled to the structure domain. The results show a profound effect of the magnitude and direction of the inlet velocity and of the height of the bluff body on the deflection of the microcantilever. The vibration characteristics were also investigated. This work paves the way for researchers to design efficient microcantilevers that exhibit minimal measurement error.
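
For readers unfamiliar with the discretization named above, the Galerkin method of weighted residuals constrains the residual of the governing equation to be orthogonal to each basis function. A generic statement in our notation, not the paper's specific transport equations, is:

```latex
% Approximate u by a basis expansion and force the residual R = L(u_h) - f
% to be orthogonal to every basis function used as a weight:
u_h = \sum_{j=1}^{N} c_j \, \phi_j , \qquad
\int_{\Omega} \Big( \mathcal{L}(u_h) - f \Big)\, \phi_i \, d\Omega = 0 ,
\quad i = 1, \dots, N .
```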

Keywords: fluidic cell, FSI, microcantilever, flow direction

Procedia PDF Downloads 374
680 The Effect of Traffic Load on the Maximum Response of a Cable-Stayed Bridge under Blast Loads

Authors: S. K. Hashemi, M. A. Bradford, H. R. Valipour

Abstract:

The recent collapse of bridges has raised awareness about the safety and robustness of bridges subjected to extreme loading scenarios such as intentional or unintentional blast loads. The air blast generated by the explosion of bombs or fuel tankers produces high-magnitude, short-duration loading that can cause severe structural damage and the loss of critical structural members. Hence, more attention needs to be paid to bridge structures in order to develop guidelines for increasing their resistance against probable blasts. Recent advancements in numerical methods provide viable and cost-effective tools for simulating complicated blast scenarios, offering a useful reference for the safeguarding design of critical infrastructure. In previous studies of bridge response to blast load, traffic load is sometimes not included in the analysis. Including traffic load increases the axial compression in bridge piers, especially when the axial load is relatively small, and can reduce the uplift of girders and deck when the bridge experiences an under-deck explosion. For more complicated structures such as cable-stayed or suspension bridges, however, the effect of traffic loads can be completely different: the tension in the cables increases, and progressive collapse is more likely when traffic loads are present. Accordingly, this study simulates the effect of traffic load cases on the maximum local and global response of an entire cable-stayed bridge subjected to blast loading, using the LS-DYNA explicit finite element code. The blast loads range from small to large explosions placed at different positions above the deck. Furthermore, the variation of the traffic load factor in the load combination and its effect on the dynamic response of the bridge under blast load is investigated.

Keywords: blast, cable-stayed bridge, LS-DYNA, numerical, traffic load

Procedia PDF Downloads 333
679 Aboriginal Head and Neck Cancer Patients Have Different Patterns of Metastatic Involvement, and Have More Advanced Disease at Diagnosis

Authors: Kim Kennedy, Daren Gibson, Stephanie Flukes, Chandra Diwakarla, Lisa Spalding, Leanne Pilkington, Andrew Redfern

Abstract:

Introduction: The mortality gap in Aboriginal head and neck cancer (HNC) is well known, but the reasons for poorer survival are not well established. Aim: We aimed to evaluate locoregional and metastatic involvement, and stage at diagnosis, in Aboriginal compared with non-Aboriginal patients. Methods: We performed a retrospective cohort analysis of 320 HNC patients from a single centre in Western Australia, identifying 80 Aboriginal patients and 240 non-Aboriginal patients matched on a 1:3 ratio by site, histology, rurality, and age. We collected data on patient characteristics, tumour features, regions involved, stage at diagnosis, treatment history, and survival and relapse patterns, including sites of metastatic and locoregional involvement. Results: Aboriginal patients had a significantly higher incidence of lung metastases (26.3% versus 13.7%, p=0.009). They also had a numerically higher, though not statistically significant, incidence of thoracic nodal involvement (10% vs 5.8%), malignant pleural effusions (3.8% vs 2.5%), and adrenal and bony involvement. Interestingly, non-Aboriginal patients had a higher rate of cutaneous (2.1% vs 0%) and liver metastases (4.6% vs 2.5%) than Aboriginal patients. In terms of locoregional involvement, Aboriginal patients were more than twice as likely to have contralateral neck involvement (58.8% vs 24.2%, p<0.00001) and 30% more likely to have ipsilateral neck lymph node involvement (78.8% vs 60%, p=0.002) than non-Aboriginal patients. Aboriginal patients had significantly more advanced disease at diagnosis (p=0.008): they were less likely to present with stage I (7.5% vs 22.5%), stage II (11.3% vs 13.8%), or stage III disease (13.8% vs 17.1%), and more likely to present with more advanced stage IVA (42.5% vs 34.6%), stage IVB (15% vs 7.1%), or stage IVC (10% vs 5%) disease. The number of regions of disease involvement was higher in Aboriginal patients (median 3, mean 3.64, range 1-10) than in non-Aboriginal patients (median 2, mean 2.80, range 1-12). Conclusion: Aboriginal patients had a significantly higher incidence of lung metastases, significantly more frequent involvement of ipsilateral and contralateral neck lymph nodes, and significantly more advanced disease at presentation, with a higher stage at diagnosis. We are performing further analyses to investigate explanations for these findings.

Keywords: head and neck cancer, Aboriginal, metastases, locoregional, pattern of relapse, sites of disease

Procedia PDF Downloads 70
678 Poverty Alleviation and Agricultural Management Policies in Nasarawa State of Nigeria: Lessons from the Roots and Tuber Crops Expansion for Increased Food Production (1996-2011)

Authors: Yahaya Abdullahi Adadu, Canice Erunke Esidene

Abstract:

The problems of socio-economic development have been a major challenge bedeviling the Nigerian post-colonial state since its political independence from Britain on October 1, 1960. Critics have argued that the dilemma of Nigeria’s economic survival began in the early 1970s, when the agricultural sector, supposedly the economic mainstay, was effectively supplanted by petro-dollar gains from foreign exchange earnings. Agriculture, which used to be a major driver of human and national upliftment in Nigeria, has therefore been given a back seat, while oil and gas have taken over the front burner in virtually every aspect of Nigeria’s national life. This study is an exposition of the efforts of the Nasarawa state government to reverse the damage that over-reliance on oil wealth has caused to individuals and groups, in terms of prevailing levels of poverty and other attendant vices. The study focuses on the management policies of the various regimes in the state since its inception in 1996, with particular reference to regime type, military and civilian alike, in propelling the policy change needed to transform the economy in line with international best practices. Particular emphasis is paid to the BADA-KOSHI agricultural scheme, whose aim was to recover the lost glory of rural agriculture through a series of root and tuber expansion programmes, particularly for crops such as yam minisetts, cassava, sweet potatoes, and cocoyam. The paper covers the period between 1996 and 2011, a period considered critical in the agricultural revolution of the state. The study adopts a theoretical approach via secondary methods of analysis to explain the issues under consideration, and sums up with policy recommendations and a conclusion.

Keywords: poverty, agriculture, Badakoshi, rural policy management

Procedia PDF Downloads 446
677 Mitigation of Electromagnetic Interference Generated by GPIB Control-Network in AC-DC Transfer Measurement System

Authors: M. M. Hlakola, E. Golovins, D. V. Nicolae

Abstract:

The field of instrumentation electronics is undergoing explosive growth due to its wide range of applications. The proliferation of electrical devices in close working proximity can negatively influence each other’s performance, a degradation caused by electromagnetic interference (EMI). This paper investigates the negative effects of electromagnetic interference originating in the General Purpose Interface Bus (GPIB) control network of an ac-dc transfer measurement system, and explores remedial measures for reducing the measurement errors and failures that EMI causes in a range of industrial devices. The ac-dc transfer measurement system was analyzed for common-mode (CM) EMI effects, with further investigation of the coupling path and more accurate identification of the noise propagation mechanism. To eliminate the common-mode ground loops identified between the GPIB system control circuit and the measurement circuit, a microcontroller-driven GPIB switching isolator was designed, prototyped, programmed, and validated. This mitigation technique effectively reduces EMI.

Keywords: CM, EMI, GPIB, ground loops

Procedia PDF Downloads 289
676 Examining How Employee Training and Development Contribute to the Favourable Results of a Business Entity: A Conceptual Analysis

Authors: Paul Saah, Charles Mbohwa, Nelson Sizwe Madonsela

Abstract:

Organisations that want a competitive edge over their industry rivals are becoming increasingly aware of the value of staff training and development programs. The primary goal of this conceptual study is to determine how staff training and development affect an organization’s ability to succeed. A non-empirical methodological approach was chosen, and a thorough literature analysis was conducted to determine the contribution of staff training and development to the performance of a commercial organization. Twenty of the 100 publications on employee training and development obtained from Google Scholar, those regarded as most pertinent, were examined for this study. According to the findings, the major advantages of staff training and development include greater productivity, the discovery of employee potential, job satisfaction, skills development, less supervision, a decrease in turnover and absenteeism, and a reduction in errors and accidents. The findings show that organisations that invest significantly in the training and development of their personnel are more likely to succeed than those that do not.

Keywords: impact, employment, training and development, success, business, organization

Procedia PDF Downloads 71
675 Spatial Integrity of Seismic Data for Oil and Gas Exploration

Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof

Abstract:

Seismic data is the fundamental tool used by exploration companies to identify potential hydrocarbons. However, the value of seismic trace data is undermined unless the geospatial component of the data is understood. Deriving a proposed well location from data with positional ambiguity jeopardizes business decisions and millions of dollars of investment, a risk every oil and gas company would like to avoid. A spatial integrity QC workflow has been introduced in PETRONAS to ensure that positional errors within seismic data are recognized throughout the exploration lifecycle, from acquisition and processing to seismic interpretation. This includes, amongst other tests, verifying that the data is referenced to the appropriate coordinate reference system, validating the survey configuration, and verifying geometry loading. The direct outcome of the workflow implementation is improved reliability and integrity of the subsurface geological models produced by geoscientists, as well as important input to potential hazard assessments where positional accuracy is crucial. The workflow’s development is part of a bigger geospatial integrity management effort, motivated by the fact that nearly eighty percent of oil and gas data are location-dependent.
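
As a small illustration of the first test named above, the sketch below compares a dataset's declared coordinate reference system against the project's expected one using pyproj; the EPSG codes and the idea of reading the declared code from load metadata are assumptions, not PETRONAS's actual workflow.

```python
from pyproj import CRS

def crs_matches(expected_epsg: int, declared_epsg: int) -> bool:
    """Spatial-integrity check: the CRS declared in the seismic load metadata
    must match the CRS the project expects, else flag positional ambiguity."""
    return CRS.from_epsg(expected_epsg) == CRS.from_epsg(declared_epsg)

# Example: a survey expected in WGS 84 / UTM zone 48N (EPSG:32648).
if not crs_matches(32648, 4326):
    print("QC FAIL: data referenced to a different CRS; re-project before use")
```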

Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow

Procedia PDF Downloads 227
674 Capacities of Early Childhood Education Professionals for the Prevention of Social Exclusion of Children

Authors: Dejana Bouillet, Vlatka Domović

Abstract:

Both policymakers and researchers recognize that participating in early childhood education and care (ECEC) is useful for all children, especially those exposed to a high risk of social exclusion. Social exclusion of children is understood as a multidimensional construct including economic, social, cultural, health, and other aspects of disadvantage and deprivation which, individually or combined, can have an unfavorable effect on a child’s current life and development, as well as on their life chances in adulthood. ECEC institutions should be able to promote educational approaches that reflect the developmental, cultural, language, and other diversity amongst children. However, little is known about the ways in which Croatian ECEC institutions recognize and respect the diversity of children and their families and how they respond to their educational needs. This paper is therefore dedicated to analyzing the capacities of ECEC professionals to respond to the educational needs of this very diverse group of children and their families. The results obtained in the framework of the project “Models of response to educational needs of children at risk of social exclusion in ECEC institutions,” funded by the Croatian Science Foundation, will be presented. The research methodology arises from explanations of educational processes and of the risk of social exclusion as a complex and heterogeneous phenomenon. Preliminary results of a qualitative analysis of educational practices, regarding capacities to identify and appropriately respond to the requirements of children at risk of social exclusion, will be presented. The data have been collected by interviewing educational staff in 10 Croatian ECEC institutions (n = 10); the interview questions related to various aspects of inclusive institutional policy, culture, and practices. The analysis suggests that Croatian ECEC professionals still face great challenges in implementing inclusive policies, culture, and practices. This conclusion has several bases: the interviewed professionals are not familiar enough with the complexity and diversity of the needs of children at risk of social exclusion, and ECEC institutions do not have enough resources to provide all the interventions that these children and their families need.

Keywords: children at risk of social exclusion, ECEC professionals, inclusive policies, culture and practices, qualitative analysis

Procedia PDF Downloads 115
673 The Gap between Elite Catholic Education and Inclusive Education

Authors: Viktorija Voidogaitė

Abstract:

Catholic education is based on the belief that every human being is created in the image and likeness of God, and it is shaped by the idea that the Kingdom of Heaven belongs to the humble and vulnerable. These principles emphasize the importance of serving the most vulnerable members of the Church community and promoting inclusivity without discrimination, protecting the weakest members with compassion. Realizing such an ideal in practice proves challenging, however, as the shortcomings and errors prevalent in any society often stem from the actions of the Christians within it. The evolution of these tensions can be observed throughout the historical development of Catholic education. In some European countries, Catholic education has become elitist, with limited room for inclusivity, creating a conspicuous gap between the principles of the Evangelical community and elite Catholic schools and gymnasiums. Some schools appear inclined to educate only those students who best align with their profile, leaving those needing assistance on the margins. As we advance into the third decade of the 21st century, a fundamental question emerges: whether education is forming individuals who can assist the underprivileged and the infirm, and whether those individuals will possess the willingness and capability to construct a community or society that is inclusive and accessible to all.

Keywords: inclusion, Catholic education, inclusive education, becoming

Procedia PDF Downloads 65
672 Historical Analysis of the Evolution of Swiss Identity and the Successful Integration of Multilingualism into the Swiss Concept of Nationhood

Authors: James Beringer

Abstract:

Switzerland’s ability to forge a strong national identity across linguistic barriers has long been of interest to nationalism scholars. This raises the question of how it was achieved, given that traditional explanations of luck or exceptionalism appear highly reductionist. This paper evaluates the theory that successful Swiss management of linguistic diversity stems from the strong integration of multilingualism into Swiss national identity. Using archival analysis of Swiss government records, historical accounts of prominent Swiss citizens, and secondary literature concerning the fundamental aspects of Swiss national identity, this paper charts the historical evolution of Swiss national identity and explains how multilingualism was deliberately and successfully integrated into it in response to political fragmentation along linguistic lines during the First World War. Its primary conclusions are the following. Firstly, the earliest foundations of Swiss national identity were purposefully removed from any association with a single national language, producing symbols, myths, and values, such as a strong commitment to communalism, the imagery of the Swiss natural landscape, and the use of Latin expressions, that could be adopted across Swiss linguistic groups. Secondly, the First World War triggered a turning point in the evolution of Swiss national identity: the existing foundations proved insufficient to prevent political fractures along linguistic lines, as each Swiss linguistic group gravitated towards its linguistic neighbours within Europe. To avoid a repeat of such fragmentation, a deliberate effort was made to fully integrate multilingualism as a fundamental aspect of Swiss national identity. Existing national symbols, such as the St Gotthard Mountains, were recontextualized to become associated with multilingualism, and the education system was similarly reformed to reflect the uniquely multilingual nature of the Swiss nation. The successful result of this process can be readily observed in polls and surveys, with large segments of the Swiss population highlighting multilingualism as a uniquely Swiss characteristic, indicating the symbiotic connection between multilingualism and the Swiss nation.

Keywords: language's role in identity formation, multilingualism in nationalism, national identity formation, Swiss national identity history

Procedia PDF Downloads 191
671 Numerical Method of Heat Transfer in Fin Profiles

Authors: Beghdadi Lotfi, Belkacem Abdellah

Abstract:

In this work, a numerical method is proposed to solve thermal performance problems of heat transfer in fin surfaces. The two-dimensional temperature distribution over the longitudinal section of the fin is calculated by resorting to the finite volume method. The heat flux dissipated by a fin of generic profile is compared with the heat flux removed by a rectangular-profile fin of the same length and volume. A finite volume method for quadrilateral unstructured meshes is developed to predict the two-dimensional steady-state solutions of the conduction equation, in order to determine the sinusoidal parameter values that optimize the fin effectiveness. In this scheme, based on integration around the polygonal control volume, the derivatives in the conduction equation are converted into closed line integrals using a formulation analogous to Stokes’ theorem. The numerical results show good agreement with analytical results. To demonstrate the accuracy of the method, the absolute and root-mean-square errors are examined quantitatively as functions of the grid size.
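
The conversion step described above can be written compactly; a sketch of the identity for the steady conduction equation over one polygonal control volume V with boundary ∂V (our notation, following the abstract's description) is:

```latex
% Integrate \nabla\cdot(k\nabla T) = 0 over the control volume and apply the
% divergence (Green) theorem to obtain a closed line integral of the flux:
\int_{V} \nabla \cdot \big( k \nabla T \big)\, dA
  = \oint_{\partial V} k \, \nabla T \cdot \mathbf{n} \, ds = 0 .
```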

Keywords: Stokes theorem, unstructured grid, heat transfer, complex geometry

Procedia PDF Downloads 406
670 Dual-Channel Reliable Breast Ultrasound Image Classification Based on Explainable Attribution and Uncertainty Quantification

Authors: Haonan Hu, Shuge Lei, Dasheng Sun, Huabin Zhang, Kehong Yuan, Jian Dai, Jijun Tang

Abstract:

This paper focuses on the classification of breast ultrasound images and on measuring the reliability of the classification results. A dual-channel evaluation framework was developed based on the proposed inference reliability and predictive reliability scores. For the inference reliability evaluation, human-aligned, doctor-agreed inference rationales based on the improved feature attribution algorithm SP-RISA are applied. Uncertainty quantification is used to evaluate the predictive reliability via test-time enhancement. The effectiveness of this reliability evaluation framework has been verified on the YBUS breast ultrasound clinical dataset, and its robustness has been verified on the public BUSI dataset. The expected calibration errors on both datasets are significantly lower than those of traditional evaluation methods, which demonstrates the effectiveness of the proposed reliability measurement.
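
Since the comparison above rests on expected calibration error, a minimal sketch of the standard ECE computation follows; the bin count and equal-width binning are conventional choices, not details taken from the paper.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE: occupancy-weighted average |accuracy - mean confidence| over
    equal-width confidence bins (standard definition)."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            acc = correct[in_bin].mean()          # accuracy inside the bin
            conf = confidences[in_bin].mean()     # mean confidence inside it
            ece += in_bin.mean() * abs(acc - conf)
    return ece

# Example: well-calibrated predictions give a small ECE.
rng = np.random.default_rng(0)
p = rng.uniform(0.5, 1.0, 1000)
y = (rng.uniform(size=1000) < p).astype(float)    # correctness ~ confidence
print(expected_calibration_error(p, y))
```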

Keywords: medical imaging, ultrasound imaging, XAI, uncertainty measurement, trustworthy AI

Procedia PDF Downloads 103
669 Velocity Logs Error Reduction for In-Service Calibration of Vessel Performance Indicators

Authors: Maria Tsompanoglou, Dimitris Armenis

Abstract:

Vessel behavior in different operational and weather conditions constitutes the main area of interest for the ship operator. Ship speed and fuel consumption are the most decisive parameters in this respect, as their correlation provides information about the economic and environmental efficiency of the vessel, forming the basis of decision making in terms of maintenance and trading. In analyses of a vessel’s operational profile for the evaluation of fuel consumption and the equivalent CO2 emissions footprint, Speed Through Water indications are widely used. Data on seasonal and regional variations in seawater characteristics, which are available nowadays, can provide the basis for accurately estimating the errors in Speed Through Water indications at any time. Accurate speed values on a route basis enable the operator to assess the ship’s fuel and propulsion efficiency and proceed with improvements. This paper discusses case studies in which the actual vessel speed was corrected by a post-processing algorithm. The effects of the speed correction on standard Key Performance Indicators, as well as operational findings not identified earlier, are also discussed.

Keywords: data analytics, MATLAB, vessel performance monitoring, speed through water

Procedia PDF Downloads 302
668 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion

Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro

Abstract:

In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual effort, and difficulty meeting tight deadlines. Implementing CI/CD with standard serverless technologies on cloud services overcomes these issues and helps organizations improve efficiency and deliver faster, without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while repetitive work and manual effort are reduced. Scalable CI/CD can be implemented with cloud services such as ECS (Elastic Container Service), AWS Fargate, ECR (to store Docker images with all dependencies), serverless computing (serverless virtual machines), cloud logging (for monitoring errors and logs), security groups (for controlling inside/outside access to the application), Docker containerization (Docker-based images and container techniques), Jenkins (a CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit); pipelines built this way efficiently handle the demands of diverse development environments and accommodate dynamic workloads, enabling faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application per branch, testing it with a scalable automation testing framework, and deploying the builds; developers can focus more on writing code and less on managing infrastructure, as the pipeline scales with demand. Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently; it also adjusts the application’s scale according to usage, alleviating concerns about scalability, maintenance costs, and resource needs, and it optimizes costs by paying only for the resources actually used while increasing reliability. Creating scalable automation testing with cloud services (ECR, ECS Fargate, Docker, EFS, serverless computing) helps organizations run more than 500 test cases in parallel, aiding the detection of race conditions and performance issues and reducing execution time. Finally, scalable CI/CD pipelines employ automated testing and validation to detect and prevent errors early in the development cycle.
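
As a local, hedged stand-in for the parallel test execution described above (which the abstract runs in cloud containers), the sketch below fans test cases out over a worker pool; the pytest invocation, file layout, and worker count are assumptions.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_case(test_id: str):
    """Run one test file; in the cloud setup each of these would instead be
    dispatched as a containerized task (e.g. one Fargate task per case)."""
    proc = subprocess.run(["pytest", f"tests/{test_id}.py", "-q"],
                          capture_output=True)
    return test_id, proc.returncode

test_ids = [f"case_{i:03d}" for i in range(500)]   # 500 cases in parallel
with ThreadPoolExecutor(max_workers=32) as pool:
    results = dict(pool.map(run_case, test_ids))

failed = [t for t, rc in results.items() if rc != 0]
print(f"{len(failed)} failing cases")
```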

Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment

Procedia PDF Downloads 48
667 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process

Authors: Hong-Ming Chen

Abstract:

This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on time t: the Brownian motion and jump uncertainties are represented by a piecewise constant function w(t) and a point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White model with jumps can be obtained by solving a nonlinear semi-infinite programming problem, and a relaxed cutting plane algorithm is proposed for solving the resulting optimization problem. The method is calibrated on 3-month U.S. Treasury securities data and used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that the approach generates yield functions with minimal fitting errors and small oscillation.
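
For orientation, the textbook one-factor Hull-White short-rate dynamics extended with a jump term read as below; the paper then replaces the stochastic drivers with the deterministic perturbations w(t) and θ(t) described above, so this display is context, not the paper's exact formulation:

```latex
% Hull--White dynamics with mean-reversion speed a, volatility \sigma,
% time-dependent drift \theta(t), and an added jump process J:
dr(t) = \big( \theta(t) - a\, r(t) \big)\, dt + \sigma\, dW(t) + dJ(t) .
```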

Keywords: optimization, interest rate model, jump process, deterministic

Procedia PDF Downloads 161
666 Prediction of Positive Cloud-to-Ground Lightning Striking Zones for Charged Thundercloud Based on Line Charge Model

Authors: Surajit Das Barman, Rakibuzzaman Shah, Apurv Kumar

Abstract:

Bushfires are one of the dominant factors in the creation of pyrocumulus thunderclouds, which ignite new fires through pyrocumulonimbus (pyroCb) lightning strikes and cause major losses of life and property worldwide. Conceptual model-based risk planning would be beneficial for predicting lightning striking zones on the surface of the earth underneath a pyroCb thundercloud. A pyroCb thundercloud can generate both positive cloud-to-ground (+CG) and negative cloud-to-ground (-CG) lightning, of which +CG tends to ignite more bushfires and cause massive damage to nature and infrastructure. In this paper, a simple line-charge thundercloud model is constructed in 2-D coordinates using the method of image charges to predict probable +CG lightning striking zones on the earth’s surface for two conceptual thundercloud charge configurations: a tilted dipole and a conventional tripole structure with an excessive lower positive charge region, both of which lead to +CG lightning. The electric potential and surface charge density along the earth’s surface are investigated for both structures by continuously adjusting the position and charge density of their charge regions. Simulation results for the tilted dipole structure confirm the down-shear extension of the upper positive charge region in the direction of the cloud’s forward flank by 4 to 8 km, resulting in negative surface charge density; +CG lightning would be expected to strike within 7.8 km to 20 km of the cloud periphery in the direction of the forward flank. The conceptual tripole charge structure with an enhanced lower positive charge region, on the other hand, develops negative surface charge density on the earth’s surface in the range |x| < 6.5 km beneath the thundercloud and highly favors +CG lightning strikes.
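
A hedged numerical sketch of the image-charge computation described above follows: for a 2-D line charge of density λ at height h over a grounded plane, the induced surface charge density is σ(x) = -λh / (π((x-x₀)² + h²)), and regions of negative σ mark where +CG strikes would be favored. The charge heights, offsets, and magnitudes below are illustrative, not the paper's values.

```python
import numpy as np

def surface_charge_density(x, line_charges):
    """Induced density on the grounded plane z = 0 beneath 2-D line charges,
    via the method of images: sigma = -lam*h / (pi*((x - x0)**2 + h**2))."""
    sigma = np.zeros_like(x, dtype=float)
    for x0, h, lam in line_charges:
        sigma += -lam * h / (np.pi * ((x - x0) ** 2 + h ** 2))
    return sigma

# Tilted-dipole-like cloud: mid-level negative charge, down-shear upper positive.
x = np.linspace(-20e3, 20e3, 801)                      # ground positions (m)
sigma = surface_charge_density(x, [(0.0, 7e3, -1.0),   # (x0, height, charge)
                                   (6e3, 10e3, +1.0)])
plus_cg_zone = x[sigma < 0]   # negative surface density favors +CG strikes
```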

Keywords: pyrocumulonimbus, cloud-to-ground lightning, charge structure, surface charge density, forward flank

Procedia PDF Downloads 113
665 Fault-Detection and Self-Stabilization Protocol for Wireless Sensor Networks

Authors: Ather Saeed, Arif Khan, Jeffrey Gosper

Abstract:

Sensor devices are prone to errors and sudden node failures, which are difficult to detect in a timely manner when the devices are deployed in real-time, hazardous, large-scale harsh environments or in medical emergencies. The loss of data can be life-threatening when the sensed phenomenon is not disseminated due to sudden node failure, battery depletion, or temporary malfunction. We introduce a set of partial differential equations for localizing faults, similar to Green’s and Maxwell’s equations used in electrostatics and electromagnetism, together with a node organization and clustering scheme for self-stabilizing sensor networks. Green’s theorem is applied to regions where the curve is closed and continuously differentiable to ensure network connectivity. Experimental results show that the proposed GTFD (Green’s Theorem fault-detection and self-stabilization) protocol not only detects faulty nodes but also accurately generates network stability graphs showing where urgent intervention is required to dynamically self-stabilize the network.

Keywords: Green’s Theorem, self-stabilization, fault-localization, RSSI, WSN, clustering

Procedia PDF Downloads 77
664 Incomplete Existing Algebra to Support Mathematical Computations

Authors: Ranjit Biswas

Abstract:

The existing subject of Algebra is argued to be incomplete as a support for the mathematical computations done by scientists of all areas: mathematics, physics, statistics, chemistry, space science, cosmology, etc., even starting from the era of Einstein. A huge hidden gap in the subject of Algebra is unearthed. All scientists to date, including mathematicians, physicists, chemists, statisticians, cosmologists, space scientists, and economists, have been fortunate to obtain results without facing contradictions or computational errors, and most of those results, including those of Nobel Prize winners, were also verified experimentally. In this paper, it is argued rigorously that they have all been lucky. An algebraist can define an infinite number of new algebraic structures; the objective of this work is not merely to define yet another distinct algebraic structure, but to recognize and identify a major gap of the subject that has so far lain hidden in its vast literature, and to fix it. Consequently, a different algebraic structure called a ‘Region’ is introduced, and its properties are studied.

Keywords: region, ROR, RORR, region algebra

Procedia PDF Downloads 54
663 A Peg Board with Photo-Reflectors to Detect Peg Insertion and Pull-Out Moments

Authors: Hiroshi Kinoshita, Yasuto Nakanishi, Ryuhei Okuno, Toshio Higashi

Abstract:

Various kinds of pegboards have been developed and are widely used in rehabilitation research and clinical practice for the evaluation and training of patients’ hand function. The common measure on these pegboards is the total execution time, assessed with a tester’s stopwatch; the introduction of electrical, automatic measurement technology to the apparatus has lagged behind. The present work introduces a pegboard with electric sensors that detect the moments of each peg’s insertion and removal, together with baseline data from a group of healthy young individuals who performed peg transfer tasks using the new board. Through trial and error in pilot tests, two 10-hole pegboard boxes were designed and built by the present authors, with a small photo-reflector and a DC amplifier installed at the bottom of each hole. The amplified analogue signals from the 20 reflectors were digitized at 500 Hz per channel and stored on a PC. The boxes were set on a test table at different separations (25, 50, 75, and 125 mm) to examine the effect of hole-to-hole distance. Fifty healthy young volunteers (25 of each gender) performed 80 successive fast peg transfers at each distance using their dominant and non-dominant hands. The recorded data showed clear-cut light interruption/continuation moments produced by the pegs, allowing the pull-out and insertion times of each peg to be determined accurately (without tester error) and precisely (on the order of milliseconds). This in turn permitted computation of each peg movement duration (PMD: from peg lift-off to insertion) separately from the hand reaching duration (HRD: from peg insertion to lift-off). An accidental drop of a peg led to an exceptionally long (> mean + 3 SD) PMD, which was readily detected from an examination of the data distribution. The PMD data were commonly right-skewed, suggesting that the median is a better estimate of individual PMD than the mean. Repeated-measures ANOVA using the median values revealed significant hole-to-hole distance and hand dominance effects, suggesting that these need to be fixed for accurate evaluation of PMD; the gender effect was non-significant. Performance consistency, evaluated using quartile variation coefficient values, revealed no gender, hole-to-hole distance, or hand dominance effects. Measurement reliability was further examined using intraclass correlations obtained from 14 subjects who performed the 25 and 125 mm hole-distance tasks in two test sessions 7-10 days apart; the correlations between the two tests showed fair reliability for PMD (0.65-0.75) and for HRD (0.77-0.94). We conclude that the sensor pegboard developed in the present study provides accurate (tester-error-free) and precise (millisecond-rate) timing of peg movement separated from hand movement, and that it can easily detect and automatically exclude erroneous trials from a participant’s data, leading to better evaluation of hand dexterity than the widely used conventional pegboards.
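
A minimal sketch of how insertion and pull-out moments can be read from one digitized photo-reflector channel is shown below; the mid-level threshold and the mapping of rising/falling edges to removal/insertion depend on sensor polarity and are assumptions, not the authors' processing code.

```python
import numpy as np

def peg_events(v, fs=500.0):
    """Edge detection on one photo-reflector channel sampled at fs Hz.
    Returns times (s) of light interruption and continuation moments."""
    thr = 0.5 * (v.min() + v.max())            # assumed mid-level threshold
    state = (v > thr).astype(int)
    rising = np.flatnonzero(np.diff(state) == 1) / fs
    falling = np.flatnonzero(np.diff(state) == -1) / fs
    return rising, falling

# PMD (lift-off to insertion) then follows by pairing a pull-out event on the
# source hole's channel with the next insertion on the destination channel.
```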

Keywords: hand, dexterity test, peg movement time, performance consistency

Procedia PDF Downloads 134
662 Variations of the Modal Characteristics of the Feeding Stage with Different Preloaded Linear Guide

Authors: Jui-Pui Hung, Yong-Run Chen, Wei-Cheng Shih, Chun-Wei Lin

Abstract:

This study assessed the variations in the modal characteristics of a feeding stage with different linear guide preloads. The dynamic characteristics of the feeding stage were characterized in terms of modal stiffness, modal frequency, and modal damping, assessed from vibration tests. According to the experimental measurements, the actual preload of the linear guide modules was found to deviate from the rated values set in the factory, possibly due to assembly errors of the guide modules. For a stage with linear guides, the dynamic stiffness changes with the preload set on the rolling balls. When the linear guide preload is adjusted from medium to high, the dynamic stiffness at the first and second modes varies by 20.8% and 10.5%, respectively, while the modal damping ratio is reduced by 8.97% and 9.65%, respectively. For the high-frequency mode, the modal stiffness increases by 171.2% and the damping ratio is reduced by 34.4%. These results demonstrate the importance of determining the preload of linear guide modules in practical applications.

Keywords: contact stiffness, feeding stage, linear guides, modal characteristics, pre-load

Procedia PDF Downloads 430
661 Enhance Security in XML Databases: XLog File for Severity-Aware Trust-Based Access Control

Authors: A. Asmawi, L. S. Affendey, N. I. Udzir, R. Mahmod

Abstract:

Enhancing security in XML databases is important, as it includes protecting sensitive data and providing a secure environment for users. In order to improve security and provide dynamic access control for XML databases, we present the XLog file, which calculates user trust values by recording users’ bad transactions, errors, and query severities. Severity-aware trust-based access control for XML databases manages the access policy depending on users’ trust values and prevents unauthorized processes, malicious transactions, and insider threats. Privileges are automatically modified and adjusted over time depending on user behaviour and query severity. Logging in databases is an important process used for recovery and security purposes. In this paper, the XLog file is presented as a dynamic and temporary log file for XML databases to enhance the level of security.
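
A toy sketch of the trust-value mechanism described above follows; the linear penalty, severity scale, and threshold policy are assumptions for illustration rather than the paper's actual formula.

```python
def update_trust(trust, severity, penalty=0.1, floor=0.0):
    """Lower a user's trust in proportion to the severity of a bad
    transaction, error, or risky query recorded in the XLog file."""
    return max(floor, trust - penalty * severity)

def allowed(trust, required_trust=0.5):
    """Severity-aware access control: privileges shrink as trust falls."""
    return trust >= required_trust

trust = 1.0
trust = update_trust(trust, severity=3)   # medium-severity violation -> 0.7
trust = update_trust(trust, severity=4)   # further violation         -> 0.3
print(allowed(trust))                     # False: access now restricted
```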

Keywords: XML database, trust-based access control, severity-aware, trust values, log file

Procedia PDF Downloads 300
660 A Comparison of Bias Among Relaxed Divisor Methods Using 3 Bias Measurements

Authors: Sumachaya Harnsukworapanich, Tetsuo Ichimori

Abstract:

Apportionment methods are used by many countries to calculate the distribution of seats in political bodies; for example, such a method is used in the United States (U.S.) to distribute House seats proportionally based on the population of the electoral district. Famous apportionment methods include the divisor methods known as the Adams, Dean, Hill, Jefferson, and Webster methods. The results of these divisor methods can sometimes be unfair or contain errors, so it is important to examine them with bias measurements in order to obtain precise and fair results. In this research, we investigate the bias of divisor methods in the U.S. House of Representatives toward large and small states by applying the Stolarsky mean method. We compare the bias of the apportionment methods using two well-known bias measurements, the Balinski-Young measurement and the Ernst measurement, both of which have separate formulas for large and small states, together with a third measurement created by the researchers that does not factor large and small states into the formula. All three measurements are compared, and the results show that our measurement produces results similar to the two established measurements.
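
A compact sketch of a generic divisor method follows, where the rounding rule selects the method; the bisection search and the omission of tie handling are simplifications of real apportionment practice.

```python
import math

def divisor_apportion(populations, seats, rnd=lambda q: int(q + 0.5)):
    """Find a divisor d so the rounded quotas p/d sum to the house size.
    rnd = q+0.5 -> Webster; math.floor -> Jefferson; math.ceil -> Adams."""
    lo, hi = 1e-9, float(sum(populations))
    for _ in range(200):                       # bisection on the divisor
        d = 0.5 * (lo + hi)
        alloc = [rnd(p / d) for p in populations]
        if sum(alloc) == seats:
            return alloc
        lo, hi = (d, hi) if sum(alloc) > seats else (lo, d)
    raise ValueError("no exact divisor found; a tie-break rule is needed")

# Example: 10 seats among three districts.
print(divisor_apportion([5300, 2900, 1800], 10))              # Webster-style
print(divisor_apportion([5300, 2900, 1800], 10, math.floor))  # Jefferson
```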

Keywords: apportionment, bias, divisor, fair, measurement

Procedia PDF Downloads 366
659 Using Optimal Control Method to Investigate the Stability and Transparency of a Nonlinear Teleoperation System with Time Varying Delay

Authors: Abasali Amini, Alireza Mirbagheri, Amir Homayoun Jafari

Abstract:

In this paper, a new structure for teleoperation systems with time-varying delay is modeled and proposed. A random time-varying delay of up to 150 msec is simulated in the teleoperation channel, both from master to slave and vice versa. System stability and transparency are investigated by comparing the results of a PID controller and an optimal controller on the master and slave subsystems separately. One controller is designed in the slave subsystem to reduce position errors between master and slave, and another is designed in the master subsystem to establish stability, transparency, and force tracking; the results are then compared. The results showed that the PID controller is adequate for position tracking, but its force response oscillates in contact with the environment, whereas the optimal controller establishes position tracking properly and also achieves appropriate force tracking.

Keywords: optimal control, time varying delay, teleoperation systems, stability and transparency

Procedia PDF Downloads 257
658 SARS-CoV-2 Transmission Risk Factors among Patients from a Metropolitan Community Health Center, Puerto Rico, July 2020 to March 2022

Authors: Juan C. Reyes, Linnette Rodríguez, Héctor Villanueva, Jorge Vázquez, Ivonne Rivera

Abstract:

In July 2020, a private non-profit community health center (HealthProMed) that serves people without a medical insurance plan or with limited resources in one of the most populated areas of San Juan, Puerto Rico, implemented a COVID-19 case investigation and contact-tracing surveillance system. Nursing personnel at the health center completed a computerized case investigation form that was translated, adapted, and modified from CDC’s Person Under Investigation (PUI) form. Between July 13, 2020, and March 17, 2022, a total of 9,233 SARS-CoV-2 tests were conducted at the health center, of which 16.9% were classified as confirmed cases (positive molecular test) and 27.7% as probable cases (positive serologic test). Most of the confirmed cases were females (60.0%), under 20 years old (29.1%), and living in their own homes (59.1%). In the 14 days before the onset of symptoms, 26.3% of confirmed cases reported going to the supermarket, 22.4% had contact with a known COVID-19 case, and 20.7% went to work. The most commonly reported symptoms were sore throat (33.4%), runny nose (33.3%), cough (24.9%), and headache (23.2%). The most common preexisting medical conditions among confirmed cases were hypertension (19.3%), chronic lung disease including asthma, emphysema, and COPD (13.3%), and diabetes mellitus (12.8%). Multiple logistic regression analysis revealed that patients who used alcohol frequently during the last two weeks (OR=1.43; 95%CI: 1.15-1.77), those who were in contact with a positive case (OR=1.58; 95%CI: 1.33-1.88), and those who were obese (OR=1.82; 95%CI: 1.24-2.69) were significantly more likely to be confirmed cases after controlling for sociodemographic variables. Implementing a case investigation and contact-tracing component at community health centers can be of great value in the prevention and control of COVID-19 at the community level and could be used in future outbreaks.
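
The adjusted odds ratios quoted above come from exponentiating the coefficients of a multiple logistic regression; a hedged sketch with synthetic data (which will not reproduce the reported values) is:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
# Binary exposures: frequent alcohol use, contact with a case, obesity.
X = rng.integers(0, 2, size=(n, 3)).astype(float)
y = rng.integers(0, 2, size=n)                 # confirmed-case status (toy)

fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
odds_ratios = np.exp(fit.params[1:])           # adjusted ORs per exposure
conf_int = np.exp(fit.conf_int()[1:])          # 95% CIs, as reported above
print(dict(zip(["alcohol", "contact", "obese"], odds_ratios.round(2))))
```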

Keywords: community health center, Puerto Rico, risk factors, SARS-CoV-2

Procedia PDF Downloads 116
657 An Image Segmentation Algorithm for Gradient Target Based on Mean-Shift and Dictionary Learning

Authors: Yanwen Li, Shuguo Xie

Abstract:

In electromagnetic imaging, because the system is diffraction-limited, pixel values change slowly near the edges of image targets and also vary with location within the same target. Using traditional digital image segmentation methods to segment such electromagnetic gradient images can therefore produce many errors. To address this issue, this paper proposes a novel image segmentation and extraction algorithm based on Mean-Shift and dictionary learning. Firstly, the preliminary segmentation results from an adaptive-bandwidth Mean-Shift algorithm are expanded, merged, and extracted. Then the overlap rate of each extracted image block is checked before determining a segmentation region containing a single complete target. Finally, the gradient edges of the extracted targets are recovered and reconstructed using a dictionary-learning algorithm, yielding final segmentation results that are very close to the gradient targets in the original image. Both the experimental and simulated results show that the segmentation is very accurate, with Dice coefficients improved by 70% to 80% compared with the Mean-Shift-only method.
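
A runnable sketch of the first, Mean-Shift stage on (x, y, intensity) features is shown below, using scikit-learn's bandwidth estimate; the feature scaling and quantile are assumptions, and the dictionary-learning edge-recovery stage is omitted.

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

def mean_shift_presegment(image, intensity_weight=0.5):
    """Preliminary segmentation: cluster pixels in (x, y, intensity) space."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.column_stack([xs.ravel(), ys.ravel(),
                             intensity_weight * image.ravel()])
    bw = estimate_bandwidth(feats, quantile=0.1, n_samples=500)
    labels = MeanShift(bandwidth=bw, bin_seeding=True).fit_predict(feats)
    return labels.reshape(h, w)

# Toy gradient target: a blurred disk whose edge pixels change slowly.
yy, xx = np.mgrid[0:64, 0:64]
img = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / 200.0) * 255
print(np.unique(mean_shift_presegment(img)))
```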

Keywords: gradient image, segmentation and extract, mean-shift algorithm, dictionary iearning

Procedia PDF Downloads 267
656 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study

Authors: K. Adu Michael, K. Alese Boniface

Abstract:

Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. In particular, the ability to develop the comprehensive problem statement needed to execute a proper requirements engineering process is missing. Describing the ‘what’ of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is the process used to discover, analyze, and validate system requirements, and it reduces software errors at the early stages of software development. The importance of each step in requirements engineering is explained in the context of using a detailed problem statement from the client/customer to obtain an overview of an existing system along with expectations of the new system. This paper identifies inadequate requirements engineering practice as the major cause of poor software development in developing nations, using a case study of final-year computer science students at a tertiary-education institution in Nigeria.

Keywords: client/customer, problem statement, requirements engineering, software developers

Procedia PDF Downloads 408
655 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection

Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón

Abstract:

Structural inspection activities are necessary to ensure the correct functioning of infrastructures, and Unmanned Aerial Vehicle (UAV) techniques have become more popular than traditional ones. Specifically, UAV photogrammetry allows time and cost savings, and the development of this technology has permitted the use of low-cost thermal sensors on UAVs. The representation of 3D thermal models with this type of equipment is in continuous evolution, but the direct processing of thermal images usually leads to errors and inaccurate results. A methodology is proposed for the generation of 3D thermal models using dual sensors, which involves the application of visible Red-Green-Blue (RGB) and thermal images in parallel: the RGB images serve as the basis for the generation of the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The mining/industrial facility representations obtained can be used for inspection activities.

Keywords: aerial thermography, data processing, drone, low-cost, point cloud

Procedia PDF Downloads 145