Search results for: the probability integral transform
2795 Modelling and Control of Electrohydraulic System Using Fuzzy Logic Algorithm
Authors: Hajara Abdulkarim Aliyu, Abdulbasid Ismail Isa
Abstract:
This research paper studies an electrohydraulic system for its role in position and motion control and develops a mathematical model describing the behaviour of the system. The research further proposes fuzzy logic and conventional PID controllers in order to achieve both accurate positioning of the payload and overall improvement of system performance. The simulation results show that the fuzzy logic controller has superior tracking performance and higher disturbance-rejection efficiency, with a shorter settling time, less overshoot, and smaller values of the integral of absolute and deviation errors than the conventional PID controller under all testing conditions.
Keywords: electrohydraulic, fuzzy logic, modelling, NZ-PID
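The comparison described above hinges on standard time-domain metrics such as settling time and overshoot. As a rough illustration of how such metrics are computed, the sketch below simulates a discrete PID loop on a hypothetical first-order plant; the plant, gains, and time constant are invented for illustration and are not the paper's electrohydraulic model.

```python
# Minimal discrete PID position loop on a hypothetical first-order plant.

def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.001, steps=5000, tau=0.05):
    """Drive a first-order lag (time constant tau) toward `setpoint`."""
    y, integral, prev_err = 0.0, 0.0, setpoint
    history = []
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        y += dt * (u - y) / tau          # first-order plant dynamics
        prev_err = err
        history.append(y)
    return history

def settling_time(history, setpoint=1.0, tol=0.02, dt=0.001):
    """Last time the response leaves the +/-2% band around the setpoint."""
    for i in reversed(range(len(history))):
        if abs(history[i] - setpoint) > tol * setpoint:
            return (i + 1) * dt
    return 0.0

resp = simulate_pid(kp=2.0, ki=5.0, kd=0.01)
ts = settling_time(resp)
```

A fuzzy controller would replace the fixed-gain law `u = kp*e + ki*I + kd*de` with rule-based, state-dependent gains; the metric functions stay the same.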
Procedia PDF Downloads 469
2794 Derivatives Formulas Involving I-Functions of Two Variables and Generalized M-Series
Authors: Gebreegziabher Hailu Gebrecherkos
Abstract:
This study explores the derivatives of functions defined by I-functions of two variables and their connections to generalized M-series. We begin by defining I-functions, which are generalized functions that encompass various special functions, and analyze their properties. By employing advanced calculus techniques, we derive new formulas for the first and higher-order derivatives of I-functions with respect to their variables, and we establish some derivative formulae of the I-function of two variables involving generalized M-series. The special cases of our derivatives yield interesting results.
Keywords: I-function, Mellin-Barnes contour integral, generalized M-series, higher order derivative
Procedia PDF Downloads 14
2793 A Generalisation of Pearson's Curve System and Explicit Representation of the Associated Density Function
Authors: S. B. Provost, Hossein Zareamoghaddam
Abstract:
A univariate density approximation technique is introduced whereby the derivative of the logarithm of a density function is assumed to be expressible as a rational function. This approach, which extends Pearson’s curve system, is solely based on the moments of a distribution up to a determinable order. Upon solving a system of linear equations, the coefficients of the polynomial ratio can readily be identified. An explicit solution to the integral representation of the resulting density approximant is then obtained. It will be explained that, when utilised in conjunction with sample moments, this methodology lends itself to the modelling of ‘big data’. Applications to sets of univariate and bivariate observations will be presented.
Keywords: density estimation, log-density, moments, Pearson's curve system
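To make the moment-based idea concrete, the sketch below treats the simplest special case of this framework, where d/dx log f(x) is a degree-1 polynomial rather than a general rational function. Integration by parts (assuming the boundary terms vanish) gives the linear system sum_k c_k m_{n+k} = -n m_{n-1} in the raw moments m_j; the moments used here are the exact raw moments of N(2, 1), chosen for illustration.

```python
# Fit d/dx log f(x) = c0 + c1*x from raw moments m = [m0, m1, m2].
# Rows of the linear system (from integration by parts):
#   n = 0:  c0*m0 + c1*m1 = 0
#   n = 1:  c0*m1 + c1*m2 = -m0

def fit_log_density_linear(m):
    """Solve the 2x2 moment system; returns (c0, c1)."""
    a, b, c, d = m[0], m[1], m[1], m[2]
    r0, r1 = 0.0, -m[0]
    det = a * d - b * c
    c0 = (r0 * d - b * r1) / det
    c1 = (a * r1 - r0 * c) / det
    return c0, c1

# Exact raw moments of a normal distribution with mean 2, sd 1: 1, 2, 5
c0, c1 = fit_log_density_linear([1.0, 2.0, 5.0])
# For N(2, 1), d/dx log f = -(x - 2) = 2 - x, so we expect c0 = 2, c1 = -1.
```

The paper's rational-function case adds denominator coefficients to the same kind of moment system; this toy case only shows why the coefficients fall out of linear algebra.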
Procedia PDF Downloads 276
2792 Crude Oil and Stocks Markets: Prices and Uncertainty Transmission Analysis
Authors: Kamel Malik Bensafta, Gervasio Semedo
Abstract:
The purpose of this paper is to investigate the relationship between oil prices and stock markets. The empirical analysis in this paper is conducted within the context of multivariate GARCH models, using a transformed version of the so-called BEKK parameterization. We show that the mean and uncertainty of the US market are transmitted to the oil market and the European market. We also identify an important transmission from WTI prices to Brent prices.
Keywords: oil volatility, stock markets, MGARCH, transmission, structural break
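The BEKK parameterization the abstract builds on can be sketched as the standard BEKK(1,1) conditional-covariance recursion, H_t = CC' + A e_{t-1} e_{t-1}' A' + B H_{t-1} B'. The parameter matrices and simulated residuals below are illustrative, not estimates from the paper.

```python
import numpy as np

def bekk_recursion(eps, C, A, B):
    """eps: (T, n) residuals; returns the list of (n, n) covariance matrices."""
    H = np.cov(eps.T)                 # start from the sample covariance
    intercept = C @ C.T               # keeps the recursion positive definite
    out = []
    for t in range(eps.shape[0]):
        e = eps[t - 1:t].T if t > 0 else eps[:1].T
        H = intercept + A @ (e @ e.T) @ A.T + B @ H @ B.T
        out.append(H.copy())
    return out

rng = np.random.default_rng(0)
eps = rng.standard_normal((200, 2))           # two hypothetical return series
C = np.array([[0.3, 0.0], [0.1, 0.3]])        # lower-triangular intercept
A = 0.3 * np.eye(2)                           # ARCH loading
B = 0.9 * np.eye(2)                           # GARCH persistence
Hs = bekk_recursion(eps, C, A, B)
```

The quadratic forms are what make BEKK attractive: every H_t is symmetric positive definite by construction, without parameter constraints.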
Procedia PDF Downloads 521
2791 Efficient Modeling Technique for Microstrip Discontinuities
Authors: Nassim Ourabia, Malika Ourabia
Abstract:
A new and efficient method is presented for the analysis of arbitrarily shaped discontinuities. The technique obtains closed-form expressions for the equivalent circuits which are used to model these discontinuities. It is then easy to handle and characterize complicated structures like T and Y junctions, truncated junctions, arbitrarily shaped junctions, cascaded junctions, and, more generally, planar multiport junctions. Another advantage of this method is that the edge line concept for arbitrarily shaped junctions operates with real-parameter circuits. The validity of the method was further confirmed by comparing our results for various discontinuities (bends, filters) with those from HFSS as well as from other published sources.
Keywords: CAD analysis, contour integral approach, microwave circuits, s-parameters
Procedia PDF Downloads 514
2790 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. 
The expected contributions of this research include advancements in mathematical and statistical sciences, improved data analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.
Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making
Procedia PDF Downloads 74
2789 Implementation of Statistical Parameters to Form an Entropic Mathematical Models
Authors: Gurcharan Singh Buttar
Abstract:
It has been discovered that although these two areas, statistics and information theory, are independent in their nature, they can be combined to create applications in multidisciplinary mathematics. In the field of statistics, statistical parameters (measures) play an essential role with reference to the population (distribution) under investigation, while information measures are crucial in the study of the ambiguity, assortment, and unpredictability present in an array of phenomena. The following communication is a link between the two, and it has been demonstrated that the well-known conventional statistical measures can be used as measures of information.
Keywords: probability distribution, entropy, concavity, symmetry, variance, central tendency
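A toy numerical illustration of the link drawn above: on a fixed support, a more concentrated distribution has both lower variance (a statistical measure) and lower Shannon entropy (an information measure), and the uniform law maximizes entropy. The distributions below are invented for illustration.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def variance(values, p):
    """Variance of a discrete distribution on `values` with weights `p`."""
    mean = sum(v * pi for v, pi in zip(values, p))
    return sum(pi * (v - mean) ** 2 for v, pi in zip(values, p))

values = [0, 1, 2, 3]
peaked = [0.85, 0.05, 0.05, 0.05]   # concentrated: low spread, low surprise
uniform = [0.25, 0.25, 0.25, 0.25]  # maximal uncertainty on four outcomes
```

Here entropy(uniform) = log2(4) = 2 bits, the maximum for a four-point support, and the peaked law scores lower on both measures.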
Procedia PDF Downloads 155
2788 Detailed Quantum Circuit Design and Evaluation of Grover's Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language
Authors: Wenjun Hou, Marek Perkowski
Abstract:
The Traveling Salesman problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of the least total weight in a given graph with N nodes. All variations on this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, no quantum circuit implementation of these algorithms has been created, to the best of our knowledge. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measures based on the number of oracle iterations, but to be able to evaluate the real circuit and time costs of the quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover’s algorithm to this problem, a quantum oracle was designed, evaluating the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating the Grover algorithm with an oracle that finds a successively lower cost each time makes it possible to transform the decision problem into an optimization problem, finding the minimum cost of Hamiltonian cycles. N log₂ K qubits are put into an equiprobabilistic superposition by applying the Hadamard gate on each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge weight adder, node index calculator, uniqueness checker, and comparator, which were all created using only quantum Toffoli gates, including their special forms, the Feynman and Pauli X gates.
The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle uses the calculated nodes from the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the oracle an optimal number of times, a correct answer can be generated with very high probability. The oracle of the Grover algorithm is modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified, using several datasets, to generate correct outputs.
Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover’s algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language
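The quantum oracle described above evaluates a classical predicate: is the candidate cycle Hamiltonian and cheaper than the current threshold? The sketch below implements that predicate classically, together with the usual estimate of the optimal number of Grover iterations, roughly (pi/4)*sqrt(N/M) for N states and M marked states. The 4-node graph is hypothetical.

```python
import math

def cycle_cost(cycle, w):
    """Total weight of the closed tour `cycle` under the weight dict `w`."""
    edges = zip(cycle, cycle[1:] + cycle[:1])
    return sum(w[frozenset(e)] for e in edges)

def oracle_predicate(cycle, w, n, threshold):
    """True iff `cycle` visits all n nodes exactly once and costs < threshold."""
    is_hamiltonian = len(cycle) == n and len(set(cycle)) == n
    return is_hamiltonian and cycle_cost(cycle, w) < threshold

def grover_iterations(n_states, n_marked):
    """Approximate optimal Grover iteration count, (pi/4)*sqrt(N/M)."""
    return max(1, math.floor((math.pi / 4) * math.sqrt(n_states / n_marked)))

# Hypothetical complete 4-node graph (undirected edge weights)
w = {frozenset(e): c for e, c in [((0, 1), 1), ((1, 2), 2), ((2, 3), 1),
                                  ((3, 0), 2), ((0, 2), 5), ((1, 3), 5)]}
```

Lowering `threshold` to each newly found cost, as the abstract describes, turns repeated Grover searches into a minimization loop.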
Procedia PDF Downloads 189
2787 Relation of Optimal Pilot Offsets in the Shifted Constellation-Based Method for the Detection of Pilot Contamination Attacks
Authors: Dimitriya A. Mihaylova, Zlatka V. Valkova-Jarvis, Georgi L. Iliev
Abstract:
One possible approach for maintaining the security of communication systems relies on Physical Layer Security mechanisms. However, in wireless time division duplex systems, where uplink and downlink channels are reciprocal, the channel estimation procedure is exposed to attacks known as pilot contamination, with the aim of having an enhanced data signal sent to the malicious user. The Shifted 2-N-PSK method involves two random legitimate pilots in the training phase, each of which belongs to a constellation, shifted from the original N-PSK symbols by certain degrees. In this paper, legitimate pilots’ offset values and their influence on the detection capabilities of the Shifted 2-N-PSK method are investigated. As the implementation of the technique depends on the relation between the shift angles rather than their specific values, the optimal interconnection between the two legitimate constellations is investigated. The results show that no regularity exists in the relation between the pilot contamination attack (PCA) detection probability and the choice of offset values. Therefore, an adversary who aims to obtain the exact offset values can only employ a brute-force attack, but the large number of possible combinations for the shifted constellations makes such a type of attack difficult to successfully mount. For this reason, the number of optimal shift value pairs is also studied for both 100% and 98% probabilities of detecting pilot contamination attacks. Although the Shifted 2-N-PSK method has been broadly studied in different signal-to-noise ratio scenarios, in multi-cell systems the interference from the signals in other cells should also be taken into account. Therefore, the inter-cell interference impact on the performance of the method is investigated by means of a large number of simulations.
The results show that the detection probability of the Shifted 2-N-PSK method decreases as the signal-to-interference-plus-noise ratio decreases.
Keywords: channel estimation, inter-cell interference, pilot contamination attacks, wireless communications
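The shifted constellations at the heart of the method can be sketched as ordinary N-PSK points rotated by per-constellation offsets. The offset values below are illustrative only, since, as the abstract notes, it is the relation between the shift angles, not their specific values, that matters.

```python
import cmath
import math

def shifted_npsk(n, offset_deg):
    """N-PSK constellation rotated by `offset_deg` degrees (unit circle)."""
    step = 2 * math.pi / n
    off = math.radians(offset_deg)
    return [cmath.exp(1j * (k * step + off)) for k in range(n)]

c1 = shifted_npsk(4, 15)   # first legitimate pilot constellation
c2 = shifted_npsk(4, 40)   # second one, shifted by a different angle
# Only the 25-degree relative rotation between c1 and c2 is significant.
```

An attacker probing for the offsets would have to search the space of such angle pairs, which is the brute-force difficulty the abstract points to.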
Procedia PDF Downloads 214
2786 How Autonomous Vehicles Transform Urban Policies and Cities
Authors: Adrián P. Gómez Mañas
Abstract:
Autonomous vehicles have already transformed urban policies and cities. This is the main assumption of our research, which aims to understand how representations of the possible arrival of autonomous vehicles already transform priorities or actions in transport and, more broadly, urban policies. This research is done within the framework of a Ph.D. directed by Professor Xavier Desjardins at Sorbonne University in Paris. Our hypotheses are: (i) the perspectives, representations, and imaginaries surrounding autonomous vehicles already affect the stakeholders of urban policies; (ii) the discourses on the opportunities or threats of autonomous vehicles reflect the current strategies of the stakeholders. Each stakeholder tries to adopt a discourse on autonomous vehicles that allows them to change their current tactics and strategies as little as possible. The objective is to eventually compare three different cases: Paris, the United Arab Emirates, and Bogota. We chose these territories because their contexts are very different, but they all have important interests in mobility and innovation, and they have all started to reflect on the subject of self-driving mobility. The main methodology used is to interview actors of the metropolitan area (local officials, leading urban and transport planners, influential experts, and private companies). This work is supplemented with conferences, official documents, press articles, and websites. The objective is to understand: 1) What they know about autonomous vehicles and where their knowledge comes from; 2) What they expect from autonomous vehicles; 3) How their ideas about autonomous vehicles are transforming their action and strategy in managing daily mobility, investing in transport, designing public spaces, and urban planning.
We are going to present the research and some preliminary results; we will show that autonomous vehicles are often viewed by public authorities as a lever to reach something else. We will also show that the discourses are strongly influenced by the local context (political, geographical, economic, etc.), creating an interesting balance between global and local influences. We will analyze the differences and similarities between the three cases and will try to understand what the causes are.
Keywords: autonomous vehicles, self-driving mobility, urban planning, urban mobility, transport, public policies
Procedia PDF Downloads 197
2785 Comparison of Receiver Operating Characteristic Curve Smoothing Methods
Authors: D. Sigirli
Abstract:
The Receiver Operating Characteristic (ROC) curve is a commonly used statistical tool for evaluating the diagnostic performance of screening and diagnostic tests with continuous or ordinal scale results which aim to predict the probability of the presence or absence of a condition, usually a disease. When the test results are measured as numeric values, sensitivity and specificity can be computed across all possible threshold values which discriminate the subjects as diseased or non-diseased. There are infinitely many possible decision thresholds along the continuum of the test results. The ROC curve presents the trade-off between sensitivity and 1-specificity as the threshold changes. The empirical ROC curve, which is a non-parametric estimator of the ROC curve, is robust and represents the data accurately. However, it has a problem of variability, especially for small sample sizes, and as it is a step function there can be different false positive rates for a given true positive rate value and vice versa. Moreover, because the estimated ROC curve is jagged while the true ROC curve is smooth, it underestimates the true ROC curve. Since the true ROC curve is assumed to be smooth, several smoothing methods have been explored, including using kernel estimates, using log-concave densities, fitting the parameters of a specified density function to the data by maximum-likelihood fitting of univariate distributions, and using smooth versions of the empirical distribution functions. In the present paper, we aimed to propose a smooth ROC curve estimation based on a boundary-corrected kernel function and to compare the performances of ROC curve smoothing methods for diagnostic test results coming from different distributions in different sample sizes.
We performed a simulation study to compare the performances of the different methods for different scenarios with 1000 repetitions. It is seen that the performance of the proposed method was typically better than that of the empirical ROC curve and only slightly worse than that of the binormal model when the underlying samples were in fact generated from the normal distribution.
Keywords: empirical estimator, kernel function, smoothing, receiver operating characteristic curve
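The contrast between the empirical (step-function) ROC curve and a smoothed one can be sketched as follows, using a plain Gaussian kernel to smooth the class-wise distribution functions. The paper's estimator uses a boundary-corrected kernel; the kernel, bandwidth, and test scores below are invented for illustration.

```python
import math

def smooth_cdf(x, sample, h=0.5):
    """Kernel estimate of P(X <= x) using the Gaussian kernel CDF."""
    phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sum(phi((x - xi) / h) for xi in sample) / len(sample)

def roc_points(healthy, diseased, thresholds):
    """Per threshold: ((empirical FPR, TPR), (smoothed FPR, TPR))."""
    pts = []
    for c in thresholds:
        emp = (sum(x > c for x in healthy) / len(healthy),
               sum(x > c for x in diseased) / len(diseased))
        smo = (1 - smooth_cdf(c, healthy), 1 - smooth_cdf(c, diseased))
        pts.append((emp, smo))
    return pts

healthy = [0.1, 0.4, 0.5, 0.9, 1.2]     # hypothetical test scores
diseased = [1.0, 1.4, 1.9, 2.2, 2.8]
pts = roc_points(healthy, diseased, [0.0, 1.0, 2.0, 3.0])
```

The empirical pairs jump in steps of 1/n, while the smoothed pairs vary continuously with the threshold, which is the jaggedness-versus-smoothness trade-off discussed above.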
Procedia PDF Downloads 151
2784 Determining Optimum Locations for Runoff Water Harvesting in W. Watir, South Sinai, Using RS, GIS, and WMS Techniques
Authors: H. H. Elewa, E. M. Ramadan, A. M. Nosair
Abstract:
Rainfall water harvesting is considered an important tool for overcoming water scarcity in arid and semi-arid regions. Wadi Watir in the southeastern part of the Sinai Peninsula is considered one of the main and active basins in the Gulf of Aqaba drainage system. It is characterized by steep hills consisting mainly of impermeable rocks, whereas the streambeds are covered by a highly permeable mixture of gravel and sand. A comprehensive approach involving the integration of geographic information systems, remote sensing, and watershed modeling was followed to identify the runoff water harvesting (RWH) capability in this area. Eight thematic layers, viz., the volume of the annual flood, overland flow distance, maximum flow distance, rock or soil infiltration, drainage frequency density, basin area, basin slope, and basin length, were used as a multi-parametric decision support system for conducting weighted spatial probability models (WSPMs) to determine the potential areas for RWH. The WSPM maps classified the area into five RWH potentiality classes, ranging from very low to very high. The three WSPM scenarios performed for W. Watir gave identical results for the high and very high RWH potentiality classes, which are the most suitable ones for conducting surface water harvesting techniques. There is also a reasonable match among the three scenarios with respect to the areas of moderate, low, and very low runoff harvesting potentiality. The WSPM results have shown that the high and very high classes, which are the most suitable for RWH, represent approximately 40.23% of the total area of the basin. Accordingly, several locations were selected for the establishment of water harvesting dams and cisterns to improve the water conditions and living environment in the study area.
Keywords: Sinai, Wadi Watir, remote sensing, geographic information systems, watershed modeling, runoff water harvesting
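A weighted spatial probability model of the kind described can be sketched as a weighted overlay: normalize each thematic layer to [0, 1], combine the layers with weights, and bin the result into five potentiality classes. The layer grids, layer names, and weights below are illustrative, not the study's.

```python
import numpy as np

def normalize(a):
    """Rescale a layer to [0, 1]; constant layers map to zero."""
    span = a.max() - a.min()
    return (a - a.min()) / span if span > 0 else np.zeros_like(a, dtype=float)

def wspm(layers, weights):
    """Weighted overlay binned into 5 classes: 0 = very low ... 4 = very high."""
    total = sum(weights.values())
    score = sum(weights[k] * normalize(a) for k, a in layers.items()) / total
    return np.minimum((score * 5).astype(int), 4)

# Two hypothetical 3x3 suitability layers (higher value = more suitable)
layers = {
    "annual_flood": np.array([[0., 5., 10.], [2., 6., 9.], [1., 4., 10.]]),
    "infiltration": np.array([[0., 3., 8.], [1., 5., 7.], [2., 6., 8.]]),
}
weights = {"annual_flood": 2.0, "infiltration": 1.0}
classes = wspm(layers, weights)
```

In a GIS workflow, the same arithmetic runs per raster cell over the eight thematic layers listed above, and the class-4 cells mark candidate dam and cistern sites.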
Procedia PDF Downloads 356
2783 Healthcare Utilization and Costs of Specific Obesity Related Health Conditions in Alberta, Canada
Authors: Sonia Butalia, Huong Luu, Alexis Guigue, Karen J. B. Martins, Khanh Vu, Scott W. Klarenbach
Abstract:
Obesity-related health conditions impose a substantial economic burden on payers due to increased healthcare use. Estimates of healthcare resource use and costs associated with obesity-related comorbidities are needed to inform policies and interventions targeting these conditions. Methods: Adults living with obesity were identified (a procedure-related body mass index code for class 2/3 obesity between 2012 and 2019 in Alberta, Canada; excluding those with bariatric surgery), and outcomes were compared over 1-year (2019/2020) between those who had and did not have specific obesity-related comorbidities. The probability of using a healthcare service (based on the odds ratio of a zero [OR-zero] cost) was compared; 95% confidence intervals (CI) were reported. Logistic regression and a generalized linear model with log link and gamma distribution were used for total healthcare cost comparisons ($CDN); cost ratios and estimated cost differences (95% CI) were reported. Potential socio-demographic and clinical confounders were adjusted for, and incremental cost differences were representative of a referent case. Results: A total of 220,190 adults living with obesity were included; 44% had hypertension, 25% had osteoarthritis, 24% had type-2 diabetes, 17% had cardiovascular disease, 12% had insulin resistance, 9% had chronic back pain, and 4% of females had polycystic ovarian syndrome (PCOS). 
The probability of hospitalization, ED visit, and ambulatory care was higher in those with a given obesity-related comorbidity versus those without: chronic back pain (hospitalization: 1.8-times [OR-zero: 0.57 [0.55/0.59]] / ED visit: 1.9-times [OR-zero: 0.54 [0.53/0.56]] / ambulatory care visit: 2.4-times [OR-zero: 0.41 [0.40/0.43]]), cardiovascular disease (2.7-times [OR-zero: 0.37 [0.36/0.38]] / 1.9-times [OR-zero: 0.52 [0.51/0.53]] / 2.8-times [OR-zero: 0.36 [0.35/0.36]]), osteoarthritis (2.0-times [OR-zero: 0.51 [0.50/0.53]] / 1.4-times [OR-zero: 0.74 [0.73/0.76]] / 2.5-times [OR-zero: 0.40 [0.40/0.41]]), type-2 diabetes (1.9-times [OR-zero: 0.54 [0.52/0.55]] / 1.4-times [OR-zero: 0.72 [0.70/0.73]] / 2.1-times [OR-zero: 0.47 [0.46/0.47]]), hypertension (1.8-times [OR-zero: 0.56 [0.54/0.57]] / 1.3-times [OR-zero: 0.79 [0.77/0.80]] / 2.2-times [OR-zero: 0.46 [0.45/0.47]]), PCOS (not significant / 1.2-times [OR-zero: 0.83 [0.79/0.88]] / not significant), and insulin resistance (1.1-times [OR-zero: 0.88 [0.84/0.91]] / 1.1-times [OR-zero: 0.92 [0.89/0.94]] / 1.8-times [OR-zero: 0.56 [0.54/0.57]]). After fully adjusting for potential confounders, the total healthcare cost ratio was higher in those with a given obesity-related comorbidity versus those without: chronic back pain (1.54-times [1.51/1.56]), cardiovascular disease (1.45-times [1.43/1.47]), osteoarthritis (1.36-times [1.35/1.38]), type-2 diabetes (1.30-times [1.28/1.31]), hypertension (1.27-times [1.26/1.28]), PCOS (1.08-times [1.05/1.11]), and insulin resistance (1.03-times [1.01/1.04]). Conclusions: Adults with obesity who have specific disease-related health conditions have a higher probability of healthcare use and incur greater costs than those without specific comorbidities; incremental costs are larger when other obesity-related health conditions are not adjusted for.
In a specific referent case, hypertension was costliest (44% had this condition with an additional annual cost of $715 [$678/$753]). If these findings hold for the Canadian population, hypertension in persons with obesity represents an estimated additional annual healthcare cost of $2.5 billion among adults living with obesity (based on an adult obesity rate of 26%). Results of this study can inform decision making on investment in interventions that are effective in treating obesity and its complications.
Keywords: administrative data, healthcare cost, obesity-related comorbidities, real world evidence
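The "OR-zero" comparison used above can be sketched as an ordinary 2x2 odds ratio of incurring a zero cost (no use of a service), with an approximate confidence interval from the log-odds-ratio standard error. An OR-zero below 1 means the comorbid group is less likely to have zero cost, i.e. more likely to use the service. The counts below are invented for illustration, not study data.

```python
import math

def or_zero(zero_with, nonzero_with, zero_without, nonzero_without):
    """Odds ratio of zero cost: (with-comorbidity odds) / (without odds)."""
    return (zero_with / nonzero_with) / (zero_without / nonzero_without)

def ci_95(or_value, *cell_counts):
    """Approximate 95% CI via the standard error of the log odds ratio."""
    se = math.sqrt(sum(1.0 / c for c in cell_counts))
    return (math.exp(math.log(or_value) - 1.96 * se),
            math.exp(math.log(or_value) + 1.96 * se))

orz = or_zero(400, 600, 700, 300)   # hypothetical 2x2 cell counts
```

The study's fully adjusted cost ratios come from regression models (logistic and gamma GLM) rather than raw counts, but the raw OR-zero shows what the adjusted quantity estimates.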
Procedia PDF Downloads 147
2782 A Quadcopter Stability Analysis: A Case Study Using Simulation
Authors: C. S. Bianca Sabrina, N. Egidio Raimundo, L. Alexandre Baratella, C. H. João Paulo
Abstract:
This paper aims to present a study of the theoretical concepts and applications of the quadcopter, using the MATLAB simulator. In order to use this tool, a study of the stability of the drone under a Proportional-Integral-Derivative (PID) controller will be presented. After the stability study, some tests are run on the simulator and their results are presented. From the mathematical model, it is possible to find the Newton-Euler angles, so that it is possible to stabilize the quadcopter at a certain position in the air, starting from the ground. In order to understand the impact of the controller gain values on the stabilization of the Newton-Euler angles, three conditions are tested with different controller gain values.
Keywords: controllers, drones, quadcopter, stability
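One ingredient of such a stability study is the kinematic map from body angular rates to Euler-angle rates, which closes the loop between the gyroscope measurements and the attitude angles being stabilized. A sketch of the standard transformation is below; the ZYX (roll-pitch-yaw) convention is an assumption, since the abstract does not state which convention the model uses.

```python
import math

def euler_rates(phi, theta, p, q, r):
    """Euler-angle rates (roll, pitch, yaw) from body rates (p, q, r),
    ZYX convention; singular at theta = +/- 90 degrees (gimbal lock)."""
    t = math.tan(theta)
    phi_dot = p + math.sin(phi) * t * q + math.cos(phi) * t * r
    theta_dot = math.cos(phi) * q - math.sin(phi) * r
    psi_dot = (math.sin(phi) * q + math.cos(phi) * r) / math.cos(theta)
    return phi_dot, theta_dot, psi_dot
```

Near hover (phi = theta = 0) the map reduces to the identity, which is why small-angle PID designs like the one described can treat body rates and Euler-angle rates interchangeably.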
Procedia PDF Downloads 197
2781 Innovate, Educate, and Transform, Tailoring Sustainable Waste Handling Solutions for Nepal’s Small Populated Municipalities: Insights From Chandragiri Municipality
Authors: Anil Kumar Baral
Abstract:
The research introduces a groundbreaking approach to waste management, emphasizing innovation, education, and transformation. Using Chandragiri Municipality as a case study, the study advocates a shift from traditional to progressive waste management strategies, contributing an inventive waste framework, sustainability advocacy, and a transformative blueprint. The waste composition analysis highlights Chandragiri's representative profile, leading to a comprehensive plan addressing challenges and recommending a transition to a profitable waste treatment model, supported by relevant statistics. The data-driven approach uses official waste-composition data from Chandragiri Municipality as secondary data and incorporates primary data from Chandragiri households, ensuring a nuanced perspective. Discussions on implementation, viability, and environmental preservation underscore the dual benefit of sustainability. The study includes a comparative analysis, a monitoring and evaluation framework, an examination of international relevance and collaboration, and a social and environmental impact assessment. The results indicate the necessity for creative changes in Chandragiri's waste practices, recommending separate treatment centers at the ward level rather than the municipal level, composting machines, and a centralized waste treatment plant. Educational reforms involve revising school curricula and awareness campaigns. The transformation's success hinges on reducing waste size, efficient treatment center operation, and ongoing public literacy. The conclusion summarizes the key findings, envisioning a future with sustainable waste management practices deeply embedded in the community fabric.
Keywords: innovate, educate, transform, municipality, method
Procedia PDF Downloads 45
2780 Internal Migration and Poverty Dynamic Analysis Using a Bayesian Approach: The Tunisian Case
Authors: Amal Jmaii, Damien Rousseliere, Besma Belhadj
Abstract:
We explore the relationship between internal migration and poverty in Tunisia. We present a methodology combining a potential-outcomes approach with multiple imputation to highlight the effect of internal migration on poverty states. We find that the probability of being poor decreases when individuals leave the poorest regions (the western areas) for the richer ones (Greater Tunis and the eastern regions).
Keywords: internal migration, potential outcomes approach, poverty dynamics, Tunisia
Procedia PDF Downloads 308
2779 Environment and Health Quality in Urban Slums of Chandigarh: A Case Study
Authors: Ritu Sarsoha
Abstract:
According to the World Summit 2002, health is an integral component of sustainable development. Due to overpopulation and a lack of employment opportunities in villages and small towns, rural youth tend to migrate to the big cities, causing the mushrooming of slums. These slums lack most of the basic necessities of life, suffer from environmental pollution, and have no appropriate health care system. The present paper deals with the socio-economic and environmental status of people living in a slum area of Chandigarh, which has now grown into a big city as it has become a hub for migrants from U. P. and Bihar. Here is a case study of Colony No. 5 of Chandigarh, which is divided into more than one block.
Keywords: slum, socio-economic, environment pollution, health
Procedia PDF Downloads 304
2778 Neural Network and Support Vector Machine for Prediction of Foot Disorders Based on Foot Analysis
Authors: Monireh Ahmadi Bani, Adel Khorramrouz, Lalenoor Morvarid, Bagheri Mahtab
Abstract:
Background:- Foot disorders are common musculoskeletal problems. Plantar pressure distribution measurement is one of the most important parts of foot disorder diagnosis for quantitative analysis. However, the association between plantar pressure and foot disorders is not clear. With the growth of datasets and machine learning methods, the relationship between foot disorders and plantar pressures can be detected. Significance of the study:- The purpose of this study was to predict the probability of common foot disorders based on the peak plantar pressure distribution and the center of pressure during walking. Methodologies:- 2323 participants were assessed in a foot therapy clinic between 2015 and 2021. Foot disorders were diagnosed by an experienced physician, and the participants were then asked to walk on a force plate scanner. After the data preprocessing, because of differences in walking time and foot size, we normalized the samples based on time and foot size. Some of the force plate variables were selected as input to a deep neural network (DNN), and the probability of each foot disorder was estimated. In the next step, we used a support vector machine (SVM) and ran the dataset for each foot disorder (classification of yes or no). We compared the DNN and SVM for foot disorder prediction based on plantar pressure distributions and the center of pressure. Findings:- The results demonstrated that the accuracy of the deep learning architecture is sufficient for most clinical and research applications in the study population. In addition, the SVM approach is more accurate for predictions, enabling applications for foot disorder diagnosis. The detection accuracy was 71% with the deep learning algorithm and 78% with the SVM algorithm. Moreover, when we worked with the peak plantar pressure distribution, it was more accurate than with the center-of-pressure dataset.
Conclusion:- Both algorithms, deep learning and SVM, will help therapists and patients to improve the data pool and enhance foot disorder prediction with less expense and error once some restrictions are properly removed.
Keywords: deep neural network, foot disorder, plantar pressure, support vector machine
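The time and foot-size normalization the abstract mentions can be sketched as below: each walking trial is linearly resampled to a fixed number of time points and the pressures are scaled by foot length. The interpolation scheme, the 101-point grid, and the foot-length scaling are assumptions for illustration, not details stated in the paper.

```python
def resample(series, n_points):
    """Linearly interpolate `series` onto `n_points` evenly spaced samples."""
    m = len(series)
    out = []
    for i in range(n_points):
        t = i * (m - 1) / (n_points - 1)   # fractional index into the series
        lo = int(t)
        hi = min(lo + 1, m - 1)
        frac = t - lo
        out.append(series[lo] * (1 - frac) + series[hi] * frac)
    return out

def normalize_trial(series, foot_length_cm, n_points=101):
    """Time-normalize to n_points and scale pressures by foot length."""
    return [v / foot_length_cm for v in resample(series, n_points)]
```

Normalizing every trial to a common grid is what lets trials of different durations and foot sizes share one feature vector for the DNN and SVM classifiers.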
Procedia PDF Downloads 350
2777 Modern Era Applications of Mathematics and Computer Science
Authors: Ogunrinde Roseline Bosede, Ogunrinde Rowland Rotimi
Abstract:
Just as the development of the ideas of early mathematics was essentially motivated by social needs, the invention of the computer was equally inspired by social needs. The early years of the twenty-first century have been remarkable in advances in the mathematical and computer sciences. Mathematical and computer sciences are fast becoming integral and essential components of a growing catalogue of areas of interest in biology, business, the military, medicine, the social sciences, advanced design, advanced materials, climate, banking and finance, and many other fields and disciplines. This paper seeks to highlight the trends and impacts of the duo in the technological advancements being witnessed in today's world.
Keywords: computer, impacts, mathematics, modern society
Procedia PDF Downloads 3992776 A Novel Approach of Secret Communication Using Douglas-Peucker Algorithm
Authors: R. Kiruthika, A. Kannan
Abstract:
Steganography is the problem of hiding secret messages in 'innocent-looking' public communication so that the presence of the secret message cannot be detected. This paper introduces steganographic security in terms of computational indistinguishability from a channel of probability distributions on cover messages. The method first splits the cover image into two separate blocks using the Douglas-Peucker algorithm. The text message and the image are then hidden in the Least Significant Bit (LSB) of the cover image.Keywords: steganography, lsb, embedding, Douglas-Peucker algorithm
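A minimal sketch of the LSB embedding step (ignoring the Douglas-Peucker block splitting, and treating the cover image as a flat byte sequence) might look like this:

```python
def embed_lsb(cover: bytes, bits):
    """Hide a sequence of 0/1 bits in the least significant bit of each
    cover byte; one bit per byte, so capacity equals len(cover) bits."""
    if len(bits) > len(cover):
        raise ValueError("message too long for this cover")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | (bit & 1)   # overwrite the LSB only
    return bytes(stego)

def extract_lsb(stego: bytes, n_bits: int):
    """Recover the first n_bits LSBs from the stego bytes."""
    return [b & 1 for b in stego[:n_bits]]
```

Because only the lowest bit of each byte changes, the stego data differs from the cover by at most one intensity level per pixel, which is what makes the embedding visually imperceptible.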
Procedia PDF Downloads 3632775 Water Access and Food Security: A Cross-Sectional Study of SSA Countries in 2017
Authors: Davod Ahmadi, Narges Ebadi, Ethan Wang, Hugo Melgar-Quiñonez
Abstract:
Compared to other Least Developed Countries (LDCs), major countries in sub-Saharan Africa (SSA) have limited access to clean water. People in this region, and more specifically females, suffer from acute water scarcity and are compelled to spend much of their time fetching water for domestic uses such as drinking and washing. Beyond domestic use, water contributes to the food security status of people in vulnerable regions like SSA through its effects on agriculture and livestock: livestock need water to grow, and agriculture requires enormous quantities of water for irrigation. The main objective of this study is to explore the association between access to water and individuals' food security status. Data from the 2017 Gallup World Poll (GWP) for SSA were analyzed (n=35,000). The target population in the GWP is the entire civilian, non-institutionalized population aged 15 and older. All sample selection is probability-based and nationally representative, with Gallup surveying an average of 1,000 individuals per country. Three questions related to water (i.e., water quality, availability of water for crops and availability of water for livestock) were used as the exposure variables. The Food Insecurity Experience Scale (FIES) was used as the outcome variable. The FIES measures individuals' food security status and is composed of eight questions with simple dichotomous responses (1=Yes and 0=No). Different statistical analyses, including descriptive statistics, cross-tabulations and binary logistic regression, form the basis of this study. Results from the descriptive analyses showed that more than 50% of the respondents had no access to enough water for crops and livestock, and more than 85% of respondents were categorized as "food insecure". Findings from the cross-tabulation analyses showed that food security status was significantly associated with water quality (0.135; P=0.000), water for crops (0.106; P=0.000) and water for livestock (0.112; P=0.000).
In the regression analyses, the probability of being food insecure increased among people who reported no satisfaction with water quality (OR=1.884 (1.768-2.008)), not enough water for crops (OR=1.721 (1.616-1.834)) and not enough water for livestock (OR=1.706 (1.819)). In conclusion, it should be noted that water access affects food security status in SSA.Keywords: water access, agriculture, livestock, FIES
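The FIES scoring and the reading of the regression output can be sketched as follows; the raw-score computation follows the eight dichotomous items described above, while the odds-ratio conversion is the standard exp(coefficient) identity for logistic regression (no thresholds or model details from the study are reproduced).

```python
import math

def fies_raw_score(answers):
    """Raw FIES score: the sum of eight Yes/No items coded 1/0."""
    if len(answers) != 8:
        raise ValueError("FIES has exactly eight items")
    return sum(answers)

def odds_ratio(beta):
    """A logistic-regression coefficient beta corresponds to the odds
    ratio exp(beta) for a one-unit change in the exposure variable."""
    return math.exp(beta)
```

For instance, an estimated odds ratio of 1.884 for dissatisfaction with water quality corresponds to a logistic coefficient of ln(1.884) ≈ 0.633 in the fitted model.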
Procedia PDF Downloads 1492774 Future Trends of Mechatronics Engineering in Pakistan
Authors: Aqeela Mir, Akhtar Nawaz Malik, Javaid Iqbal
Abstract:
The paper presents a survey-based approach to observing the level of awareness of Mechatronics in Pakistani society and the factors affecting the future development trend of Mechatronics in Pakistan. With the help of these surveys, a new direction for building a mathematical model of the future development trend of Mechatronics in Pakistan is also suggested.Keywords: mechatronics society survey, future development trend of mechatronics in pakistan, probability estimation, mathematical model
Procedia PDF Downloads 5192773 Matlab Method for Exclusive-or Nodes in Fuzzy GERT Networks
Authors: Roland Lachmayer, Mahtab Afsari
Abstract:
Research is the cornerstone of the advancement of human communities, so it is one of the indexes for evaluating the advancement of countries. Research projects are usually costly and time-consuming and do not yield results in the short term. Project scheduling is one of the integral parts of project management. The present article offers a new method, using C# and Matlab software, to solve fuzzy GERT networks with Exclusive-OR nodes in order to schedule the network. In this article, we concentrate on the flowcharts we used in Matlab to show how we apply Matlab to schedule Exclusive-OR nodes.Keywords: research projects, fuzzy GERT, fuzzy CPM, CPM, α-cuts, scheduling
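The role of α-cuts at an Exclusive-OR node can be illustrated with a small sketch; the triangular fuzzy durations, crisp branch probabilities, and interval arithmetic below are illustrative assumptions, not the article's actual C#/Matlab implementation.

```python
def alpha_cut(tri, alpha):
    """Alpha-cut interval [lo, hi] of a triangular fuzzy number (a, m, b)."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def xor_node_duration(branches, alpha):
    """Interval of the expected duration through an Exclusive-OR node:
    exactly one branch is realized, so the mean duration is the sum of
    p_i * t_i over branches (p_i crisp probabilities summing to 1,
    t_i triangular fuzzy durations) at the given alpha level."""
    lo = sum(p * alpha_cut(t, alpha)[0] for p, t in branches)
    hi = sum(p * alpha_cut(t, alpha)[1] for p, t in branches)
    return (lo, hi)
```

At alpha = 1 the interval collapses to the crisp expected duration through the node; lower alpha levels widen the interval to reflect the uncertainty in the activity times.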
Procedia PDF Downloads 3972772 Conditions on Expressing a Matrix as a Sum of α-Involutions
Authors: Ric Joseph R. Murillo, Edna N. Gueco, Dennis I. Merino
Abstract:
Let F be C or R, where C and R are the sets of complex and real numbers, respectively, and let n be a natural number. An n-by-n matrix A over the field F is called an α-involutory matrix, or an α-involution, if there exists an α in the field such that the square of the matrix equals αI, where I is the n-by-n identity matrix. If α is a complex number or a nonnegative real number, then an n-by-n matrix A over F can be written as a sum of n-by-n α-involutory matrices over F if and only if the trace of A is an integral multiple of the square root of α. Meanwhile, if α is a negative real number, then a 2n-by-2n matrix A over R can be written as a sum of 2n-by-2n α-involutory matrices over R if and only if the trace of the matrix is zero. Some other properties of α-involutory matrices are also determined.Keywords: α-involutory matrices, sum of α-involutory matrices, trace, matrix theory
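The defining property A² = αI is easy to verify numerically; the 2-by-2 checker below is a small illustrative sketch (not from the paper).

```python
def is_alpha_involution(A, alpha, tol=1e-12):
    """Return True if the 2x2 matrix A (nested lists) satisfies A*A = alpha*I."""
    (a, b), (c, d) = A
    square = [[a * a + b * c, a * b + b * d],
              [c * a + d * c, c * b + d * d]]
    scaled_identity = [[alpha, 0.0], [0.0, alpha]]
    return all(abs(square[i][j] - scaled_identity[i][j]) <= tol
               for i in range(2) for j in range(2))
```

For example, [[0, 2], [2, 0]] squares to 4I, so it is a 4-involution; its trace, 0, is indeed an integral multiple of sqrt(4) = 2, consistent with the trace condition stated above.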
Procedia PDF Downloads 1972771 Digital Development of Cultural Heritage: Construction of Traditional Chinese Pattern Database
Authors: Shaojian Li
Abstract:
The traditional Chinese patterns, as an integral part of Chinese culture, possess unique values in history, culture, and art. However, with the passage of time and societal changes, many of these traditional patterns are at risk of being lost, damaged, or forgotten. To undertake the digital preservation and protection of these traditional patterns, this paper will collect and organize images of traditional Chinese patterns. It will provide exhaustive and comprehensive semantic annotations, creating a resource library of traditional Chinese pattern images. This will support the digital preservation and application of traditional Chinese patterns.Keywords: digitization of cultural heritage, traditional Chinese patterns, digital humanities, database construction
Procedia PDF Downloads 582770 Optimum Dimensions of Hydraulic Structures Foundation and Protections Using Coupled Genetic Algorithm with Artificial Neural Network Model
Authors: Dheyaa W. Abbood, Rafa H. AL-Suhaili, May S. Saleh
Abstract:
A model using artificial neural networks and the genetic algorithm technique is developed for obtaining the optimum dimensions of the foundation length and protections of small hydraulic structures. The procedure involves optimizing an objective function comprising a weighted summation of the state variables. The decision variables considered in the optimization are the upstream and downstream cutoff lengths and their angles of inclination, the foundation length, and the length of the downstream soil protection. These were obtained for a given maximum difference in head, depth of the impervious layer and degree of anisotropy. The optimization was carried out subject to constraints that ensure a safe structure against the uplift pressure force and a sufficient protection length at the downstream side of the structure to overcome an excessive exit gradient. The Geo-studio software was used to analyze 1200 different cases. For each case, the length of protection and the volume of structure required to satisfy the safety factors mentioned previously were estimated. An ANN model was developed and verified using these cases' input-output sets as its database. A MatLAB code was written to perform genetic algorithm optimization coupled with this ANN model using a formulated optimization model. A sensitivity analysis was done for selecting the crossover probability, the mutation probability and level, the population size, the position of the crossover, and the weight distribution for all the terms of the objective function. Results indicate that the factor that most affects the optimum solution is the required population size. The minimum value of this parameter that gives a stable global optimum solution is 30,000, while the other variables have little effect on the optimum solution.Keywords: inclined cutoff, optimization, genetic algorithm, artificial neural networks, Geo-studio, uplift pressure, exit gradient, factor of safety
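The GA-ANN coupling can be sketched at toy scale as below; here a plain analytic function stands in for the trained ANN surrogate of the Geo-studio analyses, and the operators (truncation selection, averaging crossover, Gaussian mutation) and all parameter values are illustrative assumptions.

```python
import random

def ga_minimize(f, lo, hi, pop_size=30, gens=60, pc=0.8, pm=0.1, seed=1):
    """Minimize f on [lo, hi] with a toy real-coded genetic algorithm.
    In the paper's workflow, f would be the ANN trained on the 1200
    Geo-studio cases rather than an explicit formula."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        parents = sorted(pop, key=f)[:pop_size // 2]      # truncation selection
        pop = []
        while len(pop) < pop_size:
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2 if rng.random() < pc else a  # averaging crossover
            if rng.random() < pm:                            # Gaussian mutation
                child += rng.gauss(0.0, 0.05 * (hi - lo))
            pop.append(min(max(child, lo), hi))              # clip to bounds
    return min(pop, key=f)
```

Calling the surrogate instead of the full seepage analysis is what makes the thousands of objective-function evaluations required by the GA affordable.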
Procedia PDF Downloads 3232769 Model Reference Adaptive Approach for Power System Stabilizer for Damping of Power Oscillations
Authors: Jožef Ritonja, Bojan Grčar, Boštjan Polajžer
Abstract:
In recent years, electricity trade between neighboring countries has become increasingly intense. Increasing power transmission over long distances has resulted in an increase in the oscillations of the transmitted power. The damping of the oscillations can be improved by reconfiguring the network or replacing generators, but such solutions are not economically reasonable. The only cost-effective way to improve the damping of power oscillations is to use power system stabilizers. A power system stabilizer is a part of the synchronous generator control system. It utilizes the semiconductor excitation system connected to the rotor field winding to increase the damping of the power system. The majority of synchronous generators are equipped with conventional power system stabilizers with fixed parameters. The control structure of conventional power system stabilizers and the tuning procedure are based on linear control theory. Conventional power system stabilizers are simple to realize, but they show insufficient damping improvement over the entire range of operating conditions. For this reason, advanced control theories are used for the development of better power system stabilizers. In this paper, adaptive control theory for power system stabilizer design and synthesis is studied. The presented work is focused on the model reference adaptive control approach. The control signal, which ensures that the controlled plant output follows the reference model output, is generated by the adaptive algorithm. The adaptive gains are obtained as a combination of a "proportional" term and an "integral" term extended with the σ-term. The σ-term is introduced to avoid divergence of the integral gains. The necessary condition for asymptotic tracking is derived by means of hyperstability theory.
The benefits of the proposed model reference adaptive power system stabilizer were evaluated as objectively as possible by means of theoretical analysis, numerical simulations and laboratory realizations. The damping of the synchronous generator oscillations over the entire operating range was investigated. The obtained results show improved damping over the entire operating area and an increase in power system stability. The results of the presented work will aid the development of a model reference power system stabilizer that should be able to replace the conventional stabilizers in power systems.Keywords: power system, stability, oscillations, power system stabilizer, model reference adaptive control
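The σ-modified proportional-plus-integral adaptation described above can be sketched on a scalar example; the plant, reference model, and all gain values below are illustrative assumptions for a first-order system, not the generator model used in the paper.

```python
def mrac_step_response(T=20.0, dt=1e-3, sigma=0.1):
    """Scalar model reference adaptive control with sigma-modification.
    Plant: x' = a*x + b*u (unstable). Reference model: xm' = am*xm + bm*r.
    Each adaptive gain = sigma-modified 'integral' part + 'proportional' part."""
    a, b = 1.0, 1.0          # illustrative unstable plant
    am, bm = -2.0, 2.0       # stable reference model
    gp, gi = 2.0, 20.0       # proportional / integral adaptation gains
    x = xm = kx_i = kr_i = 0.0
    r = 1.0                  # unit-step reference
    for _ in range(int(T / dt)):
        e = x - xm                                  # tracking error
        # sigma term leaks the integral gains, preventing their divergence
        kx_i += dt * (-sigma * kx_i - gi * e * x)
        kr_i += dt * (-sigma * kr_i - gi * e * r)
        u = (kx_i - gp * e * x) * x + (kr_i - gp * e * r) * r
        x += dt * (a * x + b * u)                   # forward-Euler plant step
        xm += dt * (am * xm + bm * r)               # reference model step
    return x, xm
```

In this sketch the adaptive law drives the plant output x toward the reference model output xm even though the open-loop plant is unstable; the σ-leakage keeps the integral gains bounded at the price of a small residual tracking error.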
Procedia PDF Downloads 1362768 Generation of Numerical Data for the Facilitation of the Personalized Hyperthermic Treatment of Cancer with an Interstitial Antenna Array Using the Method of Symmetrical Components
Authors: Prodromos E. Atlamazoglou
Abstract:
The method of moments combined with the method of symmetrical components is used for the analysis of interstitial hyperthermia applicators. The basis and testing functions are both piecewise sinusoids, qualifying our technique as a Galerkin one. The dielectric coatings are modeled by equivalent volume polarization currents, which are simply related to the conduction current distribution, thereby avoiding the introduction of additional unknowns or numerical integrations. The results of our method for a four-dipole circular array are in agreement with those already published in the literature for the same hyperthermia configuration. Apart from being accurate, our approach is more general and more computationally efficient, and it takes into account the coupling between the antennas.Keywords: hyperthermia, integral equations, insulated antennas, method of symmetrical components
Procedia PDF Downloads 2562767 A Fast Multi-Scale Finite Element Method for Geophysical Resistivity Measurements
Authors: Mostafa Shahriari, Sergio Rojas, David Pardo, Angel Rodriguez- Rozas, Shaaban A. Bakr, Victor M. Calo, Ignacio Muga
Abstract:
Logging-While-Drilling (LWD) is a technique for recording down-hole logging measurements while drilling the well. Nowadays, LWD devices (e.g., nuclear, sonic, resistivity) are mostly used commercially for geo-steering applications. Modern borehole resistivity tools are able to measure all components of the magnetic field by incorporating tilted coils. The depth of investigation of LWD tools is limited compared to the thickness of the geological layers, so it is common practice to approximate the Earth's subsurface with a sequence of 1D models. For a 1D model, we can reduce the dimensionality of the problem using a Hankel transform. We can solve the resulting system of ordinary differential equations (ODEs) either (a) analytically, which results in a so-called semi-analytic method after performing a numerical inverse Hankel transform, or (b) numerically. Semi-analytic methods are used by the industry due to their high performance. However, they have major limitations. First, the analytical solution of the aforementioned system of ODEs exists only for piecewise constant resistivity distributions; for arbitrary resistivity distributions, no solution of the system of ODEs is known to date. Second, in geo-steering, we need to solve inverse problems with respect to the inversion variables (e.g., the constant resistivity value of each layer and the bed boundary positions) using a gradient-based inversion method, and thus need to compute the corresponding derivatives; however, to the best of our knowledge, the analytical derivatives for cross-bedded formations and the analytical derivatives with respect to the bed boundary positions have not been published. The main contribution of this work is to overcome these limitations of semi-analytic methods by solving each 1D model (associated with each Hankel mode) using an efficient multi-scale finite element method.
The main idea is to divide the computations into two parts: (a) offline computations, which are independent of the tool positions, precomputed only once and reused for all logging positions, and (b) online computations, which depend upon the logging position. With this method, (a) we can consider arbitrary resistivity distributions along the 1D model, and (b) we can easily and rapidly compute the derivatives with respect to any inversion variable at negligible additional cost by using an adjoint-state formulation. Although the proposed method is slower than semi-analytic methods, its computational efficiency is still high. In the presentation, we shall derive the mathematical variational formulation, describe the proposed multi-scale finite element method, and verify the accuracy and efficiency of our method through a wide range of numerical experiments, comparing the numerical solutions to semi-analytic ones where the latter are available.Keywords: logging-while-drilling, resistivity measurements, multi-scale finite elements, Hankel transform
Procedia PDF Downloads 3852766 Implementation of the Recursive Formula for Evaluation of the Strength of Daniels' Bundle
Authors: Vaclav Sadilek, Miroslav Vorechovsky
Abstract:
The paper deals with the classical fiber bundle model of equal load sharing, sometimes referred to as the Daniels' bundle or the democratic bundle. Daniels formulated a multidimensional integral and also a recursive formula for the evaluation of the strength cumulative distribution function. This paper describes three algorithms for the evaluation of the recursive formula, together with their implementations, with source code in the high-level programming language Python. A comparison of the algorithms with respect to execution time is provided. An analysis of the orders of magnitude of the addends in the recursion is also provided.Keywords: equal load sharing, mpmath, python, strength of Daniels' bundle
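Daniels' recursion for the bundle strength CDF can be implemented directly; the sketch below uses a uniform fiber-strength CDF F on [0, 1] as an illustrative assumption and does not reproduce the paper's three algorithms or its mpmath-based arbitrary-precision implementation.

```python
from math import comb

def F(x):
    """Fiber strength CDF: uniform on [0, 1] (illustrative choice)."""
    return min(max(x, 0.0), 1.0)

def G(n, x):
    """CDF of the per-fiber strength of an equal-load-sharing bundle of n
    fibers, via Daniels' recursion:
    G_n(x) = sum_{k=1..n} (-1)^(k+1) C(n,k) F(x)^k G_{n-k}(n*x/(n-k)),
    with G_0 = 1 (the k = n term needs no rescaled argument)."""
    if n == 0:
        return 1.0
    total = 0.0
    for k in range(1, n + 1):
        sign = 1.0 if k % 2 == 1 else -1.0
        tail = 1.0 if k == n else G(n - k, n * x / (n - k))
        total += sign * comb(n, k) * F(x) ** k * tail
    return total
```

The alternating binomial terms grow quickly with n, which is why an analysis of the orders of magnitude of the addends matters and why arbitrary-precision arithmetic such as mpmath becomes necessary for larger bundles.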
Procedia PDF Downloads 402