Search results for: binary matrices
363 Features Vector Selection for the Recognition of the Fragmented Handwritten Numeric Chains
Authors: Salim Ouchtati, Aissa Belmeguenai, Mouldi Bedda
Abstract:
In this study, we propose an offline system for the recognition of fragmented handwritten numeric chains. First, we built a recognition system for isolated handwritten digits; this part of the study is based mainly on evaluating the performance of a neural network trained with the gradient backpropagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the isolated handwritten digits by several methods: the distribution sequence, the application of probes (sondes), the Barr features, and the centered moments of the different projections and profiles. Second, the study is extended to the reading of fragmented handwritten numeric chains made up of a variable number of digits. The vertical projection is used to segment the numeric chain into isolated digits, and every digit (or segment) is presented separately to the input of the system built in the first part (the recognition system for isolated handwritten digits).
Keywords: features extraction, handwritten numeric chains, image processing, neural networks
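A minimal sketch of the vertical-projection segmentation step described in this abstract: a binary image of a numeric chain is split at columns whose vertical projection is zero. The function name, the min_width filter, and the toy image are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def segment_digits(binary_img, min_width=3):
    """Split a binary image (1 = ink, 0 = background) of a numeric chain
    into isolated digits using its vertical projection.

    Columns with zero projection are treated as gaps between digits.
    `min_width` filters out spurious slivers; it is an illustrative choice.
    """
    projection = binary_img.sum(axis=0)          # ink pixels per column
    ink_columns = projection > 0
    segments, start = [], None
    for x, has_ink in enumerate(ink_columns):
        if has_ink and start is None:
            start = x                            # a digit begins
        elif not has_ink and start is not None:
            if x - start >= min_width:
                segments.append(binary_img[:, start:x])
            start = None
    if start is not None:                        # digit touching the right edge
        segments.append(binary_img[:, start:])
    return segments

# Toy chain: two small "digits" separated by a blank column
chain = np.array([[1, 1, 0, 1, 1],
                  [1, 1, 0, 1, 1]])
print([s.shape for s in segment_digits(chain, min_width=1)])  # [(2, 2), (2, 2)]
```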
Procedia PDF Downloads 265
362 Factors Affecting Bus Use as a Sustainable Mode of Transportation: Insights from Kerman, Iran
Authors: Fatemeh Rahmani, Navid Nadimi, Vahid Khalifeh
Abstract:
In the near future, cities with medium-sized populations will face traffic congestion, air pollution, high fuel consumption, and noise pollution. It is possible to improve the sustainability of such cities by making better use of public transportation. This paper presents a study of the factors that influence citizens' bus use in medium-sized cities. For this purpose, Kerman's citizens were surveyed online. The model was based on binary logistic regression. A descriptive analysis revealed that simple measures like renewing the fleet, upgrading the stations, establishing a schedule program, and cleaning the buses could improve passenger satisfaction. In addition, the modeling results showed that future traffic congestion can be prevented by implementing road and parking lot pricing plans. Further, as the number and length of trips increase, the probability of citizens taking the bus increases. In conclusion, Kerman's bus system is both secure and fast, but these two characteristics can be improved further to increase bus ridership.
Keywords: sustainability, transportation, bus, congestion, satisfaction
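For readers unfamiliar with the modeling step, the sketch below fits a binary logistic regression of a bus-use indicator on a few predictors. The predictor names and the synthetic data are invented for illustration; they are not the variables collected in the Kerman survey.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey records (invented predictors and values)
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "trip_length_km": rng.uniform(1, 20, n),
    "trips_per_week": rng.integers(1, 15, n),
    "owns_car": rng.integers(0, 2, n),
})
# Synthetic outcome: 1 = respondent uses the bus
logit_true = 0.1 * df.trip_length_km + 0.2 * df.trips_per_week - 1.5 * df.owns_car - 1.0
df["uses_bus"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_true))).astype(int)

X = sm.add_constant(df[["trip_length_km", "trips_per_week", "owns_car"]])
model = sm.Logit(df["uses_bus"], X).fit(disp=0)
print(model.summary())   # coefficients, p-values, odds-ratio material
```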
Procedia PDF Downloads 8
361 SNR Classification Using Multiple CNNs
Authors: Thinh Ngo, Paul Rad, Brian Kelley
Abstract:
Noise estimation is essential in today's wireless systems for power control, adaptive modulation, interference suppression, and quality of service. Deep learning (DL) has already been applied in the physical layer for modulation and signal classification. An unacceptably low accuracy of less than 50% is found to undermine the traditional application of DL classification to SNR prediction. In this paper, we use a divide-and-conquer algorithm and a classifier fusion method to simplify SNR classification and therefore enhance DL learning and prediction. Specifically, multiple CNNs are used for classification rather than a single CNN. Each CNN performs a binary classification for a single SNR threshold with two labels: less than, or greater than or equal. Together, the multiple CNNs are combined to effectively classify SNR values over the range −20 ≤ SNR ≤ 32 dB. We use pre-trained CNNs to predict SNR over a wide range of joint channel parameters, including multiple Doppler shifts (0, 60, 120 Hz), power-delay profiles, and signal modulation types (QPSK, 16-QAM, 64-QAM). The approach achieves an individual SNR prediction accuracy of 92%, a composite accuracy of 70%, and prediction convergence one order of magnitude faster than that of traditional estimation.
Keywords: classification, CNN, deep learning, prediction, SNR
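A sketch of the classifier-fusion idea: each binary classifier answers "is SNR ≥ t?" for one threshold t, and the answers are fused into an SNR estimate. The 4 dB threshold grid, the vote-counting fusion rule, and the stubbed-out classifiers are assumptions for illustration; the paper's classifiers are trained CNNs and its exact fusion rule is not given in the abstract.

```python
import numpy as np

# Per-threshold binary classifiers: classifier k answers "is SNR >= thresholds[k]?".
# Each one is stubbed out with a noisy comparison; in the paper each is a trained CNN.
thresholds = np.arange(-20, 33, 4)          # assumed 4 dB grid over [-20, 32] dB

def binary_decisions(true_snr, rng):
    noisy = true_snr + rng.normal(0, 1.0)   # stand-in for the CNN outputs
    return (noisy >= thresholds).astype(int)

def fuse(decisions):
    """Map the vector of binary answers to an SNR class.

    With monotone answers, the number of "greater-or-equal" votes indexes
    the interval the SNR falls in; counting votes also tolerates a few
    inconsistent classifiers.
    """
    k = int(decisions.sum())
    return thresholds[k - 1] if k > 0 else thresholds[0] - 4

rng = np.random.default_rng(1)
print(fuse(binary_decisions(10.0, rng)))    # an SNR estimate close to the true 10 dB
```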
Procedia PDF Downloads 132
360 Effect of Vanadium Addition to Aluminum Grain Refined by Ti or Ti + B on Its Microstructure, Mechanical Behavior, Fatigue Strength and Life
Authors: Adnan I. O. Zaid
Abstract:
Aluminum solidifies in a columnar structure with a large grain size, which reduces its surface quality and mechanical strength; therefore it is normally grain refined with either titanium or titanium + boron (Ti or Ti + B). In this paper, the effect of adding either Ti or Ti + B to commercially pure aluminum on its grain size, Vickers hardness, mechanical strength, and fatigue strength and life is presented and discussed. Similarly, the effect of vanadium addition to Al grain refined by Ti or Ti + B is presented and discussed. Two binary master alloys, Al-Ti and Al-V, were prepared in the laboratory, from which five different micro-alloys, in addition to the commercially pure aluminum, namely Al-Ti, Al-Ti-B, Al-V, Al-Ti-V and Al-Ti-B-V, were prepared for the investigation. Finally, the effect of these additions on fatigue crack initiation and propagation, studied using a scanning electron microscope (SEM), is also presented and discussed. Photomicrographs and photoscans are included in the paper.
Keywords: aluminum, fatigue, grain refinement, titanium, titanium+boron, vanadium
Procedia PDF Downloads 484
359 Numerical Implementation and Testing of Fractioning Estimator Method for the Box-Counting Dimension of Fractal Objects
Authors: Abraham Terán Salcedo, Didier Samayoa Ochoa
Abstract:
This work presents a numerical implementation of a method, named the fractioning estimator, for estimating the box-counting dimension of self-avoiding curves in the plane: fractal objects captured in digital images. Classical digital image processing operations, such as noise filtering, contrast manipulation, and thresholding, among others, are used to obtain binary images that are suitable for the computations required by the fractioning estimator. A user interface is developed for performing the image processing operations and for testing the fractioning estimator on different captured images of real-life fractal objects. To analyze the results, the estimates obtained with the fractioning estimator are compared to those obtained with other methods already implemented in available software for computing and estimating the box-counting dimension.
Keywords: box-counting, digital image processing, fractal dimension, numerical method
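The fractioning estimator itself is not specified in the abstract, but the classical box-counting baseline it is compared against can be sketched as follows; the grid of box sizes and the synthetic test image are illustrative choices.

```python
import numpy as np

def box_counting_dimension(binary_img, box_sizes=(2, 4, 8, 16, 32)):
    """Classical box-counting estimate on a binary image (1 = object pixel).

    For each box size s, count boxes that contain at least one object pixel,
    then fit log(count) against log(1/s); the slope estimates the dimension.
    """
    counts = []
    h, w = binary_img.shape
    for s in box_sizes:
        n = 0
        for i in range(0, h, s):
            for j in range(0, w, s):
                if binary_img[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Sanity check on a filled square, whose dimension should be close to 2
img = np.ones((64, 64), dtype=np.uint8)
print(box_counting_dimension(img))   # ~2.0
```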
Procedia PDF Downloads 81
358 Optimal Design of the Power Generation Network in California: Moving towards 100% Renewable Electricity by 2045
Authors: Wennan Long, Yuhao Nie, Yunan Li, Adam Brandt
Abstract:
To fight climate change, the California government issued Senate Bill No. 100 (SB-100) in September 2018, which sets a target of 100% renewable electricity by the end of 2045. In this case study, a capacity expansion problem is solved using a binary quadratic programming model. The optimal locations and capacities of the potential renewable power plants (i.e., solar, wind, biomass, geothermal and hydropower), the phase-out schedule of the existing fossil-based (natural gas) power plants, and the transmission of electricity across the entire network are determined with the minimal total annualized cost measured by net present value (NPV). The results show that the renewable electricity contribution could increase to 85.9% by 2030 and reach 100% by 2035. Fossil-based power plants would be completely phased out around 2035, and solar and wind would become the dominant renewable energy resources in the California electricity mix.
Keywords: 100% renewable electricity, California, capacity expansion, mixed integer non-linear programming
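As a much-simplified illustration of a capacity-expansion decision with binary siting variables, the sketch below (a linear model solved with PuLP, whereas the paper formulates a binary quadratic program) selects which candidate plants to build to cover a demand at minimum annualized cost; all plant names, costs, and capacities are invented.

```python
import pulp

# Invented candidate plants: (annualized cost, capacity in MW)
plants = {"solar_A": (120, 300), "wind_B": (150, 400),
          "geo_C": (200, 250), "hydro_D": (90, 150)}
demand_mw = 600

prob = pulp.LpProblem("capacity_expansion", pulp.LpMinimize)
build = pulp.LpVariable.dicts("build", plants, cat="Binary")

# Objective: total annualized cost of the plants we decide to build
prob += pulp.lpSum(plants[p][0] * build[p] for p in plants)
# Constraint: built capacity must cover the demand
prob += pulp.lpSum(plants[p][1] * build[p] for p in plants) >= demand_mw

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({p: int(build[p].value()) for p in plants})
```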
Procedia PDF Downloads 169
357 Queering Alterity: Engaging Pluralism to Move Beyond Gender Binaries in the Classroom
Authors: A. K. O'Loughlin
Abstract:
In Simone de Beauvoir’s classic 1949 meditation, The Second Sex, she avows that 'On ne naît pas femme; on le devient,' translated most recently in the unabridged text (2010) as 'One is not born, but rather becomes, woman.' The signifier ‘woman’ in this context signifies Beauvoir’s contemplation of the institution, the concept of woman(ness) defined in relation to the binary and hegemonic man(ness). She is 'the other.' This paper is a theoretical contemplation of (1) how we actively teach 'othering' within the institution of schooling and (2) the new considerations of pluralism for self-reflection and subversion that teachers, in particular, are faced with. How, in schooling, do we learn one’s options for racialized, classed and sexualized gender identification, and the hierarchical signification that defines these signifiers? Just as with the myth of apolitical schooling, we cannot escape teaching social organization in the classroom. Yet we do have a choice. How do we as educators learn about our own embodied intersectionalities? How do we unlearn our own binaries? How do we teach about intersectional gender? How do we teach 'the other'? We posit that the process of such reflection by educators may move our classrooms beyond binaries, engage pluralism, and queer alterity itself.
Keywords: othering, alterity, education, schooling, identity, racialization, gender, intersectionality, pluralism
Procedia PDF Downloads 247
356 Formation of Chemical Compound Layer at the Interface of Initial Substances A and B with Dominance of Diffusion of the A Atoms
Authors: Pavlo Selyshchev, Samuel Akintunde
Abstract:
A theoretical approach is developed to describe the formation of a chemical compound layer at the interface between initial substances A and B due to interfacial interaction and diffusion. We consider the situation where the rate of interfacial interaction is large enough and the diffusion of A-atoms through the AB-layer is much faster than the diffusion of B-atoms. Atoms from the A-layer diffuse toward the B-atoms and form AB-atoms on the surface of the B-layer. B-atoms are assumed to be immobile. The growth kinetics of the AB-layer is described by two differential equations with non-linear coupling, producing a good fit to the experimental data. It is shown that the growth of the thickness of the AB-layer is determined by the dependence of the chemical reaction rate on the reactant concentrations. In special cases, the thickness of the AB-layer can grow linearly or parabolically, depending on which of the processes (interaction or diffusion) controls the growth. The thickness of the AB-layer as a function of time is obtained. The moment of time (transition point) at which the linear growth changes to parabolic is found.
Keywords: phase formation, binary systems, interfacial reaction, diffusion, compound layers, growth kinetics
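The paper's coupled equations are not reproduced in the abstract, but the linear-to-parabolic transition it describes can be illustrated with a standard mixed-control growth law; the rate constants below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# A standard mixed-control growth law (not the paper's coupled system):
# dX/dt = 1 / (1/k_r + X/k_d), where k_r is the interfacial-reaction rate
# constant and k_d the diffusion constant.  For small X the reaction term
# dominates (linear growth, X ~ k_r t); for large X diffusion dominates
# (parabolic growth, X ~ sqrt(2 k_d t)).
k_r, k_d = 1.0, 0.05

def growth(t, X):
    return [1.0 / (1.0 / k_r + X[0] / k_d)]

sol = solve_ivp(growth, (0.0, 10.0), [0.0], dense_output=True)
t = np.linspace(0.0, 10.0, 6)
print(np.round(sol.sol(t)[0], 3))        # layer thickness X(t)

# Transition point: roughly where the two resistances are equal, X ~ k_d / k_r
print("transition thickness ~", k_d / k_r)
```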
Procedia PDF Downloads 569
355 Machine Vision System for Measuring the Quality of Bulk Sun-dried Organic Raisins
Authors: Navab Karimi, Tohid Alizadeh
Abstract:
An intelligent vision-based system was designed to measure the quality and purity of raisins. A machine vision setup was used to capture images of bulk raisins with pure-impure mixtures in the range of 5-50%. The textural features of the bulk raisins were extracted using grey-level histograms, the grey-level co-occurrence matrix (GLCM), and the local binary pattern (LBP), for a total of 108 features. A genetic algorithm and neural network regression were used for selecting and ranking the best features (21 features). The GLCM feature set was found to have the highest accuracy (92.4%) among the feature sets. Subsequently, multiple feature combinations from the previous stage were fed into a second regression (linear regression) to increase accuracy, and a combination of 16 features was found to be optimal. Finally, a support vector machine (SVM) classifier was used to differentiate the mixtures, producing the best efficiency and accuracy of 96.2% and 97.35%, respectively.
Keywords: sun-dried organic raisin, genetic algorithm, feature extraction, ANN regression, linear regression, support vector machine, South Azerbaijan
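A small sketch of the texture-feature-plus-SVM pipeline, using scikit-image's LBP and GLCM routines and a scikit-learn SVM on synthetic patches. The feature subset, parameter choices, and toy data are assumptions for illustration and are far smaller than the 108-feature set used in the study.

```python
import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops
from sklearn.svm import SVC

def texture_features(gray):
    """Small subset of the texture descriptors mentioned in the abstract."""
    # Local Binary Pattern histogram (10 bins for the 'uniform' method, P=8)
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=np.arange(11), density=True)
    # A few GLCM statistics at distance 1, angle 0
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    glcm_stats = [graycoprops(glcm, p)[0, 0]
                  for p in ("contrast", "homogeneity", "energy")]
    return np.concatenate([lbp_hist, glcm_stats])

# Synthetic "pure" vs "impure" patches stand in for real raisin images
rng = np.random.default_rng(0)
pure = [rng.integers(100, 160, (32, 32), dtype=np.uint8) for _ in range(20)]
impure = [rng.integers(0, 256, (32, 32), dtype=np.uint8) for _ in range(20)]
X = np.array([texture_features(img) for img in pure + impure])
y = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))     # training accuracy on the toy data
```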
Procedia PDF Downloads 72
354 Bitplanes Gray-Level Image Encryption Approach Using Arnold Transform
Authors: Ali Abdrhman M. Ukasha
Abstract:
Data security is needed in data transmission, storage, and communication. The single step parallel contour extraction (SSPCE) method is used to create an edge map, which serves as a key image, from a gray-level or binary image. An XOR operation is performed between the key image and each bit plane of the original image to change the image pixel values. The Arnold transform is then used to change the locations of the image pixels as an image scrambling process. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D gray-level image and completely reconstruct it without any distortion. It is also shown that the analyzed algorithm has very high security against attacks such as salt & pepper noise and JPEG compression, proving that a gray-level image can be protected at a higher security level. The presented method allows easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
Keywords: SSPCE method, image compression, salt and pepper attacks, bitplanes decomposition, Arnold transform, lossless image encryption
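A compact sketch of the bit-plane XOR and Arnold-transform scrambling described above. The SSPCE edge-map key is not reproduced here; a random binary image stands in for it, and the image size and iteration count are illustrative.

```python
import numpy as np

def arnold(img, iterations=1, inverse=False):
    """Arnold cat map scrambling of a square N x N image.
    Forward map: (x, y) -> (x + y, x + 2y) mod N; the inverse map undoes it."""
    n = img.shape[0]
    out = img.copy()
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                if inverse:
                    nxt[(2 * x - y) % n, (y - x) % n] = out[x, y]
                else:
                    nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = nxt
    return out

rng = np.random.default_rng(0)
image = rng.integers(0, 256, (8, 8), dtype=np.uint8)
# Placeholder binary key image; in the paper this is the SSPCE edge map.
key = rng.integers(0, 2, (8, 8), dtype=np.uint8)

# Encryption: XOR every bit plane with the key, recombine, then scramble.
planes = [((image >> b) & 1) ^ key for b in range(8)]
mixed = sum(p << b for b, p in enumerate(planes)).astype(np.uint8)
cipher = arnold(mixed, iterations=3)

# Decryption reverses both steps and reconstructs the image without loss.
unscrambled = arnold(cipher, iterations=3, inverse=True)
recovered = sum((((unscrambled >> b) & 1) ^ key) << b for b in range(8)).astype(np.uint8)
print(np.array_equal(recovered, image))   # True
```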
Procedia PDF Downloads 434
353 Distributed Perceptually Important Point Identification for Time Series Data Mining
Authors: Tak-Chung Fu, Ying-Kit Hung, Fu-Lai Chung
Abstract:
In the field of time series data mining, the concept of the Perceptually Important Point (PIP) identification process was first introduced in 2001. The process was originally designed for financial time series pattern matching and was then found suitable for time series dimensionality reduction and representation. Its strength lies in preserving the overall shape of a time series by identifying its salient points. With the rise of Big Data, time series data contributes a major proportion, especially data generated by sensors in Internet of Things (IoT) environments. Given the nature of PIP identification and its successful applications, it is worth exploring the opportunity to apply PIP to time series 'Big Data'. However, the performance of PIP identification has always been considered the limitation when dealing with 'big' time series data. In this paper, two distributed versions of PIP identification based on the Specialized Binary (SB) Tree are proposed. The proposed approaches remove the bottleneck encountered when running the PIP identification process on a standalone computer. An improvement in terms of speed is obtained by the distributed versions.
Keywords: distributed computing, performance analysis, Perceptually Important Point identification, time series data mining
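A sequential (standalone) PIP identification baseline, using the vertical-distance variant: starting from the two endpoints, the point with the largest vertical distance to the chord between its adjacent selected points is added repeatedly. The SB-tree data structure and the distributed versions proposed in the paper are not shown here.

```python
import numpy as np

def pip_identification(series, n_pips):
    """Sequential PIP identification (vertical-distance variant).

    Starts with the two endpoints and repeatedly adds the point with the
    largest vertical distance to the chord joining its adjacent selected
    points.  This is the standalone baseline the paper distributes.
    """
    x = np.arange(len(series), dtype=float)
    selected = [0, len(series) - 1]
    while len(selected) < n_pips:
        best_gain, best_idx = -1.0, None
        pts = sorted(selected)
        for left, right in zip(pts[:-1], pts[1:]):
            for i in range(left + 1, right):
                # vertical distance of point i to the chord (left, right)
                t = (x[i] - x[left]) / (x[right] - x[left])
                chord = series[left] + t * (series[right] - series[left])
                d = abs(series[i] - chord)
                if d > best_gain:
                    best_gain, best_idx = d, i
        if best_idx is None:
            break
        selected.append(best_idx)
    return sorted(selected)

ts = np.array([0.0, 0.2, 1.5, 0.3, 0.1, 2.0, 0.4, 0.0])
print(pip_identification(ts, 4))   # indices of the 4 most salient points
```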
Procedia PDF Downloads 431
352 Robust Heart Sounds Segmentation Based on the Variation of the Phonocardiogram Curve Length
Authors: Mecheri Zeid Belmecheri, Maamar Ahfir, Izzet Kale
Abstract:
Automatic cardiac auscultation is still a subject of research aimed at establishing an objective diagnosis. Heart sounds recorded as phonocardiogram (PCG) signals can be automatically segmented into components that have clinical meaning: the first sound, S1, the second sound, S2, and the systolic and diastolic components, respectively. In this paper, an automatic method is proposed for the robust segmentation of heart sounds. The method is based on calculating an intermediate sawtooth-shaped signal from the curve length variation of the recorded PCG signal in the time domain and using its positive derivative function, which is a binary signal, to train a Recurrent Neural Network (RNN). Results obtained on a large database of PCGs recorded simultaneously with ElectroCardioGrams (ECGs) from different patients in clinical settings, including normal and abnormal subjects, show an average segmentation testing performance of 76% sensitivity and 94% specificity.
Keywords: heart sounds, PCG segmentation, event detection, recurrent neural networks, PCG curve length
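A sketch of the curve-length signal on which the segmentation is based: a sliding-window curve length of the PCG followed by binarization of its positive derivative. The window size, the synthetic PCG, and the exact construction of the sawtooth signal are assumptions; the RNN training stage is omitted.

```python
import numpy as np

def curve_length_signal(pcg, window=64):
    """Sliding-window curve length of a PCG signal.

    The curve length over a window is the sum of sqrt(1 + diff^2) of the
    samples inside it; louder, higher-frequency segments (S1, S2) yield
    larger values.  The window size is an illustrative choice.
    """
    increments = np.sqrt(1.0 + np.diff(pcg) ** 2)
    kernel = np.ones(window)
    return np.convolve(increments, kernel, mode="same")

# Synthetic PCG stand-in: quiet baseline with two noisy bursts ("S1", "S2")
rng = np.random.default_rng(0)
pcg = 0.01 * rng.normal(size=2000)
pcg[300:400] += 0.5 * rng.normal(size=100)
pcg[900:1000] += 0.5 * rng.normal(size=100)

cl = curve_length_signal(pcg)
# Binary signal from the positive derivative of the curve-length envelope
binary = (np.diff(cl) > 0).astype(int)
print(binary[:10], binary.sum())
```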
Procedia PDF Downloads 177
351 Bitplanes Image Encryption/Decryption Using Edge Map (SSPCE Method) and Arnold Transform
Authors: Ali A. Ukasha
Abstract:
Data security is needed in data transmission, storage, and communication. The single step parallel contour extraction (SSPCE) method is used to create an edge map, which serves as a key image, from a gray-level or binary image. An XOR operation is performed between the key image and each bit plane of the original image to change the image pixel values. The Arnold transform is then used to change the locations of the image pixels as an image scrambling process. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D gray-level image and completely reconstruct it without any distortion. It is also shown that the analyzed algorithm has very high security against attacks such as salt & pepper noise and JPEG compression, proving that a gray-level image can be protected at a higher security level. The presented method allows easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
Keywords: SSPCE method, image compression, salt and pepper attacks, bitplanes decomposition, Arnold transform, lossless image encryption
Procedia PDF Downloads 494
350 Scoring System for the Prognosis of Sepsis Patients in Intensive Care Units
Authors: Javier E. García-Gallo, Nelson J. Fonseca-Ruiz, John F. Duitama-Munoz
Abstract:
Sepsis is a syndrome that involves physiological and biochemical abnormalities induced by severe infection and carries high mortality and morbidity; therefore the severity of the patient's condition must be assessed quickly. After admission to an intensive care unit (ICU), it is necessary to synthesize the large volume of information collected from a patient into a value that represents the severity of their condition. Traditional severity-of-illness scores seek to be applicable to all patient populations and usually assess in-hospital mortality. However, the use of machine learning techniques and the data of a population that shares a common characteristic could lead to the development of customized mortality prediction scores with better performance. This study presents the development of a score for one-year mortality prediction of patients admitted to an ICU with a sepsis diagnosis. 5650 ICU admissions extracted from the MIMIC-III database were evaluated, divided into two groups: 70% to develop the score and 30% to validate it. Comorbidities, demographics and clinical information from the first 24 hours after ICU admission were used to develop the mortality prediction score. LASSO (least absolute shrinkage and selection operator) and SGB (Stochastic Gradient Boosting) variable importance methodologies were used to select the set of variables that make up the score; each of these variables was dichotomized, and a cut-off point that divides the population into two groups with different mean mortalities was found; if the patient is in the group with the higher mortality, a one is assigned to the particular variable, otherwise a zero is assigned. These binary variables are used in a logistic regression (LR) model, and its coefficients were rounded to the nearest integer. The resulting integers are the point values that make up the score when multiplied by each binary variable and summed. The one-year mortality probability was estimated using the score as the only variable in a LR model. The predictive power of the score was evaluated using the 1695 admissions of the validation subset, obtaining an area under the receiver operating characteristic curve of 0.7528, which outperforms the results obtained with the Sequential Organ Failure Assessment (SOFA), Oxford Acute Severity of Illness Score (OASIS) and Simplified Acute Physiology Score II (SAPS II) scores on the same validation subset. Observed and predicted mortality rates within deciles of estimated probability were compared graphically and found to be similar, indicating that the risk estimate obtained with the score is close to the observed mortality; the number of events (deaths) also increases as the outcome moves from the decile with the lowest probabilities to the decile with the highest probabilities. Sepsis is a syndrome that carries a high mortality, 43.3% for the patients included in this study; therefore, tools that help clinicians quickly and accurately predict a worse prognosis are needed. This work demonstrates the importance of customizing mortality prediction scores, since the developed score provides better performance than traditional scoring systems.
Keywords: intensive care, logistic regression model, mortality prediction, sepsis, severity of illness, stochastic gradient boosting
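The dichotomize, fit, round-to-integer, and sum procedure described above can be sketched as follows; the two invented binary indicators and synthetic outcomes stand in for the LASSO/SGB-selected variables from MIMIC-III.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented cohort: two dichotomized risk indicators (e.g. "lactate above
# cut-off", "age above cut-off"); the real score uses variables selected
# with LASSO / SGB importance.
rng = np.random.default_rng(0)
n = 1000
X = rng.integers(0, 2, size=(n, 2))
logit = -2.0 + 1.2 * X[:, 0] + 0.7 * X[:, 1]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

# Step 1: logistic regression on the binary variables
lr = LogisticRegression().fit(X, y)
# Step 2: round each coefficient to the nearest integer to get point values
points = np.rint(lr.coef_[0]).astype(int)
print("points per variable:", points)

# Step 3: a patient's score is the sum of points for the indicators present
score = X @ points
# Step 4: calibrate one-year mortality with the score as the only predictor
calib = LogisticRegression().fit(score.reshape(-1, 1), y)
print("predicted mortality per score value:",
      calib.predict_proba(np.arange(points.sum() + 1).reshape(-1, 1))[:, 1].round(3))
```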
Procedia PDF Downloads 221
349 Heteromolecular Structure Formation in Aqueous Solutions of Ethanol, Tetrahydrofuran and Dimethylformamide
Authors: Sh. Gofurov, O. Ismailova, U. Makhmanov, A. Kokhkharov
Abstract:
The refractometric method has been used to determine the optical properties and concentration features of aqueous solutions of ethanol, tetrahydrofuran and dimethylformamide at room temperature. Changes in the dielectric permittivity of aqueous solutions of ethanol, tetrahydrofuran and dimethylformamide over a wide range of concentrations (0÷1.0 molar fraction) have been studied using the molecular dynamics method. The concentration-dependence curves of the experimental excess refractive indices and the excess dielectric permittivity were compared. It has been shown that stable heteromolecular complexes in the binary solutions are formed in the concentration range of 0.3÷0.4 mole fractions. The real and imaginary parts of the dielectric permittivity were obtained from the dipole-dipole autocorrelation functions of the molecules. At concentrations of C = 0.3÷0.4 m.f., heteromolecular structures with hydrogen bonds are formed. This is confirmed by the extremum values of the excess dielectric permittivity and excess refractive index of the aqueous solutions.
Keywords: refractometric method, aqueous solution, molecular dynamics, dielectric constant
Procedia PDF Downloads 261
348 Multi-Spectral Medical Images Enhancement Using Weber's Law
Authors: Muna F. Al-Sammaraie
Abstract:
The aim of this research is to present multi-spectral image enhancement methods, motivated by the fact that a typical digital image populates only a small portion of the available range of digital values. A quantitative measure of image enhancement is also presented; this measure is related to concepts of Weber's law of the human visual system. For decades, several image enhancement techniques have been proposed; although most techniques require numerous advanced and critical steps, the results for the perceived image are often not satisfactory. This study involves changing the original values so that more of the available range is used, which then increases the contrast between features and their backgrounds. It consists of reading the binary image pixel by pixel, byte-wise, and displaying it; calculating the statistics of the image; automatically enhancing the color of the image based on the calculated statistics using the proposed algorithms; and working with the RGB color bands. Finally, the enhanced image is displayed along with its histogram. A number of experimental results illustrate the performance of these algorithms; in particular, the quantitative measure has helped to select the optimal processing parameters and transforms.
Keywords: image enhancement, multi-spectral, RGB, histogram
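A minimal sketch of the range-stretching idea described above: a plain linear contrast stretch applied to each RGB band. The Weber-law-based enhancement measure itself is not reproduced here, and the toy image is invented.

```python
import numpy as np

def stretch_band(band):
    """Linearly map the occupied value range of one band to the full 0-255 range."""
    lo, hi = band.min(), band.max()
    if hi == lo:                       # flat band: nothing to stretch
        return band.copy()
    return ((band.astype(float) - lo) * 255.0 / (hi - lo)).astype(np.uint8)

def enhance_rgb(img):
    """Apply the stretch to each RGB band independently."""
    return np.dstack([stretch_band(img[..., c]) for c in range(3)])

# Toy low-contrast image occupying only values 100-140
rng = np.random.default_rng(0)
img = rng.integers(100, 141, size=(4, 4, 3), dtype=np.uint8)
out = enhance_rgb(img)
print(img[..., 0].min(), img[..., 0].max(), "->", out[..., 0].min(), out[..., 0].max())
```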
Procedia PDF Downloads 326
347 Clustering Based Level Set Evaluation for Low Contrast Images
Authors: Bikshalu Kalagadda, Srikanth Rangu
Abstract:
The main objective of image segmentation is to extract objects with respect to some input features. One of the important methods for image segmentation is the level set method. Medical images and synthetic images generally have low-contrast pixel profiles, and in such images it is difficult to locate the features of interest. The conventional level set function develops irregularities during the evolution of the object contour, which destroy the stability of the evolution process. As a remedy for this problem, a new hybrid algorithm, Clustering Level Set Evolution, is proposed. Kernel fuzzy particle swarm optimization clustering is used together with the Distance Regularized Level Set (DRLS) and the Selective Binary and Gaussian Filtering Regularized Level Set (SBGFRLS) methods. The ability to identify different regions becomes easier, with improved speed. The efficiency of the modified method can be evaluated by comparing it with the previous methods under similar specifications, and the comparison can be carried out on medical and synthetic images.
Keywords: segmentation, clustering, level set function, re-initialization, kernel fuzzy, swarm optimization
Procedia PDF Downloads 351
346 Flutter Control Analysis of an Aircraft Wing Using Carbon Nanotubes Reinforced Polymer
Authors: Timothee Gidenne, Xia Pinqi
Abstract:
In this paper, an investigation of the use of carbon nanotube (CNT) reinforced polymer as an actuator for active flutter suppression is conducted. The goal of this analysis is to establish a link between the behavior of the control surface and the actuators, to demonstrate the feasibility of using such a suppression system in the aeronautical field. A preliminary binary flutter model using simplified unsteady aerodynamics is developed to study the behavior of the wing as it approaches the flutter speed and when the control system suppresses the flutter phenomenon. The Timoshenko beam theory for bilayer materials is used to match the response of the control surface with the CNT reinforced polymer (CNRP) actuators. According to the Timoshenko theory, the results show a good and realistic response for this purpose. Even if the results are still preliminary, they show evidence of the potential use of CNRP for control-surface actuation in small-scale and lightweight systems.
Keywords: actuators, aeroelastic, aeroservoelasticity, carbon nanotubes, flutter, flutter suppression
Procedia PDF Downloads 128
345 Solving Dimensionality Problem and Finding Statistical Constructs on Latent Regression Models: A Novel Methodology with Real Data Application
Authors: Sergio Paez Moncaleano, Alvaro Mauricio Montenegro
Abstract:
This paper presents a novel statistical methodology for measuring and identifying constructs in latent regression analysis. The approach uses the qualities of factor analysis for binary data with interpretations from Item Response Theory (IRT). In addition, based on the fundamentals of submodel theory and a convergence of many ideas from IRT, we propose an algorithm that not only addresses the dimensionality problem (still an open discussion) but also opens a new research field that promises fairer and more realistic qualifications for examinees and a revolution in IRT and educational research. Finally, the methodology is applied to a real data set, with impressive results in terms of coherence, speed and precision. Acknowledgments: This research was financed by Colciencias through the project 'Multidimensional Item Response Theory Models for Practical Application in Large Tests Designed to Measure Multiple Constructs'; both authors belong to the SICS Research Group at Universidad Nacional de Colombia.
Keywords: item response theory, dimensionality, submodel theory, factorial analysis
Procedia PDF Downloads 371
344 Telemedicine for Substance-Related Disorders: A Patient Satisfaction Survey among Individuals in Argentina
Authors: Badino Manuel, Farias Maria Alejandra
Abstract:
Telemedicine (TM) has the potential to deliver quality health care services and outcomes efficiently and cost-effectively, showing equal or, in some cases, better results than in-person treatment. Analyzing patient satisfaction with the use of TM is relevant because satisfaction can affect the results of treatment and adherence to it. The aim is to assess patient satisfaction with telemedicine for treating substance-related disorders in a mental health service in Córdoba, Argentina. A descriptive cross-sectional study was conducted among patients with substance-related disorders (N=115), and a patient satisfaction survey was conducted from December 2021 to March 2022. Of the 115 participants, 59.1% were male, 38.3% female and 2.6% non-binary. In relation to educational status, 40% had finished university, 39.1% high school, and 20.9% only primary school. Regarding age, 4.3% were young, 92.2% were adults, and 3.5% were elderly. Regarding TM treatment, 95.7% reported being satisfied. Furthermore, 85.2% of users declared that they would continue TM treatment, and 14.8% said that they would not resume TM treatment. To conclude, high levels of patient satisfaction contribute to the continuity of the TM modality.
Keywords: telemedicine, mental health, substance-related disorders, patient satisfaction
Procedia PDF Downloads 107
343 Complex Decision Rules in the Form of Decision Trees
Authors: Avinash S. Jagtap, Sharad D. Gore, Rajendra G. Gurao
Abstract:
Decision rules become more and more complex as the number of conditions increases. As a consequence, the complexity of the decision rule also influences the time complexity of its computer implementation. Consider, for example, a decision that depends on four conditions A, B, C and D. For simplicity, suppose each of these four conditions is binary. Even then the decision rule will consist of 16 lines, where each line is of the form: if A and B and C and D, then action 1; if A and B and C but not D, then action 2; and so on. While executing this decision rule, each of the four conditions is checked every time until all four conditions in a line are satisfied. The minimum number of logical comparisons is 4, whereas the maximum number is 64. This paper proposes to represent a complex decision rule in the form of a decision tree. A decision tree divides the cases into branches every time a condition is checked. In the form of a decision tree, every branching eliminates half of the cases that do not satisfy the related condition. As a result, every branch of the decision tree involves only four logical comparisons and hence is significantly simpler than the corresponding flat decision rule. The conclusion of this paper is that every complex decision rule can be represented as a decision tree, and the decision tree is mathematically equivalent to, but computationally much simpler than, the original complex decision rule.
Keywords: strategic, tactical, operational, adaptive, innovative
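A small sketch contrasting the flat 16-line rule with its decision-tree form for four binary conditions, counting logical comparisons in each case; the action labels and the lookup-table representation of the tree are illustrative.

```python
from itertools import product

ACTIONS = {bits: f"action {i + 1}" for i, bits in
           enumerate(product([True, False], repeat=4))}  # the 16 rule lines

def flat_rule(a, b, c, d):
    """Flat rule list: scan the 16 lines, re-checking all 4 conditions per line."""
    comparisons = 0
    for bits, action in ACTIONS.items():
        comparisons += 4                       # A, B, C, D checked on every line
        if (a, b, c, d) == bits:
            return action, comparisons
    raise AssertionError("unreachable: all 16 combinations are covered")

def decision_tree(a, b, c, d):
    """Decision tree: each condition is checked exactly once along one path."""
    return ACTIONS[(a, b, c, d)], 4            # depth-4 path = 4 comparisons

case = (False, False, False, True)
print(flat_rule(*case))       # ('action 15', 60): near the worst case of 64
print(decision_tree(*case))   # ('action 15', 4)
```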
Procedia PDF Downloads 284
342 Probabilistic Gathering of Agents with Simple Sensors: Distributed Algorithm for Aggregation of Robots Equipped with Binary On-Board Detectors
Authors: Ariel Barel, Rotem Manor, Alfred M. Bruckstein
Abstract:
We present a probabilistic gathering algorithm for agents that can only detect the presence of other agents in front of or behind them. The agents act in the plane and are identical and indistinguishable, oblivious, and lack any means of direct communication. They do not have a common frame of reference in the plane and choose their orientation (direction of possible motion) at random. The analysis of the gathering process assumes that the agents act synchronously, selecting random orientations that remain fixed during each unit time interval. Two algorithms are discussed. The first assumes discrete jumps based on the sensing results given the randomly selected motion direction; in this case, extensive experimental results exhibit probabilistic clustering into a circular region with radius equal to the step size in time proportional to the number of agents. The second algorithm assumes agents with continuous sensing and motion; in this case, we can prove gathering into a very small circular region in finite expected time.
Keywords: control, decentralized, gathering, multi-agent, simple sensors
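A toy synchronous simulation in the spirit of the first (discrete-jump) algorithm. The abstract does not spell out the jump rule, so the rule used below (step toward the half-plane containing other agents, stay put when both or neither half-plane is occupied) is an illustrative guess, as are all parameter values.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, step, n_rounds = 30, 1.0, 2000
pos = rng.uniform(-50, 50, size=(n_agents, 2))

def spread(p):
    """Maximum distance of any agent from the current centroid."""
    return np.linalg.norm(p - p.mean(axis=0), axis=1).max()

for _ in range(n_rounds):
    theta = rng.uniform(0, 2 * np.pi, n_agents)          # random orientations
    headings = np.stack([np.cos(theta), np.sin(theta)], axis=1)
    new_pos = pos.copy()
    for i in range(n_agents):
        rel = np.delete(pos, i, axis=0) - pos[i]
        proj = rel @ headings[i]                          # signed distances along heading
        ahead, behind = (proj > 0).any(), (proj < 0).any()
        if ahead and not behind:
            new_pos[i] = pos[i] + step * headings[i]
        elif behind and not ahead:
            new_pos[i] = pos[i] - step * headings[i]
    pos = new_pos

print("max distance from centroid after simulation:", round(spread(pos), 2))
```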
Procedia PDF Downloads 162
341 Estimating Visitor’s Willingness to Pay for the Conservation Fund: Sustainable Financing Approach in Protected Areas in Ethiopia
Authors: Sintayehu Aynalem Aseres, Raminder Kaur Sira
Abstract:
Protected areas are increasingly confronted with inadequate conservation funds, which makes it difficult to reverse their ongoing degradation. The problem is even more severe in developing countries, where protected areas (PAs) are mainly government-administered. Consequently, a strong effort is needed to strengthen the self-financing capability of PAs by developing alternative sources of sustainable financing to realize the conservation goals, in particular to save the remaining natural heritage. This study, therefore, was designed to estimate visitors' willingness to pay (WTP) additional conservation fees using a contingent valuation method. The relationship between WTP and both socio-demographic and non-economic factors was scrutinized with binary logistic regression. The mean WTP of foreign visitors was estimated at US$ 7.4 and that of domestic visitors at US$ 1, with an annual aggregate revenue of US$ 29,200. WTP was strongly influenced by income, satisfaction, environmental concern and attitude. The study has policy implications for conservationists and park authorities in estimating the non-use values of PAs and developing market-based conservation instruments.
Keywords: conservation, ecotourism, sustainable financing, willingness to pay, protected areas, Bale Mountains National Park
Procedia PDF Downloads 159
340 A High Compression Ratio for a Lossless Image Compression Based on the Arithmetic Coding with the Sorted Run Length Coding: Meteosat Second Generation Image Compression
Authors: Cherifi Mehdi, Lahdir Mourad, Ameur Soltane
Abstract:
Image compression is at the heart of several multimedia techniques; it is used to reduce the number of bits required to represent an image. The Meteosat Second Generation (MSG) satellite acquires 12 image files every 15 minutes, which results in large database sizes. In this paper, a novel image compression method based on arithmetic coding with Sorted Run Length Coding (SRLC) for MSG images is proposed. The SRLC stage finds the runs of consecutive identical pixels in the original image and produces a sorted run representation. The arithmetic coding stage then encodes the sorted data of the previous stage into a unique code word that represents the binary code stream in the sorted order, boosting the compression ratio. This article shows that our method gives the best results in terms of compression ratio and bit rate, unlike the method based on plain Run Length Coding (RLC) and arithmetic coding. Evaluation criteria such as the compression ratio and the bit rate confirm the efficiency of our image compression method.
Keywords: image compression, arithmetic coding, Run Length Coding, RLC, Sorted Run Length Coding, SRLC, Meteosat Second Generation, MSG
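A sketch of the run-length stage: plain RLE of a flattened image, followed by one plausible reading of the "sorted run" reordering (the abstract does not define the sort key precisely). The arithmetic-coding stage is omitted.

```python
import numpy as np

def run_length_encode(flat):
    """Plain RLE: list of (value, run_length) pairs over a flattened image."""
    runs, start = [], 0
    for i in range(1, len(flat) + 1):
        if i == len(flat) or flat[i] != flat[start]:
            runs.append((int(flat[start]), i - start))
            start = i
    return runs

def sorted_run_length_encode(flat):
    """One plausible reading of SRLC: the same runs, reordered into sorted
    order (by pixel value, then run length) so that the downstream
    arithmetic coder sees a more predictable stream."""
    return sorted(run_length_encode(flat), key=lambda r: (r[0], r[1]))

image = np.array([[10, 10, 10, 12],
                  [12, 12, 10, 10]], dtype=np.uint8)
flat = image.ravel()
print(run_length_encode(flat))         # [(10, 3), (12, 3), (10, 2)]
print(sorted_run_length_encode(flat))  # [(10, 2), (10, 3), (12, 3)]
```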
Procedia PDF Downloads 351
339 Cost Effective Real-Time Image Processing Based Optical Mark Reader
Authors: Amit Kumar, Himanshu Singal, Arnav Bhavsar
Abstract:
In this modern era of automation, most academic and competitive exams use Multiple Choice Questions (MCQs). The responses to these MCQ-based exams are recorded on Optical Mark Reader (OMR) sheets. Evaluating OMR sheets requires separate, specialized machines for scanning and marking, and the sheets used by these machines are special and cost more than a normal sheet. The available process is uneconomical and depends on paper thickness, scanning quality, paper orientation, special hardware, and customized software. This study tackles the problem of evaluating OMR sheets without any special hardware, making the whole process economical. We propose an image processing based algorithm that can be used to read and evaluate scanned OMR sheets with no special hardware required. It eliminates the use of special OMR sheets; responses recorded on a normal sheet are enough for evaluation. The proposed system takes care of color, brightness, rotation, and small imperfections in the OMR sheet images.
Keywords: OMR, image processing, Hough circle transform, interpolation, detection, binary thresholding
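A sketch of the bubble-detection step using OpenCV's Hough circle transform and Otsu thresholding, in the spirit of the keywords listed above. The parameter values, the fill-ratio rule, and the synthetic sheet are illustrative assumptions rather than the values tuned in the paper.

```python
import cv2
import numpy as np

def read_omr(gray, fill_ratio=0.5):
    """Detect answer bubbles with the Hough circle transform and decide which
    are filled by thresholding inside each detected circle."""
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=20,
                               param1=80, param2=20, minRadius=8, maxRadius=20)
    if circles is None:
        return []
    # Binary image: marked (dark) pixels become 1
    _, binary = cv2.threshold(gray, 0, 1, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    marked = []
    for x, y, r in np.round(circles[0]).astype(int):
        mask = np.zeros_like(binary)
        cv2.circle(mask, (x, y), r, 1, thickness=-1)
        filled = binary[mask == 1].mean() if mask.any() else 0.0
        marked.append(((x, y), filled >= fill_ratio))
    return marked

# Synthetic sheet: one empty bubble outline and one filled bubble
sheet = np.full((100, 200), 255, dtype=np.uint8)
cv2.circle(sheet, (50, 50), 12, 0, thickness=2)    # unmarked bubble
cv2.circle(sheet, (150, 50), 12, 0, thickness=-1)  # marked (filled) bubble
print(read_omr(sheet))
```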
Procedia PDF Downloads 171
338 The Impact of the Parking Spot's Surroundings on Charging Decision: A Data-Driven Approach
Authors: Xizhen Zhou, Yanjie Ji
Abstract:
The charging behavior of drivers provides a reference for the planning and management of charging facilities. Based on real trajectory data of electric vehicles, this study explored the influence of the surroundings of the parking spot on charging decisions. The built environment, the condition of the vehicle, and the nearest charging station were all considered, and a mixed binary logit model was used to capture the impact of unobserved heterogeneity. The results show that the number of fast chargers in the charging station, the parking price, the dwell time, and shopping services all have a significant impact on the charging decision, while leisure services, scenic spots, and the mileage since the last charge have the opposite effect. In addition, factors related to unobserved heterogeneity include the number of fast chargers, parking and charging prices, residential areas, etc. The interaction effects of the random parameters further illustrate the complexity of charging choice behavior. The results provide insights for planning and managing charging facilities.
Keywords: charging decision, trajectory, electric vehicle, infrastructure, mixed logit
Procedia PDF Downloads 69
337 Density Measurement of Mixed Refrigerants R32+R1234yf and R125+R290 from 0°C to 100°C and at Pressures up to 10 MPa
Authors: Xiaoci Li, Yonghua Huang, Hui Lin
Abstract:
Optimizing the concentration of components in mixed refrigerants can potentially improve either the thermodynamic cycle performance or the safety performance of heat pumps and refrigerators. R32+R1234yf and R125+R290 are two promising binary mixed refrigerants for heat pumps working in cold areas. The p-ρ-T data of these mixtures are among the fundamental properties necessary for the design and performance evaluation of such heat pumps. Although the property data of mixtures can be predicted by mixing models based on the pure substances, as incorporated in programs such as the NIST Refprop database, direct property measurement is still helpful to reveal the true state behavior and verify the models. Densities of the mixtures R32+R1234yf and R125+R290 were measured with an Anton Paar DMA-4500 U-shaped oscillating-tube digital densimeter over the temperature range from 0°C to 100°C and at pressures up to 10 MPa. The accuracy of the measurement reaches 0.00005 g/cm³. The experimental data are compared with the predictions of Refprop over the corresponding range of pressure and temperature.
Keywords: mixed refrigerant, density measurement, densimeter, thermodynamic property
Procedia PDF Downloads 294
336 Determinants of Child Anthropometric Indicators: A Case Study of Mali in 2015
Authors: Davod Ahmadigheidari
Abstract:
The main objective of this study was to explore the prevalence of anthropometric indicators, as well as the factors associated with these indicators, in Mali. Data from 2015, downloaded from the UNICEF website, were analyzed. A total of 16,467 women (ages 15-49 years) and 16,467 children (ages 0-59 months) were selected for the sample. Different statistical analyses, such as descriptive statistics, cross-tabulations and binary logistic regression, form the basis of this study. Child anthropometric indicators (i.e., wasting, stunting, underweight and BMI-for-age) were used as the dependent variables. SPSS syntax from the WHO was used to create the anthropometric indicators. Factors such as the child's sex, the child's age group, the child's disease symptoms (i.e., diarrhea, cough and fever), maternal education, household wealth index and area of residence were used as independent variables. The results showed that more than forty percent of Malian households were in nutritional crisis (stunting 42% and underweight 34%). Findings from the logistic regression analyses indicated that a low wealth index score, low maternal education and experience of diarrhea in the last two weeks increase the probability of child malnutrition.
Keywords: Mali, wasting, stunting, underweight, BMI for age, wealth index
Procedia PDF Downloads 153
335 Building Green Infrastructure Networks Based on Cadastral Parcels Using Network Analysis
Authors: Gon Park
Abstract:
Seoul, South Korea, established the 2030 Seoul City Master Plan, which contains green-link projects to connect critical green areas within the city. However, the plan does not include detailed analyses of green infrastructure that incorporate land-cover information into structural classes. This study maps the green infrastructure networks of Seoul to complement those green plans by identifying and ranking green areas. Hubs and links, the main elements of green infrastructure, were identified by incorporating cadastral data of 967,502 parcels into 135 land use maps using a geographic information system. Network analyses were used to rank the hubs and links of the green infrastructure map by applying a force-directed algorithm, weighted values, and binary relationships, with metrics of density, distance, and centrality. The results indicate that network analyses using cadastral parcel data can be used as a framework to identify and rank hubs, links, and networks for green infrastructure planning under various scenarios of green areas in cities.
Keywords: cadastral data, green infrastructure, network analysis, parcel data
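A toy sketch of the ranking step with NetworkX: hubs and links form a small weighted graph, centrality metrics rank the hubs, and a force-directed (spring) layout is computed. The node names, weights, and ranking rule are invented for illustration.

```python
import networkx as nx

# Toy green-infrastructure graph: nodes are candidate hubs (green areas),
# edges are candidate links; names and weights are invented.
G = nx.Graph()
G.add_weighted_edges_from([
    ("park_A", "park_B", 1.0), ("park_B", "river_C", 2.5),
    ("river_C", "forest_D", 1.5), ("park_A", "forest_D", 3.0),
    ("river_C", "wetland_E", 1.0),
])

# Rank hubs by centrality metrics; edge weights act as distances for
# betweenness, mirroring the distance/centrality metrics in the study.
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G, weight="weight")
ranking = sorted(G.nodes, key=lambda n: (betweenness[n], degree[n]), reverse=True)
print(ranking)

# A force-directed (spring) layout, as mentioned in the abstract, can be
# computed for visualisation:
pos = nx.spring_layout(G, seed=42)
print({n: pos[n].round(2) for n in ranking[:2]})
```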
Procedia PDF Downloads 204
334 Wireless Transmission of Big Data Using Novel Secure Algorithm
Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha
Abstract:
This paper presents a novel algorithm for the secure, reliable and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay and destination nodes. Big data has to be transmitted from source to relay and from relay to destination with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The proposed algorithm, which ensures secure and energy-balanced transmission of big data, includes selecting the data transmission region, segmenting the selected region, determining the probability ratio for each node (capture node, non-capture node and eavesdropper node) in every segment, and evaluating the probability using a binary-based evaluation. If the transmission is secure, the two-hop transmission of big data proceeds; otherwise, the attackers are countered by the cooperative jamming scheme and the data is then transmitted over the two hops.
Keywords: big data, two-hop transmission, physical layer wireless security, cooperative jamming, energy balance
Procedia PDF Downloads 486