Search results for: equivalent transformation algorithms

3795 Measuring Delay Using Software Defined Networks: Limitations, Challenges, and Suggestions for Openflow

Authors: Ahmed Alutaibi, Ganti Sudhakar

Abstract:

Providing better Quality-of-Service (QoS) to end users has been a challenging problem for researchers and service providers. Building applications on best-effort network protocols hindered the adoption of guaranteed service parameters and, ultimately, Quality of Service. The introduction of Software Defined Networking (SDN) opened the door for a paradigm shift towards more controlled, programmable, configurable network behavior. OpenFlow has been, and still is, the main implementation of the SDN vision. To facilitate better QoS for applications, the network must calculate and measure certain parameters. One of those parameters is the delay between the two ends of a connection. Using the power of SDN and knowledge of application and network behavior, SDN networks can adjust to different conditions and specifications. In this paper, we use the capabilities of SDN to implement multiple algorithms that measure delay end-to-end, not only inside the SDN network. Applying the algorithms in an emulated environment shows that we can obtain measurements close to the emulated delay. The results also show that the load on the network and controller differs depending on the algorithm, and that the transport-layer handshake algorithm performs best among those tested. From these results and our implementation, we identify limitations of OpenFlow and develop suggestions to solve them.
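
For illustration only (not the authors' code), a minimal Python sketch of the probe-timestamp technique commonly used in OpenFlow delay measurement: the round trip of a probe injected by the controller is measured, and the control-channel legs, estimated from OFPEcho round trips, are subtracted. Function names and timing values are assumptions.

```python
# Illustrative sketch of controller-side delay estimation. All timestamps
# are assumed to be collected by the controller, in seconds.

def echo_rtt(t_echo_sent: float, t_echo_reply: float) -> float:
    """Round-trip time of an OFPEcho exchange between controller and switch."""
    return t_echo_reply - t_echo_sent

def estimate_link_delay(t_probe_out: float, t_probe_in: float,
                        rtt_ctrl_s1: float, rtt_ctrl_s2: float) -> float:
    """Estimate one-way delay of the link s1 -> s2.

    The probe travels controller -> s1 -> link -> s2 -> controller,
    so the control-channel legs (half of each echo RTT) are subtracted.
    """
    total = t_probe_in - t_probe_out
    return total - (rtt_ctrl_s1 / 2.0) - (rtt_ctrl_s2 / 2.0)

# Example: a 9.8 ms probe round trip with 2 ms and 3 ms control-channel RTTs
delay = estimate_link_delay(0.0, 0.0098, 0.002, 0.003)
print(f"estimated link delay: {delay * 1000:.2f} ms")  # ~7.30 ms
```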

Keywords: software defined networking, quality of service, delay measurement, openflow, mininet

Procedia PDF Downloads 144
3794 Corporate Digital Responsibility in Construction Engineering-Construction 4.0: Ethical Guidelines for Digitization and Artificial Intelligence

Authors: Weber-Lewerenz Bianca

Abstract:

Digitization is developing fast and has become a powerful tool for digital planning, construction, and operations. This transformation bears high potential for companies, is critical for success, and thus requires responsible handling. This study assesses the calls made in the United Nations Sustainable Development Goals (SDGs) and in White Papers on AI by international institutions, the EU Commission, and the German Government for the consideration and protection of values and fundamental rights, a careful demarcation between machine (artificial) and human intelligence, and the careful use of such technologies. The study discusses digitization and the impact of artificial intelligence (AI) in construction engineering from an ethical perspective, generating data through case studies and expert interviews as part of a qualitative method. It critically evaluates the opportunities and risks revolving around corporate digital responsibility (CDR) in the construction industry. To the author's knowledge, no study has yet investigated how CDR in construction could be conceptualized, especially in relation to digitization and AI, to support digital transformation in large, medium-sized, and small companies. No study has addressed the key research questions: where can CDR be allocated, and how should an adequate ethical framework be designed to support digital innovations and make full use of the potential of digitization and AI? Now is the right time for constructive approaches that apply ethics-by-design in order to develop and implement safe and efficient AI. This is the first study in construction engineering to apply a holistic, interdisciplinary, inclusive approach that provides guidelines for orientation, examines the benefits of AI, and defines ethical principles as the key driver for success, resource-cost-time efficiency, and sustainability when using digital technologies and AI in construction engineering. Innovative corporate organizations starting new business models are more likely to succeed than those dominated by conservative, traditional attitudes.

Keywords: construction engineering, digitization, digital transformation, artificial intelligence, ethics, corporate digital responsibility, digital innovation

Procedia PDF Downloads 215
3793 Innovative Predictive Modeling and Characterization of Composite Material Properties Using Machine Learning and Genetic Algorithms

Authors: Hamdi Beji, Toufik Kanit, Tanguy Messager

Abstract:

This study aims to construct a predictive model proficient in foreseeing the linear elastic and thermal characteristics of composite materials, drawing on a multitude of influencing parameters. These parameters encompass the shape of the inclusions (circular, elliptical, square, triangular), their spatial coordinates within the matrix, their orientation, the volume fraction (ranging from 0.05 to 0.4), and variations in contrast (spanning from 10 to 200). A variety of machine learning techniques are deployed, including decision trees, random forests, support vector machines, k-nearest neighbors, and an artificial neural network (ANN). Moreover, this research goes beyond prediction by delving into an inverse analysis using genetic algorithms, with the intent of unveiling the intrinsic characteristics of composite materials by evaluating their thermomechanical responses. The foundation of this research lies in the establishment of a comprehensive database that accounts for the array of input parameters mentioned earlier. This database, enriched with this diversity of input variables, serves as the bedrock for the creation of machine learning and genetic algorithm-based models, which are meticulously trained not only to predict but also to elucidate the mechanical and thermal behavior of composite materials. Remarkably, the coupling of machine learning and genetic algorithms has proven highly effective, yielding predictions with remarkable accuracy, with scores ranging between 0.97 and 0.99. This achievement marks a significant breakthrough, demonstrating the potential of this innovative approach in the field of materials engineering.
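
A minimal sketch of the forward-predictor / inverse-identification loop described above, not the authors' implementation: the dataset here is synthetic, and the feature encoding (shape code, volume fraction, contrast) and target formula are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.integers(0, 4, n),          # inclusion shape: 0=circle ... 3=triangle
    rng.uniform(0.05, 0.4, n),      # volume fraction
    rng.uniform(10, 200, n),        # inclusion/matrix contrast
])
# Stand-in for the homogenized effective property computed numerically
y = (1 + 2.5 * X[:, 1]) * np.log10(X[:, 2]) + 0.05 * X[:, 0]

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Inverse analysis: find (volume fraction, contrast) whose predicted
# response matches a measured target, with the shape fixed to circular.
target = 3.0
def mismatch(p):
    vf, contrast = p
    return (model.predict([[0, vf, contrast]])[0] - target) ** 2

res = differential_evolution(mismatch, bounds=[(0.05, 0.4), (10, 200)], seed=0)
print("identified parameters:", res.x)
```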

Keywords: machine learning, composite materials, genetic algorithms, mechanical and thermal proprieties

Procedia PDF Downloads 44
3792 Stochastic Variation of the Hubble's Parameter Using Ornstein-Uhlenbeck Process

Authors: Mary Chriselda A

Abstract:

This paper deals with the fact that the Hubble parameter is not constant and tends to vary stochastically with time. This premise is demonstrated by converting the governing equation to a stochastic differential equation using the Ornstein-Uhlenbeck process. The formulated stochastic differential equation is solved analytically using the Euler and Kolmogorov forward equations, and the probability density function is obtained via the Fourier transformation, showing that the Hubble parameter varies stochastically. This is further corroborated by simulating the observations in Python and R for validation of the postulated premise. We can further conclude that randomness in the forces driving the white noise can affect the Hubble parameter, leading to scale invariance and thereby causing stochastic fluctuations in the density and the rate of expansion of the Universe.
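
As a minimal sketch of the kind of simulation mentioned (parameter values assumed, not taken from the paper), the Ornstein-Uhlenbeck dynamics dH = theta*(mu - H)*dt + sigma*dW can be integrated with an Euler-Maruyama step:

```python
import numpy as np

theta, mu, sigma = 0.5, 70.0, 2.0    # mean reversion, long-run level, noise
dt, n_steps = 0.01, 10_000
rng = np.random.default_rng(42)

H = np.empty(n_steps)
H[0] = 67.0                          # illustrative initial value
for t in range(1, n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))
    H[t] = H[t-1] + theta * (mu - H[t-1]) * dt + sigma * dW

# The stationary distribution is Gaussian with mean mu and variance
# sigma^2 / (2*theta), which the simulated tail should approach.
print(H[-1000:].mean(), H[-1000:].std())
```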

Keywords: Chapman Kolmogorov forward differential equations, fourier transformation, hubble's parameter, ornstein-uhlenbeck process, stochastic differential equations

Procedia PDF Downloads 185
3791 Mineralisation and Fluid Inclusions Studies of the Fluorite Deposit at Jebel Mecella, North Eastern Tunisia

Authors: Miladi Yasmine, Bouhlel Salah, Garnit Hechmi, David Banks

Abstract:

The Jebel Mecella F (Ba-Pb-Zn) ore deposits of the Zaghouan district are located in northeastern Tunisia, 60 km south of Tunis. The host rocks belong to the Ressas Formation of Kimmeridgian-Tithonian age and to lower Cretaceous layers. Mineralisation occurs as stratiform lenses and fracture fillings. The ore mineral assemblage is composed of fluorite, barite, sphalerite, galena, and quartz. Primary fluid inclusions in sphalerite have homogenization temperatures ranging from 129 to 145°C; final melting temperatures range from -14.9 to -10.0°C, corresponding to salinities of 14.0 to 17.7 wt% NaCl equivalent. Fluid inclusions in fluorite homogenize to the liquid phase between 116 and 160°C. Final ice melting temperatures range from -23 to -15°C, corresponding to salinities between 17 and 24 wt% NaCl equivalent. LA-ICP-MS analyses of the fluid inclusions in fluorite show that these fluids are dominated by Na>K>Mg. Furthermore, the high K/Na values from fluid inclusions suggest the brine interacted with K-rich rocks in the basement or with siliciclastic sediments in the basins. The ore fluids at Jebel Mecella are highly saline and Na-K dominated with lower Mg concentrations, and derive from leaching of the dolomitic host rocks. These results are compatible with Mississippi Valley-type mineralizing fluids.

Keywords: Jebel Mecella, fluid inclusions, microthermometry, LA-ICP-MS

Procedia PDF Downloads 174
3790 Remote Sensing through Deep Neural Networks for Satellite Image Classification

Authors: Teja Sai Puligadda

Abstract:

Detailed satellite images can serve an important role in geographic study. The quantitative and qualitative information provided by satellite and remote sensing images minimizes the complexity of work and time. Data and images are captured at regular intervals by satellite remote sensing systems; the amount of data collected is often enormous and expands rapidly as technology develops. Interpreting remote sensing images, geographic data mining, and researching distinct vegetation types such as agricultural land and forests are all part of satellite image categorization. One of the biggest challenges data scientists face when classifying satellite images is finding, among the available options, the classification algorithm able to classify images with the utmost accuracy. Because categorizing satellite images is difficult due to the sheer volume of data, many academics are turning to deep learning algorithms. As the CNN algorithm gives high accuracy in image recognition problems and automatically detects important features without human supervision, and the ANN algorithm stores information on the entire network (Abhishek Gupta, 2020), these two deep learning algorithms have been used for satellite image classification. This project focuses on remote sensing through deep neural networks, i.e., ANN and CNN, with the DeepSat (SAT-4) Airborne dataset for classifying images. Thus, in this project the ANN and CNN algorithms are implemented, evaluated, and compared, and their performance is analyzed through evaluation metrics such as accuracy and loss. Additionally, the neural network algorithm that gives the lowest bias and lowest variance in solving multi-class satellite image classification is analyzed.
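
An illustrative Keras sketch of the CNN branch, not the project's actual architecture: SAT-4 patches are assumed here to be 28x28 pixels with 4 spectral bands (R, G, B, NIR) and 4 land-cover classes, and the layer sizes are arbitrary choices.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(28, 28, 4)),        # 4-band SAT-4 patch
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(4, activation="softmax"),  # 4 land-cover classes
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",  # SAT-4 labels are one-hot
              metrics=["accuracy"])

# model.fit(x_train, y_train, validation_split=0.1, epochs=10)
# Accuracy and loss, as reported in the abstract, come from these metrics.
model.summary()
```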

Keywords: artificial neural network, convolutional neural network, remote sensing, accuracy, loss

Procedia PDF Downloads 138
3789 Application of Deep Neural Networks to Assess Corporate Credit Rating

Authors: Parisa Golbayani, Dan Wang, Ionuţ Florescu

Abstract:

In this work we apply machine learning techniques to financial statement reports in order to assess a company's credit rating. Specifically, the work analyzes the performance of four neural network architectures (MLP, CNN, CNN2D, LSTM) in predicting corporate credit ratings as issued by Standard and Poor's. The paper focuses on companies from the energy, financial, and healthcare sectors in the US. The goal of this analysis is to improve the application of machine learning algorithms to credit assessment. To accomplish this, the study investigates three questions. First, we investigate whether the algorithms perform better when using a selected subset of important features or whether better performance is obtained by allowing the algorithms to select features themselves. Second, we address the temporal aspect inherent in financial data and study whether it is important for the results obtained by a machine learning algorithm. Third, we aim to answer whether one of the four neural network architectures considered consistently outperforms the others, and if so, under which conditions. This work frames the problem as several case studies to answer these questions and analyzes the results using ANOVA and multiple comparison testing procedures.
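
A sketch of the statistical comparison step only (the accuracy arrays below are hypothetical stand-ins for the case-study results): one-way ANOVA across the four architectures, followed by naive Bonferroni-corrected pairwise tests; a Tukey HSD test would be the more standard multiple-comparison procedure.

```python
import numpy as np
from itertools import combinations
from scipy.stats import f_oneway, ttest_ind

scores = {
    "MLP":   np.array([0.71, 0.73, 0.70, 0.72, 0.74]),
    "CNN":   np.array([0.76, 0.78, 0.77, 0.75, 0.79]),
    "CNN2D": np.array([0.75, 0.77, 0.76, 0.78, 0.74]),
    "LSTM":  np.array([0.80, 0.82, 0.81, 0.79, 0.83]),
}

f_stat, p_value = f_oneway(*scores.values())
print(f"ANOVA: F={f_stat:.2f}, p={p_value:.4f}")

pairs = list(combinations(scores, 2))
for a, b in pairs:
    t, p = ttest_ind(scores[a], scores[b])
    print(f"{a} vs {b}: p={p:.4f} (significant if < {0.05/len(pairs):.4f})")
```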

Keywords: convolutional neural network, long short term memory, multilayer perceptron, credit rating

Procedia PDF Downloads 216
3788 Antioxidant Activity and Total Phenolic Content within the Aerial Parts of Artemisia absinthium

Authors: Hallal Nouria, Kharoubi Omar

Abstract:

Wormwood (Artemisia absinthium L.) is a medicinal and aromatic bitter herb that has been used as a medicine since ancient times. It has traditionally been used as an anthelmintic, choleretic, antiseptic, balsamic, depurative, digestive, diuretic, and emmenagogue, and in treating leukemia and sclerosis. The species is also used externally: cataplasms of crushed leaves for snake and scorpion bites, or decoctions applied locally to wounds and sores as an antiseptic and antifungal. Wormwood extracts have high contents of total phenolic compounds and total flavonoids, indicating that these compounds contribute to antiradical and antioxidative activity. Most degenerative diseases are caused by free radicals, and antioxidants are the agents responsible for scavenging them. The aim of the present study was to evaluate the phytochemical and in vitro antioxidant properties of wormwood extracts. The DPPH assay and the reducing power assay were the methods adopted to study the antioxidant potential of the extracts. Standard methods were used for preliminary phytochemical screening and quantitative analysis of tannins, phenolics, and flavonoids. Aqueous and alcoholic extracts showed good antioxidant effects, with IC50 values of 62 μg/ml for the aqueous and 116 μg/ml for the alcoholic extract. Phenolic compounds, tannins, and flavonoids were the major phytochemicals present in both extracts. The percentage of inhibition increased with increasing extract concentration. The aqueous and alcoholic extracts yielded 20.15 and 3.59 mg/g gallic acid equivalent phenolic content, 2.78 and 1.83 mg/g quercetin equivalent flavonoids, and 2.34 and 6.40 g tannic acid equivalent tannins, respectively. The aqueous and methanol extracts of the aerial parts showed a positive correlation between total phenolic content and the antioxidant activity measured in the plant samples. The present study provides evidence that both extracts of Artemisia absinthium are a potential source of natural antioxidants.

Keywords: pharmaceutical industries, medicinal and aromatic plant, antioxidants, phenolic compounds, Artemisia absinthium

Procedia PDF Downloads 417
3787 The Impact of Artificial Intelligence on Digital Construction

Authors: Omil Nady Mahrous Maximous

Abstract:

The construction industry is currently experiencing a shift towards digitisation. This transformation is driven by the adoption of technologies like Building Information Modelling (BIM), drones, and augmented reality (AR). These advancements are revolutionizing the processes of designing, constructing, and operating projects. BIM, for instance, is a new way of communicating and exploiting technology such as software and machinery: it enables the creation of a replica or virtual model of buildings or infrastructure projects, and it facilitates simulating construction procedures, identifying issues beforehand, and optimizing designs accordingly. Drones are another tool in this revolution, as they can be utilized for site surveys, inspections, and even deliveries. Moreover, AR technology provides real-time information to workers involved in the project. Implementing these technologies in the construction industry has brought about improvements in efficiency, safety measures, and sustainable practices. BIM helps minimize rework and waste of materials, while drones contribute to safety by reducing workers' exposure to hazardous areas. Additionally, AR plays a role in worker safety by delivering instructions and guidance during operations. Although the digital transformation within the construction industry is still in its early stages, it holds the potential to reshape project delivery methods entirely. By embracing these technologies, construction companies can boost their profitability while simultaneously reducing their environmental impact and ensuring safer practices.

Keywords: architectural education, construction industry, digital learning environments, immersive learning, BIM, digital construction, construction technologies, digital transformation, artificial intelligence, collaboration, digital architecture, digital design theory, material selection, space construction

Procedia PDF Downloads 32
3786 Design and Implementation of a Software Platform Based on Artificial Intelligence for Product Recommendation

Authors: Giuseppina Settanni, Antonio Panarese, Raffaele Vaira, Maurizio Galiano

Abstract:

Nowadays, artificial intelligence is used successfully in academia and industry for its ability to learn from large amounts of data. In particular, in recent years the use of machine learning algorithms in the field of e-commerce has spread worldwide. In this research study, a prototype software platform was designed and implemented in order to suggest to users the products most suitable for their needs. The platform includes a chatbot and a recommender system based on artificial intelligence algorithms that provide suggestions and decision support to the customer. Recommendation systems perform the important function of automatically filtering and personalizing information, helping users manage the information overload to which they are exposed on a daily basis. Recently, international research has experimented with machine learning technologies with the aim of increasing the potential of traditional recommendation systems. Specifically, support vector machine algorithms have been implemented, combined with natural language processing techniques that allow users to interact with the system, express their requests, and receive suggestions. Interested users can access the web platform on the internet using a computer, tablet, or mobile phone, register, provide the necessary information, and view the products that the system deems most appropriate for them. The platform also integrates a dashboard that allows the platform's various functions to be used in an intuitive and simple way. The artificial intelligence algorithms were implemented and trained on historical data collected from user browsing. Finally, the testing phase validated the implemented model, which will be further tested by letting customers use it.
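
A minimal sketch of the SVM + NLP pairing described above (with made-up training phrases and categories, not the platform's data): free-text requests are vectorized and mapped to a product category, which could then drive the recommendation lookup.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

requests = [
    "I need a light laptop for travel",
    "looking for a gaming computer with a strong GPU",
    "a phone with a good camera",
    "cheap smartphone with long battery life",
    "tablet for reading and note taking",
]
categories = ["ultrabook", "gaming-pc", "smartphone", "smartphone", "tablet"]

# TF-IDF turns each request into a sparse vector; the linear SVM separates
# the categories in that space.
classifier = make_pipeline(TfidfVectorizer(), LinearSVC())
classifier.fit(requests, categories)

print(classifier.predict(["portable computer for business trips"]))
# -> ['ultrabook']; the category then feeds the product-ranking step.
```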

Keywords: machine learning, recommender system, software platform, support vector machine

Procedia PDF Downloads 118
3785 A System for the Detection of Fake Profiles on Online Social Networks Using Machine Learning and Bio-Inspired Algorithms

Authors: Sekkal Nawel, Mahammed Nadir

Abstract:

The proliferation of online activities on Online Social Networks (OSNs) has captured significant user attention. This growth, however, has been accompanied by the emergence of fraudulent accounts that do not represent real individuals and that violate privacy regulations within social network communities. Consequently, it is imperative to identify and remove these profiles to enhance the security of OSN users. In recent years, researchers have turned to machine learning (ML) to develop strategies and methods to tackle this issue. Numerous studies have been conducted in this field to compare various ML-based techniques. However, the existing literature still lacks a comprehensive examination, especially one considering different OSN platforms, and the utilization of bio-inspired algorithms has been largely overlooked. Our study conducts an extensive comparative analysis of fake profile detection techniques in online social networks. The results indicate that both supervised and unsupervised machine learning models are effective for detecting false profiles in social media. To achieve optimal results, we have incorporated six bio-inspired algorithms to enhance the performance of fake profile identification.
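
A sketch of one way a bio-inspired component can assist such a pipeline (an assumption for illustration, not the paper's six algorithms): a tiny genetic algorithm evolves binary feature masks, scored by a classifier's cross-validated accuracy on a synthetic stand-in for profile features.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=500, n_features=20, n_informative=6,
                           random_state=1)

def fitness(mask):
    if not mask.any():
        return 0.0
    clf = RandomForestClassifier(n_estimators=50, random_state=1)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

pop = rng.integers(0, 2, size=(12, X.shape[1]))         # random masks
for generation in range(10):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-6:]]              # keep the best half
    children = []
    for _ in range(6):
        a, b = parents[rng.integers(0, 6, 2)]
        cut = rng.integers(1, X.shape[1])
        child = np.concatenate([a[:cut], b[cut:]])      # one-point crossover
        flip = rng.random(X.shape[1]) < 0.05            # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```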

Keywords: machine learning, bio-inspired algorithm, detection, fake profile, system, social network

Procedia PDF Downloads 49
3784 Comparative Analysis of Classical and Parallel Inpainting Algorithms Based on Affine Combinations of Projections on Convex Sets

Authors: Irina Maria Artinescu, Costin Radu Boldea, Eduard-Ionut Matei

Abstract:

This paper is a comparative study of two classical variants of parallel projection methods for solving the convex feasibility problem, together with their equivalents that involve variable weights in the construction of the solutions. We used a graphical representation of these methods for inpainting a convex area of an image in order to investigate their effectiveness in image reconstruction applications. We also present a numerical analysis of the convergence of these four algorithms in terms of the average number of steps and execution time in a classical CPU implementation and, alternatively, in a parallel GPU implementation.
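
A numpy sketch of the parallel (averaged) projection scheme with fixed weights, under simplifying assumptions: each iterate is an affine combination of projections onto two convex sets, agreement with the known pixels and a box constraint on intensities. The variable-weight variants studied above adapt the weights per step, and a practical inpainting setup would replace the box constraint with a smoothness or band-limitation set that actually fills the hole.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.uniform(0, 1, (64, 64))           # stand-in for the true image
mask = np.ones_like(image, dtype=bool)
mask[20:40, 20:40] = False                    # convex area to inpaint

def project_data(x):                          # set C1: agree with known pixels
    out = x.copy()
    out[mask] = image[mask]
    return out

def project_box(x):                           # set C2: valid intensity range
    return np.clip(x, 0.0, 1.0)

x = np.zeros_like(image)
w1, w2 = 0.5, 0.5                             # affine combination, w1 + w2 = 1
for step in range(200):
    x = w1 * project_data(x) + w2 * project_box(x)

print("remaining data mismatch:", np.abs(x[mask] - image[mask]).max())
```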

Keywords: convex feasibility problem, convergence analysis, inpainting, parallel projection methods

Procedia PDF Downloads 157
3783 Evaluation of Different Cropping Systems under Organic, Inorganic and Integrated Production Systems

Authors: Sidramappa Gaddnakeri, Lokanath Malligawad

Abstract:

Research on the production technology of an individual crop, commodity, or breed alone has not brought sustainability or stability to crop production. The sustainability of a system over the years depends on the maintenance of soil health. An organic production system, which includes the use of organic manures, biofertilizers, and green manuring for nutrient supply and biopesticides for plant protection, helps sustain productivity even under adverse climatic conditions. This study was initiated to evaluate the performance of different cropping systems under organic, inorganic, and integrated production systems at the Institute of Organic Farming, University of Agricultural Sciences, Dharwad (Karnataka, India), under the ICAR Network Project on Organic Farming. The trial was conducted for four years (2013-14 to 2016-17) on a fixed site. Five cropping systems, viz., sequence cropping of cowpea-safflower, greengram-rabi sorghum, and maize-bengalgram, sole cropping of pigeonpea, and intercropping of groundnut + cotton, were evaluated under six nutrient management practices. The nutrient management practices were: NM1, 100% organic farming (organic manures equivalent to 100% N for cereals/cotton or 100% P2O5 for legumes); NM2, 75% organic farming (organic manures equivalent to 75% N for cereals/cotton or 100% P2O5 for legumes, plus cow urine and vermi-wash application); NM3, integrated farming (50% organic + 50% inorganic nutrients); NM4, integrated farming (75% organic + 25% inorganic nutrients); NM5, 100% inorganic farming (recommended dose of inorganic fertilizers); and NM6, recommended dose of inorganic fertilizers plus the recommended rate of farmyard manure (FYM). Among the cropping systems evaluated, the groundnut + hybrid cotton (2:1) intercropping system was found more remunerative than the sole pigeonpea, greengram-sorghum, maize-chickpea, and cowpea-safflower systems, irrespective of the production system. Production practices involving the application of recommended rates of fertilizers plus recommended rates of organic manures (farmyard manure) produced higher net monetary returns and a higher B:C ratio than the integrated production systems (50% organic + 50% inorganic, and 75% organic + 25% inorganic) and the organic-only production systems. The two organic production systems, viz., 100% organic (NM1) and 75% organic (NM2), were found to be on par. Further, the integrated production system involving the application of both organic manures and inorganic fertilizers was found more beneficial than the purely organic production systems.

Keywords: cropping systems, production systems, cowpea, safflower, greengram, pigeonpea, groundnut, cotton

Procedia PDF Downloads 175
3782 Control of an Asymmetrical Design of a Pneumatically Actuated Ambidextrous Robot Hand

Authors: Emre Akyürek, Anthony Huynh, Tatiana Kalganova

Abstract:

The Ambidextrous Robot Hand is a robotic device whose purpose is to mimic the gestures of either a right or a left hand. The symmetrical behavior of its fingers allows them to bend in one way or the other while keeping a compliant and anthropomorphic shape. In addition to the gestures they can reproduce on both sides, however, an asymmetrical mechanical design with a three-tendon routing has been engineered to reduce the number of actuators. As a consequence, control algorithms must be adapted to drive the ambidextrous fingers efficiently from one position to another and to include grasping features. These movements are controlled by pneumatic muscles, which are nonlinear actuators. As their elasticity constantly varies under actuation, the length of the pneumatic muscles and the force they provide may differ for the same value of pressurized air. The control algorithms introduced in this paper take both the fingers' asymmetrical design and the pneumatic muscles' nonlinearity into account to permit accurate control of the Ambidextrous Robot Hand. The finger motion is achieved by combining a classic PID controller with a phase plane switching control that turns the gain constants into dynamic values. The grasping ability is made possible by a sliding mode control that makes the fingers adapt to the shape of an object before strengthening their positions.
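
An illustrative sketch (not the authors' controller) of the phase-plane switching idea referred to above: a PID whose gains are selected according to the region of the error phase plane (e, de/dt). Gain values, switching rule, and the toy plant are assumptions.

```python
class PhasePlanePID:
    def __init__(self):
        self.integral = 0.0
        self.prev_error = 0.0

    def gains(self, error, d_error):
        # Aggressive gains while error and its derivative have the same sign
        # (moving away from the target), softer gains when converging.
        if error * d_error > 0:
            return 4.0, 0.8, 0.2     # kp, ki, kd (assumed values)
        return 1.5, 0.3, 0.1

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        d_error = (error - self.prev_error) / dt
        kp, ki, kd = self.gains(error, d_error)
        self.integral += error * dt
        self.prev_error = error
        return kp * error + ki * self.integral + kd * d_error

# Example: drive a first-order finger-joint model toward 30 degrees.
pid, angle, dt = PhasePlanePID(), 0.0, 0.01
for _ in range(500):
    u = pid.update(30.0, angle, dt)
    angle += dt * (u - 0.5 * angle)   # toy plant dynamics
print(f"final angle: {angle:.2f}")
```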

Keywords: ambidextrous hand, intelligent algorithms, nonlinear actuators, pneumatic muscles, robotics, sliding control

Procedia PDF Downloads 276
3781 The Formulation of the Mecelle and Other Codified Laws in the Ottoman Empire: Transformation Overturning the Sharia Principles

Authors: Tianqi Yin

Abstract:

The sharia had been the legislative basis of the Ottoman Empire since its emergence. The authority of sharia was superlative in Islamic society compared to the power of the sulta, the nominal ruler of the nation, regulating essentially every aspect of people's lives according to an ethical code. In modernity, however, as European sovereign powers worked to re-engineer the Islamic world in their own image, as a society ruled by a state, the Ottoman legislative system encountered the great challenge of adopting codified laws to replace sharia, with the formulation of the Mecelle being a prominent case. Interpretations of this transformation have been contentious, with the key debate revolving around whether these codified laws are authentic representations of sharia or alien legal formulations authorized by the modern nation-state under heavy European colonial influence. Because of the differing methodologies of the various theories, reaching a universal conclusion on this issue remains challenging. This paper argues that the formulation of the Mecelle and other codified laws constitutes a discontinuity of sharia due to the influence of European modernity, and that the emphasis on elements of Islamic law is a tactic employed to promote this process. These codified laws signal a complete social transformation from an Islamic society ruled by sharia to a replication of the European society ruled by the comprehensive ruling system of the modern state. In addition to advancing the discussion on the characterization of the codification movement in the Ottoman Empire in modernity, the research also contributes to determining the nature of the modern codification movement globally.

Keywords: codification, mecelle, modernity, sharia, ottoman empire

Procedia PDF Downloads 74
3780 A Bayesian Multivariate Microeconometric Model for Estimation of Price Elasticity of Demand

Authors: Jefferson Hernandez, Juan Padilla

Abstract:

Estimation of the price elasticity of demand is a valuable tool for the task of price setting. Given its relevance, it is an active field of microeconomic and statistical research. Price elasticity in the oil and gas industry, in particular for fuels sold at gas stations, has shown to be a challenging topic given market and state restrictions and the underlying correlation structures between the types of fuels sold by the same gas station. This paper explores the Lotka-Volterra model for price elasticity estimation in the context of fuels; in addition, multivariate random effects are introduced for the purpose of dealing with errors, e.g., measurement or missing data errors. In order to model the underlying correlation structures, the Inverse-Wishart, Hierarchical Half-t, and LKJ distributions are studied. The Bayesian paradigm, through Markov Chain Monte Carlo (MCMC) algorithms, is considered for model estimation. Simulation studies covering a wide range of situations were performed in order to evaluate parameter recovery for the proposed models and algorithms. The results reveal that the proposed algorithms recovered all model parameters quite well. A real data set analysis was also performed in order to illustrate the proposed approach.
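
A self-contained Metropolis sketch of the Bayesian estimation idea in its simplest form (not the paper's Lotka-Volterra model): a log-log demand model log(Q) = a + b*log(P) + noise, where b is the price elasticity. Priors, synthetic data, and step sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
true_a, true_b, sigma = 8.0, -1.4, 0.1          # true elasticity of -1.4
log_p = np.log(rng.uniform(2.0, 4.0, 200))
log_q = true_a + true_b * log_p + rng.normal(0, sigma, 200)

def log_post(a, b):
    resid = log_q - a - b * log_p
    loglik = -0.5 * np.sum(resid**2) / sigma**2
    logprior = -0.5 * (a**2 / 100 + b**2 / 100)   # weak N(0, 10^2) priors
    return loglik + logprior

samples, (a, b) = [], (0.0, 0.0)
for i in range(20_000):
    a_new, b_new = a + rng.normal(0, 0.05), b + rng.normal(0, 0.05)
    if np.log(rng.random()) < log_post(a_new, b_new) - log_post(a, b):
        a, b = a_new, b_new                       # accept the proposal
    if i >= 10_000:                               # discard burn-in
        samples.append(b)

print(f"posterior mean elasticity: {np.mean(samples):.2f}")  # close to -1.4
```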

Keywords: price elasticity, volume, correlation structures, Bayesian models

Procedia PDF Downloads 142
3779 Medical Image Compression Based on Region of Interest: A Review

Authors: Sudeepti Dayal, Neelesh Gupta

Abstract:

In terms of transmission, the bigger the size of an image, the longer the channel takes to transmit it, since the bandwidth of the channel is fixed. Therefore, if the size of an image is reduced, a larger number of images can be transmitted over the channel. Compression is the technique used to reduce the size of an image; in terms of storage, it reduces the file size the image occupies on disk. Any image can be divided into two parts: the region of interest and the non-region of interest. There are several compression algorithms that compress the data more economically. In this paper, we review region-of-interest and non-region-of-interest based compression techniques and the algorithms that compress images most efficiently.
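
A sketch of the ROI idea with a wavelet (DWT) codec, under stated assumptions: coefficients are thresholded aggressively outside the region of interest and kept intact inside it. The threshold value, ROI box, and use of PyWavelets are illustrative choices, not drawn from any specific reviewed algorithm.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
image = rng.uniform(0, 255, (128, 128))       # stand-in for a medical image
roi = np.zeros_like(image, dtype=bool)
roi[40:90, 40:90] = True                      # diagnostically relevant region

level = 3
coeffs = pywt.wavedec2(image, "haar", level=level)

def roi_at(scale_shape):
    # Downsample the ROI mask to a subband's grid (any overlap counts).
    fy = roi.shape[0] // scale_shape[0]
    fx = roi.shape[1] // scale_shape[1]
    return roi.reshape(scale_shape[0], fy, scale_shape[1], fx).any(axis=(1, 3))

new_coeffs = [coeffs[0]]                      # keep the coarse approximation
for detail in coeffs[1:]:
    sub_roi = roi_at(detail[0].shape)
    # Zero small detail coefficients only where the subband misses the ROI.
    new_coeffs.append(tuple(
        np.where(~sub_roi & (np.abs(d) < 20.0), 0.0, d) for d in detail
    ))

reconstructed = pywt.waverec2(new_coeffs, "haar")
print("max ROI error:", np.abs(reconstructed[roi] - image[roi]).max())
```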

Keywords: compression ratio, region of interest, DCT, DWT

Procedia PDF Downloads 358
3778 Online Battery Equivalent Circuit Model Estimation on Continuous-Time Domain Using Linear Integral Filter Method

Authors: Cheng Zhang, James Marco, Walid Allafi, Truong Q. Dinh, W. D. Widanage

Abstract:

Equivalent circuit models (ECMs) are widely used in battery management systems in electric vehicles and other battery energy storage systems. The battery dynamics and the model parameters vary under different working conditions, such as different temperatures and state of charge (SOC) levels, so online parameter identification can improve modelling accuracy. This paper presents a method for online ECM parameter identification using a continuous-time (CT) estimation approach. The CT estimation method has several advantages over discrete-time (DT) estimation methods for ECM parameter identification, owing to the widely separated battery dynamic modes and fast sampling. The presented method can also be used for online SOC estimation. Test data were collected using a lithium-ion cell, and the experimental results show that the presented CT method achieves better modelling accuracy than the conventional DT recursive least squares method. The effectiveness of the presented method for online SOC estimation is also verified on test data.
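
For reference, a minimal forward simulation of the kind of ECM being identified: a first-order Thevenin model with a series resistance R0 and one RC pair (R1, C1), where dV1/dt = -V1/(R1*C1) + I/C1. Parameter values are assumed for illustration; the paper's contribution is identifying such parameters on-line in the continuous-time domain, which this plain simulation does not reproduce.

```python
import numpy as np

R0, R1, C1 = 2e-3, 1.5e-3, 2e3           # ohm, ohm, farad (assumed values)
ocv = 3.7                                 # open-circuit voltage, held constant
dt = 0.1                                  # sampling period in seconds

current = np.full(600, 10.0)              # 10 A discharge pulse for 60 s
v1, terminal_v = 0.0, []
for i in current:
    # dV1/dt = -V1/(R1*C1) + I/C1, discretized with a forward Euler step
    v1 += dt * (-v1 / (R1 * C1) + i / C1)
    terminal_v.append(ocv - i * R0 - v1)  # V = OCV - I*R0 - V1

print(f"voltage sag after pulse: {ocv - terminal_v[-1]:.4f} V")
```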

Keywords: electric circuit model, continuous time domain estimation, linear integral filter method, parameter and SOC estimation, recursive least square

Procedia PDF Downloads 362
3777 Analysis of Brain Signals Using Neural Networks Optimized by Co-Evolution Algorithms

Authors: Zahra Abdolkarimi, Naser Zourikalatehsamad

Abstract:

Until about 40 years ago, following the recognition of epilepsy, it was generally believed that seizures occurred randomly and suddenly. Thanks to advances in mathematics and engineering, however, such attacks can now be predicted within a few minutes or hours, and various algorithms for long-term prediction of the time and frequency of the first attack have been presented. In this paper, considering the nonlinear nature of brain signals and dynamically recorded brain signals, an ANFIS model is presented to predict the brain signals, since, given the physiological structure of seizure onset, more complex neural structures can better model the signal during attacks. The contribution of this work is a co-evolution algorithm for the optimization of the ANFIS network parameters. Our objective is to predict brain signals based on time series obtained from the brain signals of people suffering from epilepsy using ANFIS. Results reveal that, compared to other methods, this method is less sensitive to uncertainties such as the presence of noise and interruptions in the recorded brain signals, and is more accurate. The long-term prediction capacity of the model illustrates the potential of implanted systems for medication warning and seizure prevention.

Keywords: co-evolution algorithms, brain signals, time series, neural networks, ANFIS model, physiologic structure, time prediction, epilepsy

Procedia PDF Downloads 257
3776 A Fast and Robust Protocol for Reconstruction and Re-Enactment of Historical Sites

Authors: Sanaa I. Abu Alasal, Madleen M. Esbeih, Eman R. Fayyad, Rami S. Gharaibeh, Mostafa Z. Ali, Ahmed A. Freewan, Monther M. Jamhawi

Abstract:

This research proposes a novel reconstruction protocol for restoring missing surfaces and low-quality edges and shapes in photos of artifacts at historical sites. The protocol starts with the extraction of a cloud of points. This extraction process is based on four subordinate algorithms, which differ in their robustness, in the amount of resultant data, and in the accuracy of certain related features, and which build the quality mesh in different but complementary ways. The performance of our proposed protocol is compared with other state-of-the-art algorithms and toolkits. The statistical analysis shows that our algorithm significantly outperforms its rivals in the quality of the object files used to reconstruct the desired model.
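
A hedged sketch of the cloud-of-points-to-mesh stage using Open3D's Poisson surface reconstruction, one common toolkit choice and not necessarily the one the authors evaluated; the sphere point cloud is a synthetic stand-in for a real scan.

```python
import numpy as np
import open3d as o3d

rng = np.random.default_rng(0)
pts = rng.normal(size=(5000, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)   # points on a unit sphere

pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(pts)
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.2, max_nn=30))

mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=8)

# Low-density vertices correspond to poorly supported surface patches:
# the kind of low-quality regions such protocols try to repair or remove.
d = np.asarray(densities)
mesh.remove_vertices_by_mask(d < np.quantile(d, 0.05))
print(mesh)
```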

Keywords: meshes, point clouds, surface reconstruction protocols, 3D reconstruction

Procedia PDF Downloads 439
3775 Implicit Transaction Costs and the Fundamental Theorems of Asset Pricing

Authors: Erindi Allaj

Abstract:

This paper studies arbitrage pricing theory in financial markets with transaction costs. We extend the existing theory to include the more realistic possibility that the price at which investors trade depends on the traded volume. Investors in the market always buy at the ask and sell at the bid price. Transaction costs are composed of two terms: one captures the implicit transaction costs and the other the price impact. Moreover, a new definition of a self-financing portfolio is obtained. The self-financing condition suggests that continuous trading is possible but is restricted to predictable trading strategies that have left and right limits and finite quadratic variation; that is, predictable trading strategies of infinite variation and finite quadratic variation are allowed in our setting. Within this framework, the existence of an equivalent probability measure is equivalent to the absence of arbitrage opportunities, so that the first fundamental theorem of asset pricing (FFTAP) holds. It is also proved that, when this probability measure is unique, any contingent claim in the market is hedgeable in an L2-sense, and the price of any contingent claim equals the risk-neutral price. To better understand how to apply the proposed theory, we provide an example with linear transaction costs.

Keywords: arbitrage pricing theory, transaction costs, fundamental theorems of arbitrage, financial markets

Procedia PDF Downloads 342
3774 Solving a Micromouse Maze Using an Ant-Inspired Algorithm

Authors: Rolando Barradas, Salviano Soares, António Valente, José Alberto Lencastre, Paulo Oliveira

Abstract:

This article reviews Ant Colony Optimization, a nature-inspired algorithm, and its implementation in the Scratch/m-Block programming environment. Ant Colony Optimization belongs to the family of Swarm Intelligence algorithms, a subset of biologically inspired algorithms. The starting problem is a maze whose path to the center must be found, followed by a return to the starting position; this is similar to an ant looking for a path to a food source and returning to its nest. Starting with the implementation of a simple wall-follower simulator, the proposed solution uses a dynamic graphical interface that allows young students to observe the ants' movement while the algorithm optimizes the routes to the maze's center. Details addressed during this implementation include interface usability, data structures, and the conversion of algorithmic language to Scratch syntax. This gives young students an easier way to understand the computational concepts of sequences, loops, parallelism, data, events, and conditionals, as they are used throughout the implemented algorithms. Future work includes simulations with real contest mazes and two different pheromone update methods, and a comparison with the optimized results of the winners of each edition of the contest. It will also include the creation of a Digital Twin relating the virtual simulator to a real micromouse in a full-size maze. The first test results show that the algorithm found the same optimized solutions as the winners of each edition of the Micromouse contest, making this a good solution for maze pathfinding.
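
A compact stand-alone Python sketch of the underlying ant-colony idea (independent of the Scratch implementation above): ants walk a 4-neighbour grid maze from start to goal, and pheromone is reinforced along shorter paths. The maze, parameters, and the simplified evaporation rule (applied only to traversed edges) are toy assumptions.

```python
import random
random.seed(0)

maze = ["S.#.",
        "..#.",
        ".#..",
        "...G"]
H, W = len(maze), len(maze[0])
start, goal = (0, 0), (3, 3)

def neighbours(cell):
    r, c = cell
    for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
        if 0 <= nr < H and 0 <= nc < W and maze[nr][nc] != "#":
            yield (nr, nc)

pheromone = {}   # pheromone per directed step, defaulting to 1.0
best = None

for ant in range(200):
    path, cell, visited = [start], start, {start}
    while cell != goal and len(path) < 40:
        options = [n for n in neighbours(cell) if n not in visited]
        if not options:
            break                                    # dead end: abandon ant
        weights = [pheromone.get((cell, n), 1.0) for n in options]
        cell = random.choices(options, weights=weights)[0]
        visited.add(cell); path.append(cell)
    if cell == goal:
        if best is None or len(path) < len(best):
            best = path
        deposit = 10.0 / len(path)                   # shorter path => more
        for a, b in zip(path, path[1:]):
            pheromone[(a, b)] = pheromone.get((a, b), 1.0) * 0.9 + deposit

print("best path length:", len(best), best)
```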

Keywords: nature inspired algorithms, scratch, micromouse, problem-solving, computational thinking

Procedia PDF Downloads 104
3773 Multicenter Evaluation of the ACCESS HBsAg and ACCESS HBsAg Confirmatory Assays on the DxI 9000 ACCESS Immunoassay Analyzer, for the Detection of Hepatitis B Surface Antigen

Authors: Vanessa Roulet, Marc Turini, Juliane Hey, Stéphanie Bord-Romeu, Emilie Bonzom, Mahmoud Badawi, Mohammed-Amine Chakir, Valérie Simon, Vanessa Viotti, Jérémie Gautier, Françoise Le Boulaire, Catherine Coignard, Claire Vincent, Sandrine Greaume, Isabelle Voisin

Abstract:

Background: Beckman Coulter, Inc. has recently developed fully automated assays for the detection of HBsAg on a new immunoassay platform. The objective of this European multicenter study was to evaluate the performance of the ACCESS HBsAg and ACCESS HBsAg Confirmatory assays† on the recently CE-marked DxI 9000 ACCESS Immunoassay Analyzer. Methods: The clinical specificity of the ACCESS HBsAg and HBsAg Confirmatory assays was determined using HBsAg-negative samples from blood donors and hospitalized patients. The clinical sensitivity was determined using presumed HBsAg-positive samples. Sample HBsAg status was determined using a CE-marked HBsAg assay (Abbott ARCHITECT HBsAg Qualitative II, Roche Elecsys HBsAg II, or Abbott PRISM HBsAg assay) and a CE-marked HBsAg confirmatory assay (Abbott ARCHITECT HBsAg Qualitative II Confirmatory or Abbott PRISM HBsAg Confirmatory assay) according to manufacturer package inserts and pre-determined testing algorithms. False initial reactive rate was determined on fresh hospitalized patient samples. The sensitivity for the early detection of HBV infection was assessed internally on thirty (30) seroconversion panels. Results: Clinical specificity was 99.95% (95% CI, 99.86 – 99.99%) on 6047 blood donors and 99.71% (95%CI, 99.15 – 99.94%) on 1023 hospitalized patient samples. A total of six (6) samples were found false positive with the ACCESS HBsAg assay. None were confirmed for the presence of HBsAg with the ACCESS HBsAg Confirmatory assay. Clinical sensitivity on 455 HBsAg-positive samples was 100.00% (95% CI, 99.19 – 100.00%) for the ACCESS HBsAg assay alone and for the ACCESS HBsAg Confirmatory assay. The false initial reactive rate on 821 fresh hospitalized patient samples was 0.24% (95% CI, 0.03 – 0.87%). Results obtained on 30 seroconversion panels demonstrated that the ACCESS HBsAg assay had equivalent sensitivity performances compared to the Abbott ARCHITECT HBsAg Qualitative II assay with an average bleed difference since first reactive bleed of 0.13. All bleeds found reactive in ACCESS HBsAg assay were confirmed in ACCESS HBsAg Confirmatory assay. Conclusion: The newly developed ACCESS HBsAg and ACCESS HBsAg Confirmatory assays from Beckman Coulter have demonstrated high clinical sensitivity and specificity, equivalent to currently marketed HBsAg assays, as well as a low false initial reactive rate. †Pending achievement of CE compliance; not yet available for in vitro diagnostic use. 2023-11317 Beckman Coulter and the Beckman Coulter product and service marks mentioned herein are trademarks or registered trademarks of Beckman Coulter, Inc. in the United States and other countries. All other trademarks are the property of their respective owners.

Keywords: dxi 9000 access immunoassay analyzer, hbsag, hbv, hepatitis b surface antigen, hepatitis b virus, immunoassay

Procedia PDF Downloads 67
3772 pscmsForecasting: A Python Web Service for Time Series Forecasting

Authors: Ioannis Andrianakis, Vasileios Gkatas, Nikos Eleftheriadis, Alexios Ellinidis, Ermioni Avramidou

Abstract:

pscmsForecasting is an open-source web service that implements a variety of time series forecasting algorithms and exposes them to the user via the ubiquitous HTTP protocol. It allows developers to enhance their applications by adding time series forecasting functionality through an intuitive and easy-to-use interface. This paper provides some background on time series forecasting and gives details about the implemented algorithms, aiming to enhance the end user's understanding of the underlying methods before incorporating them into applications. A detailed description of the web service's interface and its various parameterizations is also provided. Being an open-source project, pscmsForecasting can also be easily modified and tailored to the specific needs of each application.
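
A minimal sketch of the general service pattern, not pscmsForecasting's actual interface: one HTTP endpoint accepts a series and returns a forecast. The endpoint name, payload shape, and the simple exponential-smoothing model are all assumptions.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def ses_forecast(series, alpha=0.3, horizon=1):
    """Simple exponential smoothing: the level tracks the series, then is
    held flat for the requested horizon."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return [level] * horizon

@app.route("/forecast", methods=["POST"])
def forecast():
    payload = request.get_json()
    y = payload["series"]
    h = int(payload.get("horizon", 1))
    return jsonify({"forecast": ses_forecast(y, horizon=h)})

if __name__ == "__main__":
    # e.g. curl -X POST localhost:5000/forecast \
    #   -H 'Content-Type: application/json' \
    #   -d '{"series": [12, 13, 15, 14, 16], "horizon": 3}'
    app.run(port=5000)
```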

Keywords: time series, forecasting, web service, open source

Procedia PDF Downloads 61
3771 An Approach to Building a Recommendation Engine for Travel Applications Using Genetic Algorithms and Neural Networks

Authors: Adrian Ionita, Ana-Maria Ghimes

Abstract:

A lack of features and design polish, and the failure to promote an integrated booking application, are some of the reasons why most online travel platforms only automate old booking processes, limiting themselves to the integration of a smaller number of services without addressing the user experience. This paper presents a practical study of how to improve travel applications by creating user profiles through data mining based on neural networks and genetic algorithms. Choices made by users and by their 'friends' in the social network context can be considered input data for a recommendation engine. The purpose of using these algorithms and this design is to improve the user experience and to deliver more features to users. The paper aims to highlight a broader range of improvements that could be applied to travel applications in terms of design and service integration, while the main scientific contribution remains the technical implementation of the neural network solution. The choice of technologies is also motivated by the initiative of some online booking providers that have publicly stated that they use neural-network-related designs. These companies use similar Big Data technologies to provide recommendations for hotels, restaurants, and cinemas with a neural-network-based recommendation engine that builds a user 'DNA profile'. This 'profile', a collection of neural networks trained on previous user choices, can improve the usability and design of any type of application.

Keywords: artificial intelligence, big data, cloud computing, DNA profile, genetic algorithms, machine learning, neural networks, optimization, recommendation system, user profiling

Procedia PDF Downloads 145
3770 An Expert System Designed to Be Used with MOEAs for Efficient Portfolio Selection

Authors: Kostas Metaxiotis, Kostas Liagkouras

Abstract:

This study presents an expert system specially designed to be used with Multiobjective Evolutionary Algorithms (MOEAs) for the solution of the portfolio selection problem. The validation of the proposed hybrid system is done using data sets from the Hang Seng 31 in Hong Kong, the DAX 100 in Germany, and the FTSE 100 in the UK. The performance of the proposed system is assessed in comparison with the Non-dominated Sorting Genetic Algorithm II (NSGA-II). The evaluation of performance is based on different performance metrics that evaluate both the proximity of the solutions to the Pareto front and their dispersion on it. The results show that the proposed hybrid system is efficient for the solution of this kind of problem.
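
For reference, a sketch of the core MOEA building block behind the NSGA-II comparison: extracting the non-dominated (Pareto) front from a two-objective portfolio population. The objective values below are illustrative, with risk and negative expected return both minimized.

```python
import numpy as np

def non_dominated(points):
    """Return indices of points not dominated by any other point
    (both objectives are to be minimized)."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            np.all(q <= p) and np.any(q < p)
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front

# Each row: (portfolio risk, negative expected return)
population = np.array([
    [0.10, -0.08],
    [0.12, -0.11],
    [0.15, -0.10],   # dominated by the second portfolio
    [0.20, -0.14],
    [0.11, -0.09],
])
print("Pareto front indices:", non_dominated(population))
```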

Keywords: expert systems, multi-objective optimization, evolutionary algorithms, portfolio selection

Procedia PDF Downloads 419
3769 An Optimized Association Rule Mining Algorithm

Authors: Archana Singh, Jyoti Agarwal, Ajay Rana

Abstract:

Data mining is an efficient technology for discovering patterns in large databases. Association rule mining techniques are used to find correlations between the various itemsets in a database, and these correlations are used in decision making and pattern analysis. In recent years, the problem of finding association rules in large datasets has been addressed by many researchers. Various research papers on association rule mining (ARM) were first studied and analyzed to understand the existing algorithms. The Apriori algorithm is the basic ARM algorithm, but it requires many database scans. The DIC algorithm needs fewer database scans but uses a complex lattice data structure. The main focus of this paper is to propose a new optimized algorithm (the Friendly Algorithm) and to compare its performance with existing algorithms. A data set is used to find frequent itemsets and association rules with the help of the existing and proposed algorithms, and it has been observed that the proposed algorithm finds all the frequent itemsets and essential association rules with fewer database scans than existing algorithms. The proposed algorithm uses an optimized data structure, i.e., a graph and adjacency matrix.
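
For context, a compact Apriori-style baseline (the classic approach being optimized, not the proposed Friendly Algorithm): frequent itemsets are found by repeated candidate generation and counting, which is exactly where the multiple database scans arise. The transactions and support threshold are toy values.

```python
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer", "eggs"},
    {"milk", "diapers", "beer", "cola"},
    {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "cola"},
]
min_support = 3   # absolute count

def frequent_itemsets(transactions, min_support):
    items = {i for t in transactions for i in t}
    candidates = [frozenset([i]) for i in items]
    frequent, k = {}, 1
    while candidates:
        # Each pass over the transactions corresponds to one database scan.
        counts = {c: sum(c <= t for t in transactions) for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # Join step: merge frequent k-itemsets into (k+1)-candidates.
        keys = list(level)
        candidates = list({a | b for a, b in combinations(keys, 2)
                           if len(a | b) == k + 1})
        k += 1
    return frequent

for itemset, count in sorted(frequent_itemsets(transactions, min_support).items(),
                             key=lambda kv: (-kv[1], sorted(kv[0]))):
    print(set(itemset), count)
```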

Keywords: association rules, data mining, dynamic item set counting, FP-growth, friendly algorithm, graph

Procedia PDF Downloads 397
3768 An Application for Risk of Crime Prediction Using Machine Learning

Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento

Abstract:

The increase of the world population, especially in large urban centers, has created new challenges, particularly in the control and optimization of public safety. In the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical incident data and demographic information. The entire research and implementation is presented, starting with the data collection from its original source, the treatment and transformations applied to the data, and the choice, evaluation, and implementation of the machine learning model, up to the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors, and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformations used. The results show that the use of machine learning techniques helps to anticipate criminal occurrences, which contributes to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API enabling other entities to request predictions in real time. An application is also presented where criminal predictions can be shown visually.

Keywords: crime prediction, machine learning, public safety, smart city

Procedia PDF Downloads 92
3767 Managing Organizational Change for a Transformation Project: The Billing and Customer Relationship Management Journey

Authors: Sharifah I. N. A. Syed Azmi, Nazarina Mohd Nasir

Abstract:

The Billing & Customer Relationship Management (BCRM) project is an important enabler for realizing customer experience transformation. It involves technological shifts for future scalability, the revision of multiple business processes, and the adoption of change by users and impacted employees. This massive transition, if not managed properly, may result in a decline in business performance due to a drop in productivity. Organizational change management is an essential element of BCRM project implementation to ensure the system is well understood and embraced by all stakeholders. In order to move impacted employees from an unaware state or denial mode to a full-acceptance mindset, committed to using the new system, their involvement in the whole change process from the initial stage is imperative. Through the BCRM Change Management Plan, a holistic approach was taken whereby the strategy and program for five key components, namely executive sponsorship, continuous communication, process change readiness, organizational readiness, and individual readiness, were all carefully established. The roles of the project sponsor, change agents, change ambassadors, and the community of practice (CoP) were clearly defined to gain high commitment and support across the entire organization. Continuous communication and engagement initiatives were carried out throughout project implementation to reach all stakeholders. Business readiness was constantly monitored and assessed, including the effectiveness of end-user training, thorough review of process documentation, and completion of the roles realignment exercise.

Keywords: BCRM, change management, organizational change, transformation project

Procedia PDF Downloads 121
3766 Isolation and Culture of Keratinocytes and Fibroblasts to Develop Artificial Skin Equivalent in Cats

Authors: Lavrentiadou S. N., Angelou V., Chatzimisios K., Papazoglou L.

Abstract:

The aim of this study was the isolation and culture of keratinocytes and fibroblasts from feline skin, ultimately to create artificially engineered skin (including dermis and epidermis) useful for the effective treatment of large cutaneous deficits in cats. Epidermal keratinocytes and dermal fibroblasts were freshly isolated from skin biopsies, obtained with an 8 mm biopsy punch, from 8 healthy cats that had undergone ovariohysterectomy. The owners' consent was obtained. All cats had a complete blood count and serum biochemical analysis and were screened for feline leukemia virus (FeLV) and feline immunodeficiency virus (FIV) preoperatively. The samples were cut into small pieces and incubated with collagenase (2 mg/ml) for 5-6 hours. Following digestion, cutaneous cells were filtered through a 100 μm cell strainer, washed with DMEM, and grown in DMEM supplemented with 10% FBS. The undigested epidermis was washed with DMEM and incubated with a 0.05% trypsin/0.02% EDTA (TE) solution. Keratinocytes recovered in the TE solution were filtered through 100 μm and 40 μm cell strainers and, following washing, were grown on a collagen type I matrix in DMEM:F12 (3:1) medium supplemented with 10% FBS, 1 μM hydrocortisone, 1 μM isoproterenol, and 0.1 μM insulin. Both fibroblasts and keratinocytes were grown in a humidified atmosphere with 5% CO2 at 37°C. The medium was changed twice a week, and cells were cultured up to passage 4. Cells were grown to 70-85% confluency, at which point they were trypsinized and subcultured at a 1:4 dilution. The majority of the cells in each passage were transferred to a freezing medium and stored at -80°C. Fibroblasts were frozen in DMEM supplemented with 30% FBS and 10% DMSO, whereas keratinocytes were frozen in complete keratinocyte growth medium supplemented with 10% DMSO. Both cell types were thawed and successfully grown as described above. We can therefore create a bank of fibroblasts and keratinocytes from which cells can be recovered for further culture and used for the generation of skin equivalents in vitro. In conclusion, cutaneous cell isolation, culture, and expansion were successfully developed. To the authors' best knowledge, this is the first study reporting the isolation and culture of keratinocytes and fibroblasts from feline skin. However, these are preliminary results, and thus the development of autologous engineered feline skin is still in progress.

Keywords: cat, fibroblasts, keratinocytes, skin equivalent, wound

Procedia PDF Downloads 90