Search results for: estimation algorithms
2607 Implementation of Proof of Work Using Ganache
Authors: Sakshi Singh, Shampa Chakraverty
Abstract:
One of the essential characteristics of Blockchain is the ability to validate the integrity of new transactions added to the Blockchain, a job performed by one of the essential consensus algorithms, Proof of Work. In this work, we implemented the Proof of Work consensus method on a block formed by performing a transaction using Ganache. The primary goal of this implementation is to understand and record how Proof of Work operates in practice on newly created blocks.
Keywords: proof of work, blockchain, ganache, smart contract
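The hash-puzzle form of Proof of Work described above can be sketched as a brute-force nonce search. The block data, difficulty, and SHA-256 hashing below are illustrative stand-ins, not the Ganache setup used in the paper:

```python
import hashlib

def proof_of_work(block_data: str, difficulty: int = 4):
    """Find a nonce whose SHA-256 digest of (data + nonce) starts with
    `difficulty` leading hex zeros -- the classic hash-puzzle form of PoW."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

# "Mining" a toy block: the transaction string here is purely hypothetical
nonce, digest = proof_of_work("tx: A pays B 5 ETH")
```

The asymmetry is the point: finding a valid nonce takes many hash evaluations, while any node can verify it with a single hash.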
Procedia PDF Downloads 166
2606 Uplift Segmentation Approach for Targeting Customers in a Churn Prediction Model
Authors: Shivahari Revathi Venkateswaran
Abstract:
Segmenting customers plays a significant role in churn prediction, helping the marketing team with both proactive and reactive customer retention. For reactive retention, the retention team reaches out to customers who have already shown intent to disconnect and offers them special deals. For proactive retention, the marketing team uses a churn prediction model that ranks each customer from 1 to 100, where rank 1 indicates the highest propensity to churn/disconnect. The churn prediction model is built using XGBoost. With churn ranks alone, however, the marketing team can only reach out to customers based on their individual ranks; profiling different groups of customers and framing marketing strategies for targeted groups is not possible. For that, customers must be grouped into segments based on their profiles, such as demographics and other non-controllable attributes. This helps the marketing team frame offer groups for the targeted audience and prevent them from disconnecting (proactive retention). For segmentation, machine learning approaches like k-means clustering do not form unique customer segments in which all customers share the same attributes. This paper presents an alternate approach that finds all combinations of unique segments that can be formed from the user attributes and then identifies the segments that show uplift (a churn rate higher than the baseline churn rate). For this, search algorithms such as fast search and recursive search are used. Further, within each segment, customers can be targeted using their individual churn ranks from the churn prediction model.
Finally, a user interface (UI) is developed for the marketing team to interactively search for the meaningful segments that are formed, target the right audience for future marketing campaigns, and prevent them from disconnecting.
Keywords: churn prediction modeling, XGBoost model, uplift segments, proactive marketing, search algorithms, retention, k-means clustering
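The segment-search idea can be sketched by exhaustively enumerating attribute-value combinations and keeping those whose churn rate exceeds the baseline. The customer records and attributes below are invented for illustration, and plain enumeration stands in for the paper's fast/recursive search:

```python
from itertools import combinations

# Hypothetical customer records: (attribute dict, churned flag)
customers = [
    ({"region": "east", "plan": "basic"}, 1),
    ({"region": "east", "plan": "basic"}, 1),
    ({"region": "east", "plan": "premium"}, 0),
    ({"region": "west", "plan": "basic"}, 0),
    ({"region": "west", "plan": "premium"}, 0),
    ({"region": "west", "plan": "premium"}, 1),
]

baseline = sum(c for _, c in customers) / len(customers)

def uplift_segments(customers, baseline):
    """Enumerate every attribute-value combination and keep the segments
    whose churn rate exceeds the baseline (i.e. segments with uplift)."""
    keys = sorted(customers[0][0])
    found = {}
    for r in range(1, len(keys) + 1):
        for key_set in combinations(keys, r):
            # group customers by their values on this subset of attributes
            groups = {}
            for attrs, churned in customers:
                sig = tuple(attrs[k] for k in key_set)
                groups.setdefault(sig, []).append(churned)
            for sig, flags in groups.items():
                rate = sum(flags) / len(flags)
                if rate > baseline:
                    found[(key_set, sig)] = rate
    return found

segments = uplift_segments(customers, baseline)
```

Each surviving key, e.g. `(("region",), ("east",))`, is a segment the marketing team could target as a group, while individual churn ranks order the customers inside it.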
Procedia PDF Downloads 71
2605 Genetic Variation in CYP4F2 and VKORC1: Pharmacogenomics Implications for Response to Warfarin
Authors: Zinhle Cindi, Collet Dandara, Mpiko Ntsekhe, Edson Makambwa, Miguel Larceda
Abstract:
Background: Warfarin is the most commonly used drug in the management of thromboembolic disease. However, there is huge variability in the time, number of doses, or starting doses required for patients to achieve the target international normalised ratio (INR), compounded by a narrow therapeutic index. Many genetic-association studies have reported on European and Asian populations, leading to the design of specific algorithms that are now being used to assist in warfarin dosing. However, very few studies have examined the pharmacogenetics of warfarin in African populations, yet huge differences in the dosage required to reach the same INR have been observed. Objective: We set out to investigate the distribution of three SNPs, CYP4F2 c.1347C>T, VKORC1 g.-1639G>A and VKORC1 c.1173C>T, among South African Mixed Ancestry (MA) and Black African patients. Methods: DNA was extracted from 383 participants and subsequently genotyped using PCR/RFLP for the CYP4F2 c.1347 (V433M) (rs2108622), VKORC1 g.-1639 (rs9923231) and VKORC1 c.1173 (rs9934438) SNPs. Results: Comparing the Black and MA groups, significant differences were observed in the distribution of the CYP4F2 c.1347C/T genotype (23% vs. 39%, p=0.03), all VKORC1 g.-1639G>A genotypes (p<0.006), and all VKORC1 c.1173C>T genotypes (p<0.007). Conclusion: CYP4F2 c.1347T (V433M) reduces CYP4F2 protein levels and is therefore expected to affect the amount of warfarin needed to block vitamin K recycling. The VKORC1 g.-1639A variant alters transcriptional regulation, thereby affecting the function of vitamin K epoxide reductase in vitamin K production. The VKORC1 c.1173T variant reduces the enzyme activity of VKORC1, consequently enhancing the effectiveness of warfarin.
These are preliminary results; more genetic characterization is required to understand all the genetic determinants of how patients respond to warfarin.
Keywords: algorithms, pharmacogenetics, thromboembolic disease, warfarin
Procedia PDF Downloads 257
2604 Simulations to Predict Solar Energy Potential by ERA5 Application at North Africa
Authors: U. Ali Rahoma, Nabil Esawy, Fawzia Ibrahim Moursy, A. H. Hassan, Samy A. Khalil, Ashraf S. Khamees
Abstract:
The design of any solar energy conversion system requires knowledge of solar radiation data obtained over a long period. Satellite data have been widely used to estimate solar energy where no ground observation of solar radiation is available, yet there are limitations on the temporal coverage of satellite data. Reanalysis is a "retrospective analysis" of atmospheric parameters, generated by assimilating observation data from various sources, including ground observations, satellites, ships, and aircraft, with the output of NWP (Numerical Weather Prediction) models, to develop an exhaustive record of weather and climate parameters. The performance of the ERA-5 reanalysis dataset for North Africa was evaluated against high-quality surface-measured data using statistical analysis. Global solar radiation (GSR) was estimated over six selected locations in North Africa for the ten-year period 2011 to 2020. The root mean square error (RMSE), mean bias error (MBE), and mean absolute error (MAE) of the reanalysis solar radiation data range from 0.079 to 0.222, 0.0145 to 0.198, and 0.055 to 0.178, respectively. A seasonal statistical analysis was performed to study the seasonal variation in dataset performance, revealing significant variation of errors across seasons; the performance of the dataset also changes with the temporal resolution of the data used for comparison. Monthly mean values show better agreement, but the accuracy of the data is compromised. The ERA-5 solar radiation data are used for preliminary solar resource assessment and power estimation. The correlation coefficient (R²) varies from 0.93 to 0.99 for the different selected sites in the present research.
The goal of this research is to provide a good representation of global solar radiation to support solar energy applications in all fields, using gridded data from the European Centre for Medium-Range Weather Forecasts (ECMWF) and producing a new model that gives good results.
Keywords: solar energy, solar radiation, ERA-5, potential energy
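The three error statistics used in the evaluation above are standard and easy to state in code. The observed and estimated values below are invented placeholders, not the study's data:

```python
def error_stats(observed, estimated):
    """Return (RMSE, MBE, MAE) of an estimate against ground observations.
    MBE keeps the sign of the bias; RMSE penalises large errors more."""
    n = len(observed)
    diffs = [e - o for o, e in zip(observed, estimated)]
    rmse = (sum(d * d for d in diffs) / n) ** 0.5
    mbe = sum(diffs) / n
    mae = sum(abs(d) for d in diffs) / n
    return rmse, mbe, mae

obs = [5.1, 6.0, 7.2, 6.8]   # hypothetical ground-measured GSR values
est = [5.0, 6.3, 7.0, 7.0]   # hypothetical reanalysis estimates
rmse, mbe, mae = error_stats(obs, est)
```

A positive MBE indicates the reanalysis overestimates on average, which RMSE and MAE alone cannot reveal.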
Procedia PDF Downloads 211
2603 A New Graph Theoretic Problem with Ample Practical Applications
Authors: Mehmet Hakan Karaata
Abstract:
In this paper, we first coin a new graph theoretic problem with numerous applications. Second, we provide two algorithms for the problem. The first solution uses a brute-force technique, whereas the second is based on an initial identification of the cycles in the given graph. We then provide a correctness proof of the algorithm. Applications of the problem include graph analysis, graph drawing, and network structuring.
Keywords: algorithm, cycle, graph algorithm, graph theory, network structuring
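The abstract does not specify how cycles are identified; one common building block is DFS with a recursion stack, where a back edge to a node still on the stack closes a cycle. The graph below is a made-up example, not from the paper:

```python
def find_cycle(adj):
    """Return one directed cycle as a list of nodes, or None if acyclic.
    DFS colouring: a back edge to a GRAY (on-stack) node closes a cycle."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {u: WHITE for u in adj}
    path = []

    def dfs(u):
        color[u] = GRAY
        path.append(u)
        for v in adj.get(u, []):
            if color[v] == GRAY:            # back edge: cycle found
                return path[path.index(v):]
            if color[v] == WHITE:
                cycle = dfs(v)
                if cycle:
                    return cycle
        color[u] = BLACK
        path.pop()
        return None

    for u in list(adj):
        if color[u] == WHITE:
            cycle = dfs(u)
            if cycle:
                return cycle
    return None

g = {"a": ["b"], "b": ["c"], "c": ["a", "d"], "d": []}
cycle = find_cycle(g)
```

This runs in O(V + E) time, which is the usual advantage of cycle-based approaches over brute force.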
Procedia PDF Downloads 386
2602 Issues on Optimizing the Structural Parameters of the Induction Converter
Authors: Marinka K. Baghdasaryan, Siranush M. Muradyan, Avgen A. Gasparyan
Abstract:
Analytical expressions of the current and angular errors, as well as the frequency characteristics of an induction converter, are obtained, describing their relation to the converter's structural parameters and its core and winding characteristics. Based on the dependences obtained, a mathematical problem of parametric optimization is formulated that can successfully be used for investigating and diagnosing an induction converter.
Keywords: induction converters, magnetic circuit material, current and angular errors, frequency response, mathematical formulation, structural parameters
Procedia PDF Downloads 345
2601 A Web and Cloud-Based Measurement System Analysis Tool for the Automotive Industry
Authors: C. A. Barros, Ana P. Barroso
Abstract:
Any industrial company needs to determine the amount of variation that exists within its measurement process and guarantee the reliability of its data by studying the performance of its measurement system in terms of linearity, bias, repeatability, reproducibility, and stability. This issue is critical for automotive industry suppliers, who are required to be certified to the IATF 16949:2016 standard of the International Automotive Task Force (which replaces ISO/TS 16949), defining the requirements of a quality management system for companies in the automotive industry. Measurement System Analysis (MSA) is one of its mandatory tools. Frequently, the measurement system in companies is not connected to the equipment and does not incorporate the methods proposed by the Automotive Industry Action Group (AIAG). To address these constraints, an R&D project is in progress whose objective is to develop a web and cloud-based MSA tool. This MSA tool incorporates Industry 4.0 concepts, such as Internet of Things (IoT) protocols to assure the connection with the measuring equipment, cloud computing, artificial intelligence, statistical tools, and advanced mathematical algorithms. This paper presents the preliminary findings of the project. The web and cloud-based MSA tool is innovative because it implements all statistical tests proposed in the MSA-4 reference manual from AIAG, as well as other emerging methods and techniques. As it is integrated with the measuring devices, it reduces manual data input and therefore the associated errors. The tool ensures traceability of all performed tests and can be used in quality laboratories and on production lines. Besides, it monitors MSAs over time, allowing both the analysis of deviations in the variation of the measurements performed and the management of measurement equipment and calibrations. To develop the MSA tool, a ten-step approach was implemented.
First, a benchmarking analysis of current competitors and commercial solutions linked to MSA, within the Industry 4.0 paradigm, was performed. Next, the size of the target market for the MSA tool was analysed. Afterwards, data flow and traceability requirements were analysed in order to implement an IoT data network that interconnects with the equipment, preferably wirelessly. The MSA web solution was designed under UI/UX principles, and an API in Python was developed to run the algorithms and the statistical analysis. Continuous validation of the tool by companies is being performed to assure real-time management of 'big data'. The main results of this R&D project are: the web and cloud-based MSA tool; the Python API; new algorithms for the market; and the UI/UX style guide of the tool. The proposed MSA tool adds value to the state of the art as it ensures an effective response to the new challenges of measurement systems, which are increasingly critical in production processes. Although the automotive industry triggered the development of this innovative MSA tool, other industries would also benefit from it. Currently, companies from the moulds and plastics, chemical, and food industries are already validating it.
Keywords: automotive industry, industry 4.0, Internet of Things, IATF 16949:2016, measurement system analysis
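Two of the MSA quantities named above, bias and repeatability, can be sketched with a Type-1-style gauge study: repeated measurements of a single reference part. The readings and reference value below are hypothetical, and this is only one of the many tests an MSA-4-compliant tool would run:

```python
from statistics import mean, stdev

def bias_and_repeatability(measurements, reference):
    """Bias = mean of repeated readings minus the reference value;
    repeatability = standard deviation of those readings."""
    return mean(measurements) - reference, stdev(measurements)

# Hypothetical: 10 repeated readings of a 10.00 mm reference part
readings = [10.02, 9.98, 10.01, 10.00, 9.99, 10.03, 9.97, 10.00, 10.01, 9.99]
bias, repeatability = bias_and_repeatability(readings, 10.00)
```

In a connected tool the `readings` list would arrive over the IoT link rather than by manual entry, which is exactly the error source the paper aims to remove.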
Procedia PDF Downloads 214
2600 Molecular Diversity of Forensically Relevant Insects from the Cadavers of Lahore
Authors: Sundus Mona, Atif Adnan, Babar Ali, Fareeha Arshad, Allah Rakha
Abstract:
Molecular diversity is the variation in the abundance of species. Forensic entomology is a neglected field in Pakistan: insects collected from a crime scene should be handled by forensic entomologists, who are currently virtually non-existent there. Correct identification of insect specimens, along with knowledge of their biodiversity, can aid in solving many problems related to complicated forensic cases. Inadequate morphological identification and insufficient thermal biology studies limit the utility of entomology in forensic medicine. Recently, molecular identification of entomological evidence has gained attention globally. DNA barcoding is the latest established method for species identification, and only proper identification can provide a precise estimation of post-mortem intervals. Arthropods are known to be the first visitors to scavenge on decomposing dead matter. The objective of the proposed study was to identify species by molecular techniques and analyze their phylogenetic relationship with barcoded necrophagous insect species of early succession on human cadavers. Based upon this identification, the study outcome will be a set of established DNA barcodes for identifying carrion-feeding insect species, supporting concordant estimation of the post-mortem interval. A molecular identification method involving sequencing of a 658 bp 'barcode' fragment of the mitochondrial cytochrome oxidase subunit 1 (CO1) gene from collected specimens of unknown dipteran species from cadavers of Lahore was evaluated. Nucleotide sequence divergences were calculated using MEGA 7 and Arlequin, and a neighbor-joining phylogenetic tree was generated. Three species were identified, Chrysomya megacephala, Chrysomya saffranea, and Chrysomya rufifacies, with low genetic diversity. The fixation index was 0.83992, which suggests a need for further studies to identify and classify forensically relevant insects in Pakistan.
There is an urgent demand for further research, especially when immature forms of arthropods are recovered from a crime scene.
Keywords: molecular diversity, DNA barcoding, species identification, forensically relevant
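The simplest measure behind the divergence figures that tools like MEGA report is the uncorrected p-distance: the proportion of sites that differ between two aligned sequences. The short fragments below are invented toy sequences, far shorter than the 658 bp CO1 barcode:

```python
def p_distance(seq1, seq2):
    """Proportion of differing sites between two aligned sequences
    (uncorrected nucleotide divergence, as used before tree building)."""
    assert len(seq1) == len(seq2), "sequences must be aligned"
    diffs = sum(a != b for a, b in zip(seq1, seq2))
    return diffs / len(seq1)

# Hypothetical short fragments standing in for CO1 barcode reads
a = "ATGGCACTTAGC"
b = "ATGGCTCTTAGT"
d = p_distance(a, b)
```

A matrix of such pairwise distances is the input to neighbor-joining tree construction.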
Procedia PDF Downloads 149
2599 Hydraulic Characteristics of Mine Tailings by Metaheuristics Approach
Authors: Akhila Vasudev, Himanshu Kaushik, Tadikonda Venkata Bharat
Abstract:
A large number of mine tailings are produced every year as part of the extraction process of phosphates, gold, copper, and other materials. Mine tailings are high in water content and have very slow dewatering behavior. The efficient design of tailings dams and the economical disposal of these slurries require knowledge of tailings consolidation behavior. Large-strain consolidation theory closely predicts the self-weight consolidation of these slurries, as the theory accounts for conservation of mass and momentum and treats hydraulic conductivity as a function of void ratio. Classical laboratory techniques, such as the settling column test and the seepage consolidation test, are expensive and time-consuming for estimating the variation of hydraulic conductivity with void ratio. Inverse estimation of the constitutive relationships from measured settlement versus time curves is therefore explored. In this work, inverse analysis based on metaheuristic techniques is explored for predicting the hydraulic conductivity parameters of mine tailings from the base excess pore water pressure dissipation curve and the initial conditions of the tailings. The proposed inverse model uses the particle swarm optimization (PSO) algorithm, which is based on the social behavior of animals searching for food sources. The finite-difference numerical solution of the forward analytical model is integrated with the PSO algorithm to solve the inverse problem. The method is tested on synthetic base excess pore pressure dissipation curves generated using the finite difference method. The effectiveness of the method is verified using a base excess pore pressure dissipation curve obtained from a settling column experiment and further ensured through comparison with available predicted hydraulic conductivity parameters.
Keywords: base excess pore pressure, hydraulic conductivity, large strain consolidation, mine tailings
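The PSO step of the inverse analysis can be sketched in a few dozen lines. A simple quadratic stands in here for the real misfit between simulated and measured pore pressure curves, and the coefficients (w, c1, c2) are common textbook defaults, not the study's settings:

```python
import random

def pso(objective, bounds, n_particles=20, n_iters=100, seed=42):
    """Minimal particle swarm optimiser: each particle is pulled toward
    its personal best and the swarm's global best position."""
    random.seed(seed)
    dim = len(bounds)
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp the new position inside the search bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for the misfit between simulated and measured dissipation curves
best, best_val = pso(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2,
                     bounds=[(-5, 5), (-5, 5)])
```

In the paper's setting, `objective` would run the finite-difference forward model and return the error against the measured dissipation curve.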
Procedia PDF Downloads 136
2598 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring
Authors: Zheng Wang, Zhenhong Li, Jon Mills
Abstract:
Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring towards various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise from processing high temporal-resolution continuous GBSAR data, including the extreme cost of computational random-access-memory (RAM), the delay of displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and it processes continuous GBSAR images unit by unit. Images within a window form a basic unit. By taking this strategy, the RAM requirement is reduced to only one unit of images and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected as it keeps temporarily-coherent pixels which are present only in some certain units but not in the whole observation period. The chain supports real-time processing of the continuous data and the delay of creating displacement maps can be shortened without waiting for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on a stack of images in a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence. 
The temporally averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the level of a few sub-millimetres are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring across a wide range of scientific and practical applications.
Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring
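The coherence estimation mentioned above has a compact standard form: the normalised complex cross-correlation of two acquisitions over an estimation window. The pixel samples below are invented, and this shows only the basic estimator, not the robust variant the package uses:

```python
def coherence(s1, s2):
    """Sample coherence magnitude over an estimation window:
    |sum s1 * conj(s2)| / sqrt(sum |s1|^2 * sum |s2|^2), in [0, 1]."""
    num = sum(a * b.conjugate() for a, b in zip(s1, s2))
    den = (sum(abs(a) ** 2 for a in s1) * sum(abs(b) ** 2 for b in s2)) ** 0.5
    return abs(num) / den

# Hypothetical complex pixel samples from two GBSAR acquisitions
master = [1 + 1j, 2 + 0j, 0 + 1j, 1 - 1j]
slave  = [1 + 1j, 2 + 0j, 0 + 1j, 1 - 1j]   # perfectly coherent window
gamma = coherence(master, slave)
```

Pixels whose coherence stays high only in some units are exactly the temporarily-coherent pixels the first processing chain retains.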
Procedia PDF Downloads 161
2597 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images
Authors: Amit Kumar Happy
Abstract:
This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on infrared (IR) and visual image (VI) fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (infrared) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image, and a thermal infrared camera acquires the thermal source image. In this paper, image fusion algorithms based upon multi-scale transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes the implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of levels for the MST and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the suggested method's validity. Experiments show that the proposed approach is capable of producing good fusion results. While deploying our image fusion approaches, we observed several challenges with popular image fusion methods: although their high computational cost and complex processing steps provide accurate fused results, they are hard to deploy in systems and applications that require real-time operation, high flexibility, and low computational capacity. The methods presented in this paper offer good results with minimal time complexity.
Keywords: image fusion, IR thermal imager, multi-sensor, multi-scale transform
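A common coefficient fusion rule inside MST-based schemes is max-absolute selection: at each position keep the detail coefficient with the larger magnitude, i.e. the stronger edge response. The 3x3 sub-bands below are invented numbers standing in for one decomposition level, and this omits the region-based selection and consistency verification the paper adds:

```python
def fuse_max_abs(coeffs_a, coeffs_b):
    """Coefficient-level fusion: at every position keep the coefficient
    with the larger absolute value (the stronger detail response)."""
    return [[a if abs(a) >= abs(b) else b for a, b in zip(ra, rb)]
            for ra, rb in zip(coeffs_a, coeffs_b)]

# Hypothetical 3x3 detail sub-bands from a visible and a thermal image
vis = [[0.9, 0.1, 0.0],
       [0.2, 0.8, 0.1],
       [0.0, 0.1, 0.7]]
ir  = [[0.1, 0.6, 0.2],
       [0.9, 0.1, 0.0],
       [0.3, 0.0, 0.2]]
fused = fuse_max_abs(vis, ir)
```

The fused sub-band inherits strong edges from whichever modality captured them, which is why the composite image carries information from both sensors.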
Procedia PDF Downloads 115
2596 Neural Reshaping: The Plasticity of Human Brain and Artificial Intelligence in the Learning Process
Authors: Seyed-Ali Sadegh-Zadeh, Mahboobe Bahrami, Sahar Ahmadi, Seyed-Yaser Mousavi, Hamed Atashbar, Amir M. Hajiyavand
Abstract:
This paper presents an investigation into the concept of neural reshaping, which is crucial for achieving strong artificial intelligence through the development of AI algorithms with very high plasticity. By examining the plasticity of both human and artificial neural networks, the study uncovers groundbreaking insights into how these systems adapt to new experiences and situations, ultimately highlighting the potential for creating advanced AI systems that closely mimic human intelligence. The uniqueness of this paper lies in its comprehensive analysis of the neural reshaping process in both human and artificial intelligence systems. This comparative approach enables a deeper understanding of the fundamental principles of neural plasticity, thus shedding light on the limitations and untapped potential of both human and AI learning capabilities. By emphasizing the importance of neural reshaping in the quest for strong AI, the study underscores the need for developing AI algorithms with exceptional adaptability and plasticity. The paper's findings have significant implications for the future of AI research and development. By identifying the core principles of neural reshaping, this research can guide the design of next-generation AI technologies that can enhance human and artificial intelligence alike. These advancements will be instrumental in creating a new era of AI systems with unparalleled capabilities, paving the way for improved decision-making, problem-solving, and overall cognitive performance. In conclusion, this paper makes a substantial contribution by investigating the concept of neural reshaping and its importance for achieving strong AI. 
Through its in-depth exploration of neural plasticity in both human and artificial neural networks, the study unveils vital insights that can inform the development of innovative AI technologies with high adaptability and potential for enhancing human and AI capabilities alike.
Keywords: neural plasticity, brain adaptation, artificial intelligence, learning, cognitive reshaping
Procedia PDF Downloads 52
2595 Analytical Method Development and Validation of a Stability-Indicating RP-HPLC Method for Determination of Atorvastatin and Methylcobalamin
Authors: Alkaben Patel
Abstract:
An easy, rapid, economical, precise, and accurate stability-indicating RP-HPLC method for the simultaneous estimation of atorvastatin and methylcobalamin in their combined dosage form has been developed. The separation was achieved on an LC-20 AT C18 (250 mm × 4.6 mm × 2.6 mm) column with water (pH 3.5):methanol 70:30 as the mobile phase, at a flow rate of 1 ml/min. The detection wavelength for this dosage form is 215 nm. The drug was subjected to stress conditions of hydrolysis, oxidation, photolysis, and thermal degradation.
Keywords: RP-HPLC, atorvastatin, methylcobalamin, method development, validation
Procedia PDF Downloads 336
2594 A Framework of Dynamic Rule Selection Method for Dynamic Flexible Job Shop Problem by Reinforcement Learning Method
Authors: Rui Wu
Abstract:
In the volatile modern manufacturing environment, new orders occur randomly at any time, and pre-emptive methods are infeasible. This calls for a real-time scheduling method that can produce a reasonably good schedule quickly. The dynamic Flexible Job Shop problem is an NP-hard scheduling problem that hybridizes the dynamic Job Shop problem with the Parallel Machine problem. A Flexible Job Shop contains different work centres, each containing parallel machines that can process certain operations. Many algorithms, such as genetic algorithms or simulated annealing, have been proposed to solve static Flexible Job Shop problems. However, the time efficiency of these methods is low, making them infeasible for a dynamic scheduling problem. Therefore, a dynamic rule selection scheduling system based on reinforcement learning is proposed in this research, in which the dynamic Flexible Job Shop problem is divided into several parallel machine problems to decrease its complexity. Firstly, features of jobs, machines, work centres, and flexible job shops are selected to describe the status of the dynamic Flexible Job Shop problem at each decision point in each work centre. Secondly, a reinforcement learning framework using a double-layer deep Q-learning network is applied to select proper composite dispatching rules based on the status of each work centre. Then, based on the selected composite dispatching rule, an available operation is selected from the waiting buffer and assigned to an available machine in each work centre. Finally, the proposed algorithm is compared with well-known dispatching rules on the objectives of mean tardiness, mean flow time, mean waiting time, and mean percentage of waiting time in the real-time Flexible Job Shop problem.
The simulation results show that the proposed framework has reasonable performance and time efficiency.
Keywords: dynamic scheduling problem, flexible job shop, dispatching rules, deep reinforcement learning
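The dispatching-rule layer that the Q-network chooses among can be sketched as a small lookup of rule functions over a work-centre queue. The job fields and rule set below are illustrative (simple rules rather than the paper's composite ones), and the RL layer that picks the rule is omitted:

```python
# Candidate dispatching rules a learned policy might choose among
def spt(queue):  return min(queue, key=lambda j: j["proc_time"])   # shortest processing time
def edd(queue):  return min(queue, key=lambda j: j["due"])         # earliest due date
def fifo(queue): return min(queue, key=lambda j: j["arrival"])     # first in, first out

DISPATCH = {"SPT": spt, "EDD": edd, "FIFO": fifo}

def select_job(queue, rule):
    """Dispatch one job from a work-centre waiting buffer with the chosen rule."""
    return DISPATCH[rule](queue)

# Hypothetical waiting buffer at one work centre
queue = [
    {"id": 1, "proc_time": 5, "due": 20, "arrival": 0},
    {"id": 2, "proc_time": 2, "due": 30, "arrival": 1},
    {"id": 3, "proc_time": 9, "due": 10, "arrival": 2},
]
```

In the proposed framework, the double-layer deep Q-network would output the `rule` argument at each decision point based on the work-centre state features.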
Procedia PDF Downloads 108
2593 Modeling and Mapping of Soil Erosion Risk Using Geographic Information Systems, Remote Sensing, and Deep Learning Algorithms: Case of the Oued Mikkes Watershed, Morocco
Authors: My Hachem Aouragh, Hind Ragragui, Abdellah El-Hmaidi, Ali Essahlaoui, Abdelhadi El Ouali
Abstract:
This study investigates soil erosion susceptibility in the Oued Mikkes watershed, located in the Meknes-Fez region of northern Morocco, utilizing advanced techniques such as deep learning algorithms and remote sensing integrated within Geographic Information Systems (GIS). Spanning approximately 1,920 km², the watershed is characterized by a semi-arid Mediterranean climate with irregular rainfall and limited water resources. The waterways within the watershed, especially the Oued Mikkes, are vital for agricultural irrigation and potable water supply. The research assesses the extent of erosion risk upstream of the Sidi Chahed dam while developing a spatial model of soil loss. Several important factors, including topography, land use/land cover, and climate, were analyzed, with data on slope, NDVI, and rainfall erosivity processed using deep learning models (DLNN, CNN, RNN). The results demonstrated excellent predictive performance, with AUC values of 0.92, 0.90, and 0.88 for DLNN, CNN, and RNN, respectively. The resulting susceptibility maps provide critical insights for soil management and conservation strategies, identifying regions at high risk for erosion across 24% of the study area. The most high-risk areas are concentrated on steep slopes, particularly near the Ifrane district and the surrounding mountains, while low-risk areas are located in flatter regions with less rugged topography. The combined use of remote sensing and deep learning offers a powerful tool for accurate erosion risk assessment and resource management in the Mikkes watershed, highlighting the implications of soil erosion on dam siltation and operational efficiency.
Keywords: soil erosion, GIS, remote sensing, deep learning, Mikkes Watershed, Morocco
Procedia PDF Downloads 19
2592 Use of Magnesium as a Renewable Energy Source
Authors: Rafayel K. Kostanyan
Abstract:
The opportunities for using metallic magnesium as a generator of hydrogen gas, as well as of thermal and electric energy, are presented in this paper. Various schemes of magnesium application are discussed, and the power characteristics of the corresponding devices are presented. An economic estimation of the price of hydrogen obtained by different methods is made, including the use of magnesium as a source of hydrogen for transportation, in comparison with gasoline. Details and prospects of our new inexpensive technology of magnesium production from magnesium hydroxide and magnesium-bearing rocks (which are available worldwide and in Armenia) are analyzed. The threshold cost of Mg production at which the application of this metal in power engineering is economically justified is estimated.
Keywords: energy, electrodialysis, magnesium, new technology
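The hydrogen yield underlying such schemes follows directly from the stoichiometry of magnesium hydrolysis, Mg + 2 H₂O → Mg(OH)₂ + H₂: one mole of H₂ per mole of Mg. The short calculation below is a generic chemistry check, not the paper's economic model:

```python
M_MG = 24.305   # g/mol, magnesium
M_H2 = 2.016    # g/mol, hydrogen gas

def h2_yield_per_kg_mg():
    """Mg + 2 H2O -> Mg(OH)2 + H2 gives one mole of H2 per mole of Mg,
    so return grams of H2 produced per kilogram of magnesium."""
    mol_mg = 1000.0 / M_MG      # moles of Mg in 1 kg
    return mol_mg * M_H2        # grams of H2 (same number of moles)

grams_h2 = h2_yield_per_kg_mg()   # roughly 83 g of H2 per kg of Mg
```

This per-kilogram yield is the physical quantity that the threshold cost of Mg production must be weighed against.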
Procedia PDF Downloads 271
2591 GPU Based Real-Time Floating Object Detection System
Authors: Jie Yang, Jian-Min Meng
Abstract:
A GPU-based floating object detection scheme designed for floating mine detection tasks is presented in this paper. The system uses contrast and motion information to eliminate as many false positives as possible while avoiding false negatives. The GPU computation platform is deployed to allow detecting objects in real time. The experimental results show that, with a certain configuration, the GPU-based scheme can speed up the computation by up to one thousand times compared to the CPU-based scheme.
Keywords: object detection, GPU, motion estimation, parallel processing
Procedia PDF Downloads 474
2590 Interference of Mild Drought Stress on Estimation of Nitrogen Status in Winter Wheat by Some Vegetation Indices
Authors: H. Tavakoli, S. S. Mohtasebi, R. Alimardani, R. Gebbers
Abstract:
Nitrogen (N) is one of the most important agricultural inputs affecting crop growth, yield, and quality in rain-fed cereal production. The N demand of crops varies spatially across fields due to spatial differences in soil conditions. In addition, the response of a crop to fertilizer applications is heavily reliant on plant-available water; matching N supply to water availability is thus essential to achieve an optimal crop response. The objective of this study was to determine the effect of drought stress on the estimation of the nitrogen status of winter wheat by several vegetation indices. During the 2012 growing season, a field experiment was conducted at the Bundessortenamt (German Plant Variety Office) Marquardt experimental station, located in the village of Marquardt about 5 km northwest of Potsdam, Germany (52°27' N, 12°57' E). The experiment was designed as a randomized split block design with two replications. Treatments consisted of four N fertilization rates (0, 60, 120 and 240 kg N ha-1 in total) and two water regimes (irrigated (Irr) and non-irrigated (NIrr)), for a total of 16 plots with dimensions of 4.5 × 9.0 m. The indices were calculated using readings of a spectroradiometer built from tec5 components, the main parts being two "Zeiss MMS1 nir enh" diode-array sensors with a nominal range of 300 to 1150 nm, less than 10 nm resolution, and an effective range of 400 to 1000 nm. The following vegetation indices were calculated: NDVI, GNDVI, SR, MSR, NDRE, RDVI, REIP, SAVI, OSAVI, MSAVI, and PRI. Measurements were conducted throughout the growing season at several growth stages: stem elongation (BBCH 32-41), booting (BBCH 43), inflorescence emergence and heading (BBCH 56-58), flowering (BBCH 65-69), and development of fruit (BBCH 71). According to the results obtained, NDRE and REIP were the indices least affected by drought stress and can provide reliable information on wheat nitrogen status, regardless of the water status of the plant.
They also showed strong relationships with the nitrogen status of winter wheat.
Keywords: nitrogen status, drought stress, vegetation indices, precision agriculture
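Several of the indices named above are simple normalized band ratios. The following minimal sketch shows how NDVI, GNDVI, and NDRE can be computed from canopy reflectance; the reflectance values used are illustrative, not measurements from this study:

```python
def normalized_difference(band_a: float, band_b: float) -> float:
    """Generic normalized-difference index: (a - b) / (a + b)."""
    return (band_a - band_b) / (band_a + band_b)

def ndvi(nir: float, red: float) -> float:
    # Normalized Difference Vegetation Index
    return normalized_difference(nir, red)

def gndvi(nir: float, green: float) -> float:
    # Green NDVI: substitutes the green band for the red band
    return normalized_difference(nir, green)

def ndre(nir: float, red_edge: float) -> float:
    # Normalized Difference Red Edge index, reported in this study as
    # relatively insensitive to drought stress
    return normalized_difference(nir, red_edge)

# Illustrative reflectance fractions for a dense canopy (not study data)
print(round(ndvi(0.50, 0.10), 3))
print(round(ndre(0.50, 0.30), 3))
```

All three indices are bounded in [-1, 1]; only the choice of the second band differs.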
Procedia PDF Downloads 319
2589 Data Mining Model for Predicting the Status of HIV Patients during Drug Regimen Change
Authors: Ermias A. Tegegn, Million Meshesha
Abstract:
Human Immunodeficiency Virus and Acquired Immunodeficiency Syndrome (HIV/AIDS) is a major cause of death in most African countries. Ethiopia is one of the most seriously affected countries in sub-Saharan Africa. Previously in Ethiopia, having HIV/AIDS was almost equivalent to a death sentence. With the introduction of Antiretroviral Therapy (ART), HIV/AIDS has become a chronic but manageable disease. The study focused on a data mining technique to predict the future living status of HIV/AIDS patients at the time of drug regimen change, when patients develop toxicity to the ART drug combination they are currently taking. The data were taken from the University of Gondar Hospital ART program database. A hybrid methodology was followed to explore the application of data mining to the ART program dataset. Data cleaning, handling of missing values, and data transformation were used to preprocess the data. The WEKA 3.7.9 data mining tool, classification algorithms, and domain expertise were utilized as means to address the research problem. Using four different classification algorithms (J48 decision tree, PART rule induction, Naïve Bayes, and neural network) and adjusting their parameters, thirty-two models were built on the pre-processed University of Gondar ART program dataset. The performance of the models was evaluated using the standard metrics of accuracy, precision, recall, and F-measure. The most effective model for predicting the status of HIV patients undergoing drug regimen substitution is a pruned J48 decision tree with a classification accuracy of 98.01%. This study identifies informative attributes such as ever taking Cotrim, ever taking TbRx, CD4 count, age, weight, and gender for predicting the status of drug regimen substitution. The outcome of this study can be used as an assistive tool to help clinicians make more appropriate drug regimen substitutions.
Future research directions are suggested toward an applicable system in the area of the study.
Keywords: HIV drug regimen, data mining, hybrid methodology, predictive model
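The evaluation metrics named in the abstract can be computed directly from a confusion matrix. A minimal, library-free sketch follows; the toy labels are hypothetical and not drawn from the Gondar dataset:

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Accuracy, precision, recall, and F-measure for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return accuracy, precision, recall, f_measure

# Hypothetical toy example: 1 = "alive", 0 = "not alive"
acc, prec, rec, f1 = classification_metrics([1, 1, 0, 0], [1, 0, 0, 0])
```

In practice WEKA reports these same quantities per class after cross-validation; the sketch only shows what the numbers mean.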
Procedia PDF Downloads 142
2588 Effect of Nicorandil, Bone Marrow-Derived Mesenchymal Stem Cells and Their Combination in Isoproterenol-Induced Heart Failure in Rats
Authors: Sarah Elsayed Mohammed, Lamiaa Ahmed Ahmed, Mahmoud Mohammed Khattab
Abstract:
Aim: The aim of the present study was to investigate whether combined nicorandil and bone marrow-derived mesenchymal stem cell (BMDMSC) treatment could offer an additional benefit in ameliorating isoproterenol (ISO)-induced heart failure in rats. Methods: ISO (85 and 170 mg/kg/day) was injected subcutaneously on 2 successive days, respectively. By day 3, electrocardiographic changes were recorded, and serum was separated for determination of the CK-MB level to confirm myocardial damage. Nicorandil (3 mg/kg/day) was then given orally with or without a single i.v. BMDMSC administration. Electrocardiography and echocardiography were recorded 2 weeks after the beginning of treatment. Rats were then sacrificed, and ventricles were isolated for estimation of vascular endothelial growth factor (VEGF), tumor necrosis factor-alpha (TNF-α) and transforming growth factor-beta (TGF-β) contents and caspase-3 activity, as well as inducible nitric oxide synthase (iNOS) and connexin-43 protein expression. Moreover, histological analysis of myocardial fibrosis was performed, and cryosections were prepared for estimation of BMDMSC homing. Results: ISO induced a significant increase in ventricles/body weight ratio, left ventricular end-diastolic (LVEDD) and end-systolic (LVESD) dimensions, ST segment, and QRS duration. Moreover, myocardial fibrosis as well as VEGF, TNF-α and TGF-β contents were significantly increased. On the other hand, connexin-43 protein expression was significantly decreased, while caspase-3 and iNOS protein expression was significantly increased. Combined therapy provided additional improvement compared to cell treatment alone toward reducing cardiac hypertrophy, fibrosis and inflammation. Furthermore, combined therapy induced a significant increase in angiogenesis and BMDMSC homing and prevented ISO-induced changes in iNOS, connexin-43 and caspase-3 protein expression.
Conclusion: Combined nicorandil/BMDMSC treatment was superior to BMDMSC treatment alone in preventing ISO-induced heart failure in rats.
Keywords: fibrosis, isoproterenol, mesenchymal stem cells, nicorandil
Procedia PDF Downloads 532
2587 3D Estimation of Synaptic Vesicle Distributions in Serial Section Transmission Electron Microscopy
Authors: Mahdieh Khanmohammadi, Sune Darkner, Nicoletta Nava, Jens Randel Nyengaard, Jon Sporring
Abstract:
We study the effect of stress on the nervous system using two experimental groups of rats: sham rats and rats subjected to acute foot-shock stress. We investigate synaptic vesicle density as a function of distance to the active zone in serial section transmission electron microscope images, in both 2 and 3 dimensions. By estimating the density in 2D and 3D, we compare the two groups of rats.
Keywords: stress, 3-dimensional synaptic vesicle density, image registration, bioinformatics
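Estimating density as a function of distance to the active zone amounts to binning vesicle distances and normalizing each bin by its size. A schematic sketch with made-up distances follows; in the 3D case one would normalize by shell volume rather than bin width:

```python
def density_profile(distances, bin_width, n_bins):
    """Count vesicles per distance bin and normalize by bin width."""
    counts = [0] * n_bins
    for d in distances:
        i = int(d // bin_width)
        if i < n_bins:          # ignore vesicles beyond the last bin
            counts[i] += 1
    return [c / bin_width for c in counts]

# Made-up vesicle distances to the active zone, in nm (not study data)
distances = [12, 35, 40, 55, 80, 95, 110, 160]
profile = density_profile(distances, bin_width=50.0, n_bins=4)
```

Comparing such profiles between the sham and stressed groups is the kind of analysis the abstract describes.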
Procedia PDF Downloads 278
2586 Integrated Genetic-A* Graph Search Algorithm Decision Model for Evaluating Cost and Quality of School Renovation Strategies
Authors: Yu-Ching Cheng, Yi-Kai Juan, Daniel Castro
Abstract:
Energy consumption of buildings has been an increasing concern for researchers and practitioners in the last decade. Sustainable building renovation can reduce energy consumption and carbon dioxide emissions; meanwhile, it can also extend existing buildings' useful life and facilitate environmental sustainability while providing social and economic benefits to society. School buildings are different from other designed spaces, as they are more crowded and host the largest portion of daily activities and occupants. Strategies that not only reduce energy use but also improve the students' learning environment have become a significant subject in sustainable school building development. A decision model is developed in this study to solve complicated and large-scale combinatorial, discrete and determinate problems such as school renovation projects. The task of this model is to automatically search for the most cost-effective (lower-cost and higher-quality) renovation strategies. In this study, the search for optimal school building renovation solutions is by nature a large-scale zero-one programming determinate problem. A* is suitable for solving deterministic problems due to its stable and effective search process, while genetic algorithms (GA) provide opportunities to acquire globally optimal solutions in a short time via an indeterminate search process based on probability. These two algorithms are combined in this study to consider trade-offs between renovation cost and improved quality, so the decision model is able to evaluate current school environmental conditions and suggest an optimal scheme of sustainable school building renovation strategies. Through adoption of this decision model, school managers can overcome existing limitations and transform school buildings into spaces more beneficial to students and friendly to the environment.
Keywords: decision model, school buildings, sustainable renovation, genetic algorithm, A* search algorithm
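The genetic half of such a model can be sketched as a GA over zero-one vectors that select renovation actions to maximize quality under a budget. The actions, costs, quality scores, and GA settings below are invented for illustration only; the paper's actual model couples the GA with A* graph search:

```python
import random

# Hypothetical renovation actions: (cost, quality gain)
ACTIONS = [(30, 8), (20, 7), (50, 14), (10, 2), (40, 11), (25, 6)]
BUDGET = 100

def fitness(bits):
    cost = sum(c for b, (c, _) in zip(bits, ACTIONS) if b)
    quality = sum(q for b, (_, q) in zip(bits, ACTIONS) if b)
    return quality if cost <= BUDGET else -1   # penalize infeasible selections

def genetic_search(pop_size=30, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in ACTIONS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(ACTIONS))
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < 0.2:              # occasional bit-flip mutation
                i = rng.randrange(len(ACTIONS))
                child[i] = 1 - child[i]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = genetic_search()
```

The zero-one encoding mirrors the "large-scale zero-one programming" framing in the abstract; the penalty keeps the search inside the budget.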
Procedia PDF Downloads 118
2585 National Assessment for Schools in Saudi Arabia: Score Reliability and Plausible Values
Authors: Dimiter M. Dimitrov, Abdullah Sadaawi
Abstract:
The National Assessment for Schools (NAFS) in Saudi Arabia consists of standardized tests in Mathematics, Reading, and Science for school grade levels 3, 6, and 9. One main goal is to classify students into four categories of NAFS performance (minimal, basic, proficient, and advanced) by school and for the entire national sample. NAFS scoring and equating are performed on a bounded scale (D-scale, ranging from 0 to 1) in the framework of the recently developed “D-scoring method of measurement.” The specificity of the NAFS measurement framework and the complexity of the data presented both challenges and opportunities for (a) estimating score reliability for schools, (b) setting cut-scores for the classification of students into categories of performance, and (c) generating plausible values for distributions of student performance on the D-scale. The estimation of score reliability at the school level was performed in the framework of generalizability theory (GT), with students “nested” within schools and test items “nested” within test forms. The GT design was executed via multilevel modeling syntax in R. Cut-scores (on the D-scale) for the classification of students into performance categories were derived via a recently developed method of standard setting, referred to as the “Response Vector for Mastery” (RVM) method. For each school, the classification of students into categories of NAFS performance was based on distributions of plausible values for the students' scores on the NAFS tests by grade level (3, 6, and 9) and subject (Mathematics, Reading, and Science). Plausible values (on the D-scale) for each individual student were generated via random selection from a statistical logit-normal distribution with parameters derived from the student's D-score and its conditional standard error, SE(D).
All procedures related to D-scoring, equating, generating plausible values, and classification of students into performance levels were executed via a computer program in R developed for the purpose of NAFS data analysis.
Keywords: large-scale assessment, reliability, generalizability theory, plausible values
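The plausible-value step can be sketched as drawing from a logit-normal distribution whose parameters come from the student's D-score and conditional standard error. The delta-method mapping from (D, SE(D)) to the logit scale below is an assumption made for illustration, not the operational NAFS procedure (which is implemented in R):

```python
import math
import random

def plausible_values(d_score, se_d, n_draws=5, seed=0):
    """Draw plausible values on the bounded (0, 1) D-scale.

    Assumed mapping: mu is the logit of the D-score; sigma approximates
    SE(D) transferred to the logit scale via the delta method.
    """
    rng = random.Random(seed)
    mu = math.log(d_score / (1.0 - d_score))        # logit of the D-score
    sigma = se_d / (d_score * (1.0 - d_score))      # delta-method approximation
    draws = [rng.gauss(mu, sigma) for _ in range(n_draws)]
    return [1.0 / (1.0 + math.exp(-z)) for z in draws]  # back to (0, 1)

pvs = plausible_values(d_score=0.62, se_d=0.04, n_draws=1000)
```

Every draw stays inside the bounded D-scale by construction, which is the point of using a logit-normal rather than a plain normal distribution.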
Procedia PDF Downloads 19
2584 Revolutionizing Accounting: Unleashing the Power of Artificial Intelligence
Authors: Sogand Barghi
Abstract:
The integration of artificial intelligence (AI) in accounting practices is reshaping the landscape of financial management. This paper explores the innovative applications of AI in the realm of accounting, emphasizing its transformative impact on efficiency, accuracy, decision-making, and financial insights. By harnessing AI's capabilities in data analysis, pattern recognition, and automation, accounting professionals can redefine their roles, elevate strategic decision-making, and unlock unparalleled value for businesses. This paper delves into AI-driven solutions such as automated data entry, fraud detection, predictive analytics, and intelligent financial reporting, highlighting their potential to revolutionize the accounting profession. Artificial intelligence has swiftly emerged as a game-changer across industries, and accounting is no exception. This paper seeks to illuminate the profound ways in which AI is reshaping accounting practices, transcending conventional boundaries, and propelling the profession toward a new era of efficiency and insight-driven decision-making. One of the most impactful applications of AI in accounting is automation. Tasks that were once labor-intensive and time-consuming, such as data entry and reconciliation, can now be streamlined through AI-driven algorithms. This not only reduces the risk of errors but also allows accountants to allocate their valuable time to more strategic and analytical tasks. AI's ability to analyze vast amounts of data in real time enables it to detect irregularities and anomalies that might go unnoticed by traditional methods. Fraud detection algorithms can continuously monitor financial transactions, flagging any suspicious patterns and thereby bolstering financial security. AI-driven predictive analytics can forecast future financial trends based on historical data and market variables. 
This empowers organizations to make informed decisions, optimize resource allocation, and develop proactive strategies that enhance profitability and sustainability. Traditional financial reporting often involves extensive manual effort and data manipulation. With AI, reporting becomes more intelligent and intuitive. Automated report generation not only saves time but also ensures accuracy and consistency in financial statements. While the potential benefits of AI in accounting are undeniable, there are challenges to address. Data privacy and security concerns, the need for continuous learning to keep up with evolving AI technologies, and potential biases within algorithms demand careful attention. The convergence of AI and accounting marks a pivotal juncture in the evolution of financial management. By harnessing the capabilities of AI, accounting professionals can transcend routine tasks, becoming strategic advisors and data-driven decision-makers. The applications discussed in this paper underline the transformative power of AI, setting the stage for an accounting landscape that is smarter, more efficient, and more insightful than ever before. The future of accounting is here, and it's driven by artificial intelligence.
Keywords: artificial intelligence, accounting, automation, predictive analytics, financial reporting
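The fraud-detection idea mentioned above, flagging transactions that deviate from historical patterns, can be sketched as a simple z-score screen. Real systems use far richer features and learned models; the threshold and transaction amounts here are purely illustrative:

```python
import math

def flag_anomalies(amounts, threshold=3.0):
    """Flag transaction amounts more than `threshold` std devs from the mean."""
    n = len(amounts)
    mean = sum(amounts) / n
    std = math.sqrt(sum((a - mean) ** 2 for a in amounts) / n)
    return [i for i, a in enumerate(amounts)
            if std > 0 and abs(a - mean) / std > threshold]

# Illustrative ledger with one outlier planted at the end
amounts = [120.0, 95.5, 130.0, 101.2, 110.0, 99.9, 125.4, 118.3, 104.0,
           112.7, 98.1, 121.9, 105.6, 133.2, 96.4, 109.8, 127.5, 102.3,
           115.0, 108.6, 9500.0]
suspicious = flag_anomalies(amounts)
```

A production fraud monitor would run such screens continuously over incoming transactions, as the abstract suggests, and route flagged items for review.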
Procedia PDF Downloads 71
2583 Advanced Technologies and Algorithms for Efficient Portfolio Selection
Authors: Konstantinos Liagkouras, Konstantinos Metaxiotis
Abstract:
In this paper, we present a classification of the various technologies applied to the solution of the portfolio selection problem according to the discipline and the methodological framework followed. We provide a concise presentation of the emerged categories and try to identify which methods are considered obsolete and which lie at the heart of the debate. On top of that, we provide a comparative study of the different technologies applied to efficient portfolio construction, and we suggest potential paths for future work that lie at the intersection of the presented techniques.
Keywords: portfolio selection, optimization techniques, financial models, stochastic, heuristics
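As a concrete instance of the optimization techniques being surveyed, the classical two-asset minimum-variance portfolio has a closed-form weight. This is textbook mean-variance analysis, not a method proposed by the paper, and the numbers are illustrative:

```python
def min_variance_weight(var_a, var_b, cov_ab):
    """Weight of asset A minimizing portfolio variance (the rest goes to B)."""
    return (var_b - cov_ab) / (var_a + var_b - 2.0 * cov_ab)

def portfolio_variance(w, var_a, var_b, cov_ab):
    # Variance of the portfolio w * A + (1 - w) * B
    return w * w * var_a + (1 - w) ** 2 * var_b + 2 * w * (1 - w) * cov_ab

# Illustrative annualized variances and covariance
var_a, var_b, cov_ab = 0.04, 0.09, 0.01
w = min_variance_weight(var_a, var_b, cov_ab)
```

With more than two assets this becomes a quadratic program, which is where the heuristic and stochastic techniques classified in the paper enter.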
Procedia PDF Downloads 432
2582 Parallel Multisplitting Methods for Differential Systems
Authors: Malika El Kyal, Ahmed Machmoum
Abstract:
We prove the superlinear convergence of asynchronous multisplitting methods applied to differential equations. This study is based on the technique of nested sets, which permits us to specify the kind of convergence obtained in the asynchronous mode. The main characteristic of an asynchronous mode is that the local algorithm does not have to wait for predetermined messages to become available. We allow some processors to communicate more frequently than others, and we allow the communication delays to be substantial and unpredictable. Note that synchronous algorithms in the computer science sense are particular cases of our formulation of asynchronous ones.
Keywords: parallel methods, asynchronous mode, multisplitting, ODE
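The flavor of asynchronous iteration can be illustrated on a small linear system: one component is refreshed every sweep while the other is refreshed only every third sweep from stale values, yet the scheme still converges for this diagonally dominant example. This is a toy sketch, not the paper's differential-system setting:

```python
# Solve A x = b with A = [[4, 1], [1, 3]], b = [1, 2] by Jacobi-style
# updates in which component 1 is updated less frequently (staleness).
def asynchronous_jacobi(steps=60):
    x = [0.0, 0.0]
    for k in range(steps):
        x0_new = (1.0 - 1.0 * x[1]) / 4.0   # updated every sweep
        if k % 3 == 0:                      # updated only every third sweep
            x[1] = (2.0 - 1.0 * x[0]) / 3.0
        x[0] = x0_new
    return x

x = asynchronous_jacobi()
residual = [4 * x[0] + x[1] - 1.0, x[0] + 3 * x[1] - 2.0]
```

The exact solution is (1/11, 7/11); diagonal dominance keeps the error contracting even though one processor works from outdated data, which is the essence of the asynchronous mode studied in the paper.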
Procedia PDF Downloads 526
2581 Sensitivity Analysis in Fuzzy Linear Programming Problems
Authors: S. H. Nasseri, A. Ebrahimnejad
Abstract:
Fuzzy set theory has been applied to many fields, such as operations research, control theory, and management sciences. In this paper, we consider two classes of fuzzy linear programming (FLP) problems: fuzzy number linear programming problems and linear programming problems with trapezoidal fuzzy variables. We state our recently established results and develop fuzzy primal simplex algorithms for solving these problems. Finally, we give illustrative examples.
Keywords: fuzzy linear programming, fuzzy numbers, duality, sensitivity analysis
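A trapezoidal fuzzy number supports component-wise arithmetic and can be compared through a linear ranking function, which is the kind of machinery a fuzzy primal simplex method relies on. The ranking function R = (a + b + c + d) / 4 used below is one common choice, not necessarily the one adopted by the paper:

```python
from dataclasses import dataclass

@dataclass
class Trapezoidal:
    """Trapezoidal fuzzy number (a, b, c, d): support [a, d], core [b, c]."""
    a: float
    b: float
    c: float
    d: float

    def __add__(self, other):
        # Fuzzy addition is component-wise for trapezoidal numbers
        return Trapezoidal(self.a + other.a, self.b + other.b,
                           self.c + other.c, self.d + other.d)

    def rank(self):
        # Linear ranking function used to compare fuzzy quantities
        return (self.a + self.b + self.c + self.d) / 4.0

x = Trapezoidal(1, 2, 3, 4)
y = Trapezoidal(0, 1, 1, 2)
z = x + y  # (1, 3, 4, 6)
```

In a fuzzy simplex step, pivoting decisions compare such ranks instead of crisp coefficients.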
Procedia PDF Downloads 565
2580 Automatic Approach for Estimating the Protection Elements of Electric Power Plants
Authors: Mahmoud Mohammad Salem Al-Suod, Ushkarenko O. Alexander, Dorogan I. Olga
Abstract:
New algorithms using microprocessor systems have been proposed for protecting the diesel-generator unit in autonomous power systems. The software structure is designed to enhance the control automata of the system, in which every protection module of the diesel-generator unit encapsulates a finite state machine.
Keywords: diesel-generator unit, protection, state diagram, control system, algorithm, software components
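An encapsulated protection state machine of the kind described might look like the following sketch. The state names, thresholds, and overcurrent criterion are hypothetical; the abstract does not specify the actual protection logic or setpoints:

```python
from enum import Enum

class State(Enum):
    NORMAL = "normal"
    ALARM = "alarm"
    TRIP = "trip"

class OvercurrentProtection:
    """Toy protection module: a finite state machine driven by current samples."""
    def __init__(self, alarm_amps=400.0, trip_amps=550.0):
        self.alarm_amps = alarm_amps
        self.trip_amps = trip_amps
        self.state = State.NORMAL

    def step(self, current_amps):
        if self.state is State.TRIP:
            return self.state            # a trip latches until manual reset
        if current_amps >= self.trip_amps:
            self.state = State.TRIP
        elif current_amps >= self.alarm_amps:
            self.state = State.ALARM
        else:
            self.state = State.NORMAL
        return self.state

relay = OvercurrentProtection()
states = [relay.step(a) for a in (350, 420, 600, 300)]
```

Encapsulating each protection function as its own state machine, as the abstract describes, keeps the modules independently testable.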
Procedia PDF Downloads 420
2579 Synchronization of Chaotic T-System via Optimal Control as an Adaptive Controller
Authors: Hossein Kheiri, Bashir Naderi, Mohamad Reza Niknam
Abstract:
In this paper, we study the optimal synchronization of the chaotic T-system with completely uncertain parameters. Optimal control laws and parameter estimation rules are obtained by using the Hamilton-Jacobi-Bellman (HJB) technique and the Lyapunov stability theorem. The derived control laws are optimal adaptive controls and make the states of the drive and response systems asymptotically synchronized. Numerical simulation shows the effectiveness and feasibility of the proposed method.
Keywords: Lyapunov stability, synchronization, chaos, optimal control, adaptive control
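Drive-response synchronization can be illustrated on a much simpler chaotic system than the T-system: two logistic maps coupled through the drive state, where sufficiently strong coupling contracts the synchronization error geometrically. This is a pedagogical sketch, not the paper's HJB-based adaptive law:

```python
def logistic(x, r=4.0):
    # Chaotic logistic map on [0, 1] at r = 4
    return r * x * (1.0 - x)

def synchronize(x0=0.3, y0=0.9, k=0.9, steps=50):
    """Drive x evolves freely; response y is pulled toward the drive by k."""
    x, y = x0, y0
    errors = [abs(x - y)]
    for _ in range(steps):
        x_next = logistic(x)
        y_next = (1.0 - k) * logistic(y) + k * logistic(x)  # coupled update
        x, y = x_next, y_next
        errors.append(abs(x - y))
    return errors

errors = synchronize()
```

With k = 0.9 the error shrinks by at least a factor (1 - k) * 4 = 0.4 per step, so the response state converges onto the chaotic drive trajectory; the paper achieves the analogous effect for the T-system with an optimal adaptive control law.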
Procedia PDF Downloads 487
2578 Planning a Haemodialysis Process by Minimum Time Control of Hybrid Systems with Sliding Motion
Authors: Radoslaw Pytlak, Damian Suski
Abstract:
The aim of the paper is to provide a computational tool for planning a haemodialysis process. It is shown that optimization methods can be used to obtain the most effective treatment focused on removing both urea and phosphorus during the process. In order to achieve that, the IV–compartment model of phosphorus kinetics is applied. This kinetics model takes into account a rebound phenomenon that can occur during haemodialysis and results in a hybrid model of the process. Furthermore, vector fields associated with the model equations are such that it is very likely that using the most intuitive objective functions in the planning problem could lead to solutions which include sliding motions. Therefore, building computational tools for solving the problem of planning a haemodialysis process has required constructing numerical algorithms for solving optimal control problems with hybrid systems. The paper concentrates on minimum time control of hybrid systems since this control objective is the most suitable for the haemodialysis process considered in the paper. The presented approach to optimal control problems with hybrid systems is different from the others in several aspects. First of all, it is assumed that a hybrid system can exhibit sliding modes. Secondly, the system’s motion on the switching surface is described by index 2 differential–algebraic equations, and that guarantees accurate tracking of the sliding motion surface. Thirdly, the gradients of the problem’s functionals are evaluated with the help of adjoint equations. The adjoint equations presented in the paper take into account sliding motion and exhibit jump conditions at transition times. The optimality conditions in the form of the weak maximum principle for optimal control problems with hybrid systems exhibiting sliding modes and with piecewise constant controls are stated. The presented sensitivity analysis can be used to construct globally convergent algorithms for solving considered problems. 
The paper presents numerical results of solving the haemodialysis planning problem.
Keywords: haemodialysis planning process, hybrid systems, optimal control, sliding motion
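While the paper's IV-compartment phosphorus model with sliding modes is far richer, the basic idea of planning a minimum treatment time can be seen in the classical single-compartment urea model, where clearance K and distribution volume V give a closed-form time to reach a target concentration. This is a textbook sketch with illustrative parameter values, not the paper's model:

```python
import math

def time_to_target(c0, c_target, clearance, volume):
    """Minimum dialysis time (min) under single-compartment urea kinetics.

    dC/dt = -(K / V) * C  =>  t = (V / K) * ln(C0 / C_target)
    """
    return (volume / clearance) * math.log(c0 / c_target)

def concentration(c0, clearance, volume, t):
    # Concentration remaining after dialysing for time t
    return c0 * math.exp(-(clearance / volume) * t)

# Illustrative values: V = 40 L, K = 0.2 L/min, urea from 100 to 30 mg/dL
t = time_to_target(100.0, 30.0, clearance=0.2, volume=40.0)
```

The rebound phenomenon mentioned in the abstract is exactly what this single-compartment picture misses, which is why the paper needs a hybrid multi-compartment model and numerical optimal control rather than a closed form.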
Procedia PDF Downloads 195