Search results for: data optimization
26183 A Modular Solution for Large-Scale Critical Industrial Scheduling Problems with Coupling of Other Optimization Problems
Authors: Ajit Rai, Hamza Deroui, Blandine Vacher, Khwansiri Ninpan, Arthur Aumont, Francesco Vitillo, Robert Plana
Abstract:
Large-scale critical industrial scheduling problems are based on the Resource-Constrained Project Scheduling Problem (RCPSP) and necessitate integration with other optimization problems (e.g., vehicle routing, supply chain, or unique industrial ones), thus requiring practical solutions (i.e., modular and computationally efficient, with feasible solutions). To the best of our knowledge, the current industrial state of the art does not address this holistic problem. We propose an original modular solution that answers the issues exhibited by the delivery of complex projects. With three interlinked entities (project, task, resource), each having its own constraints, it uses a greedy heuristic with a dynamic cost function for each task and a situational assessment at each time step. It handles large-scale data and can be easily integrated with other optimization problems, already existing industrial tools, and unique constraints as required by the use case. The solution has been tested and validated by domain experts on three use cases: outage management in Nuclear Power Plants (NPPs), planning of future NPP maintenance operations, and application in the defense industry to supply chain and factory relocation. In the first use case, the solution, in addition to resource availability and the tasks' logical relationships, also integrates several project-specific constraints for outage management, such as handling resource incompatibilities, updating task priorities, pausing tasks in specific circumstances, and adjusting dynamic units of resources. With more than 20,000 tasks and multiple constraints, the solution provides a feasible schedule within 10-15 minutes on a standard computer. This time-effectiveness matches the nature of the problem, which requires several scenario simulations (30-40) before the schedules are finalized.
The second use case is a factory relocation project where production lines must be moved to a new site while ensuring the continuity of their production. This generates the challenge of merging job shop scheduling and the RCPSP with location constraints. Our solution allows the automation of the production tasks while considering the expected production rate. The simulation algorithm manages the use and movement of resources and products to respect a given relocation scenario. The last use case concerns a future maintenance operation in an NPP. The project contains complex and hard constraints, such as a strict Finish-Start precedence relationship (i.e., successor tasks have to start immediately after their predecessors while respecting all constraints), shareable coactivity for managing workspaces, and the requirement of a specific state of "cyclic" resources (which can take multiple states, only one at a time) to perform tasks (which can require unique combinations of several cyclic resources). Our solution satisfies the requirement of minimizing the state changes of cyclic resources coupled with makespan minimization. It solves an instance with 80 cyclic resources and 50 incompatibilities between levels in less than a minute. In conclusion, we propose a fast and feasible modular approach to various industrial scheduling problems, validated by domain experts and compatible with existing industrial tools. This approach can be further enhanced by the use of machine learning techniques on historically repeated tasks to gain further insights for delay risk mitigation measures.
Keywords: deterministic scheduling, optimization coupling, modular scheduling, RCPSP
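As an editorial illustration of the greedy heuristic with a situational assessment at each time step described in this abstract, the following Python sketch schedules tasks under a single renewable resource. The task-tuple format, the static cost used as the dynamic cost function, and the single-capacity model are hypothetical simplifications, not the authors' implementation.

```python
def greedy_schedule(tasks, preds, capacity):
    """Greedy list scheduler: at each time step, reassess which tasks are
    eligible (all predecessors done) and start the cheapest ones that fit
    the remaining resource capacity.
    tasks: {name: (duration, resource_demand, cost)} -- hypothetical format."""
    t, done, running, start = 0, set(), {}, {}
    while len(done) < len(tasks):
        for name in [n for n, end in running.items() if end <= t]:
            done.add(name)
            del running[name]
        used = sum(tasks[n][1] for n in running)
        # situational assessment: eligibility and cost re-evaluated at time t
        ready = sorted((n for n in tasks
                        if n not in done and n not in running
                        and all(p in done for p in preds.get(n, ()))),
                       key=lambda n: tasks[n][2])
        for n in ready:
            dur, demand, _ = tasks[n]
            if used + demand <= capacity:
                start[n] = t
                running[n] = t + dur
                used += demand
        t += 1
    return start
```

For example, with task C depending on A and a resource capacity of 2, A and B start at t=0 and C starts only once A finishes.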
Procedia PDF Downloads 197
26182 Healthcare Big Data Analytics Using Hadoop
Authors: Chellammal Surianarayanan
Abstract:
The healthcare industry generates large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratories, pharmacies, etc. Healthcare data is so large and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data is accumulated, and its different varieties: structured, semi-structured, and unstructured. Despite this complexity, if the trends and patterns that exist within big data are uncovered and analyzed, higher quality healthcare can be provided at lower cost. Hadoop is an open source software framework for distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System, which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. In this paper, an analysis is presented of how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare
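To illustrate the MapReduce model the abstract describes, here is a minimal in-process sketch in Python; real Hadoop distributes the map, shuffle, and reduce phases across a cluster, whereas this toy runs them in one process. The record fields and the diagnosis-count example are hypothetical.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Toy in-process MapReduce: apply the mapper to every record,
    shuffle the emitted (key, value) pairs by key, then reduce each group."""
    groups = defaultdict(list)
    for rec in records:
        for key, value in mapper(rec):
            groups[key].append(value)
    return {k: reducer(k, vs) for k, vs in groups.items()}

# Hypothetical example: count diagnoses across patient records
records = [{"id": 1, "dx": "flu"},
           {"id": 2, "dx": "flu"},
           {"id": 3, "dx": "asthma"}]
counts = map_reduce(records,
                    mapper=lambda r: [(r["dx"], 1)],
                    reducer=lambda k, vs: sum(vs))
# counts == {"flu": 2, "asthma": 1}
```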
Procedia PDF Downloads 413
26181 Development, Optimization and Characterization of Gastroretentive Multiparticulate Drug Delivery System
Authors: Swapnila V. Vanshiv, Hemant P. Joshi, Atul B. Aware
Abstract:
The current study illustrates the formulation of floating microspheres for the gastroretention of Dipyridamole, which shows pH-dependent solubility, with the highest solubility in acidic pH. The formulation involved hollow microsphere preparation using the solvent evaporation technique. The concentrations of the rate-controlling polymer and the hydrophilic polymer, the internal phase ratio, and the stirring speed were optimized to obtain the desired responses, namely release of Dipyridamole, buoyancy of microspheres, and entrapment efficiency of microspheres. In the formulation, the floating microspheres were prepared using ethyl cellulose as the release retardant and HPMC as a low-density hydrophilic swellable polymer. The formulated microspheres were evaluated for physical properties such as particle size and surface morphology by optical microscopy and SEM, as well as for entrapment efficiency, floating behavior, and drug release; the formulation was also evaluated for in vivo gastroretention in rabbits using gamma scintigraphy. The formulation showed 75% drug release up to 10 hr, with an entrapment efficiency of 91% and 88% buoyancy at 10 hr. Gamma scintigraphic studies revealed that the optimized system was retained in the gastric region (stomach) for a prolonged period, i.e., more than 5 hr.
Keywords: Dipyridamole microspheres, gastroretention, HPMC, optimization method
Procedia PDF Downloads 382
26180 Energy Saving Techniques for MIMO Decoders
Authors: Zhuofan Cheng, Qiongda Hu, Mohammed El-Hajjar, Basel Halak
Abstract:
Multiple-input multiple-output (MIMO) systems can allow significantly higher data rates compared to single-antenna-aided systems. They are expected to be a prominent part of the 5G communication standard. However, MIMO decoders suffer from high power consumption. This work presents a design technique to improve the energy efficiency of MIMO systems; this facilitates their use in the next generation of battery-operated communication devices such as mobile phones and tablets. The proposed optimization approach consists of the use of a low-complexity lattice reduction algorithm in combination with an adaptive VLSI implementation. The proposed design has been realized and verified in 65 nm technology. The results show that the proposed design is significantly more energy-efficient than conventional K-best MIMO systems.
Keywords: energy, lattice reduction, MIMO, VLSI
Procedia PDF Downloads 327
26179 Data Disorders in Healthcare Organizations: Symptoms, Diagnoses, and Treatments
Authors: Zakieh Piri, Shahla Damanabi, Peyman Rezaii Hachesoo
Abstract:
Introduction: Healthcare organizations, like other organizations, suffer from a number of disorders such as Business Sponsor Disorder, Business Acceptance Disorder, Cultural/Political Disorder, Data Disorder, etc. As quality in healthcare mostly depends on the quality of data, we aimed to identify data disorders and their symptoms in two teaching hospitals. Methods: Using a self-constructed questionnaire, we asked 20 questions related to the quality and usability of patient data stored in patient records. The research population consisted of 150 managers, physicians, nurses, and medical records staff who were working at the time of the study. We also asked their views about the symptoms of and treatments for any data disorders they mentioned in the questionnaire. The answers were analyzed using qualitative methods. Results: After classifying the answers, we found six main data disorders: incomplete data, missed data, late data, blurred data, manipulated data, and illegible data. The majority of participants believed they had an important role in the treatment of data disorders, while others attributed the disorders to health system problems. Discussion: As clinicians have an important role in producing data, they can easily identify symptoms and disorders of patient data. Health information managers can also play an important role in the early detection of data disorders through proactive monitoring and periodic check-ups of data.
Keywords: data disorders, quality, healthcare, treatment
Procedia PDF Downloads 431
26178 Locomotion Effects of Redundant Degrees of Freedom in Multi-Legged Quadruped Robots
Authors: Hossein Keshavarz, Alejandro Ramirez-Serrano
Abstract:
Energy efficiency and locomotion speed are two key parameters for legged robots; thus, finding ways to improve them is important. This paper proposes a locomotion framework to analyze the energy usage and speed of quadruped robots via a Genetic Algorithm (GA) optimization process. For this, a quadruped robot platform with joint redundancy in its hind legs, which we believe will help multi-legged robots improve their speed and energy consumption, is used. ContinuO, the quadruped robot of interest, has 14 active degrees of freedom (DoFs), including three DoFs for each front leg and, unlike previously developed quadruped robots, four DoFs for each hind leg. ContinuO aims to realize a cost-effective quadruped robot for real-world scenarios with high speeds and the ability to overcome large obstructions. The proposed framework is used to locomote the robot and analyze the energy it consumes at diverse stride lengths and locomotion speeds. The analysis is performed by comparing the obtained results in two modes, with and without the joint redundancy in the robot's hind legs.
Keywords: genetic algorithm optimization, locomotion path planning, quadruped robots, redundant legs
Procedia PDF Downloads 102
26177 Big Data and Analytics in Higher Education: An Assessment of Its Status, Relevance and Future in the Republic of the Philippines
Authors: Byron Joseph A. Hallar, Annjeannette Alain D. Galang, Maria Visitacion N. Gumabay
Abstract:
One of the unique challenges the twenty-first century presents to Philippine higher education is the utilization of Big Data. The higher education system in the Philippines is generating burgeoning amounts of data that can be used to produce the information and knowledge needed for accurate data-driven decision making. This study examines the status, relevance, and future of Big Data and Analytics in Philippine higher education. The insights gained from the study may be relevant to other developing nations similarly situated as the Philippines.
Keywords: big data, data analytics, higher education, republic of the philippines, assessment
Procedia PDF Downloads 347
26176 The Data-Driven Localized Wave Solution of the Fokas-Lenells Equation Using Physics-Informed Neural Network
Authors: Gautam Kumar Saharia, Sagardeep Talukdar, Riki Dutta, Sudipta Nandy
Abstract:
The physics-informed neural network (PINN) method opens up an approach for numerically solving nonlinear partial differential equations, leveraging the fast computation and high precision of modern computing systems. We construct the PINN based on a strong universal approximation theorem, apply initial-boundary value data and residual collocation points to weakly impose the initial and boundary conditions on the neural network, and choose the optimization algorithms adaptive moment estimation (ADAM) and limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) to optimize the learnable parameters of the neural network. Next, we improve the PINN with a weighted loss function to obtain both the bright and dark soliton solutions of the Fokas-Lenells equation (FLE). We find that the proposed scheme of adjustable weight coefficients in the PINN has a better convergence rate and generalizability than the basic PINN algorithm. We believe that the PINN approach to solving the partial differential equations appearing in nonlinear optics will be useful in studying various optical phenomena.
Keywords: deep learning, optical soliton, physics informed neural network, partial differential equation
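As a sketch of the weighted-loss idea, the composite PINN objective combines the mean-squared PDE residual at collocation points with weighted initial- and boundary-condition terms. The weight values below are illustrative assumptions, not the paper's coefficients, and the residual/error arrays would come from automatic differentiation of the network in a full implementation.

```python
import numpy as np

def weighted_pinn_loss(residual, ic_error, bc_error,
                       w_res=1.0, w_ic=10.0, w_bc=10.0):
    """Composite PINN loss with adjustable weight coefficients:
    mean-squared PDE residual plus weighted initial/boundary terms.
    abs() is used because the FLE field is complex-valued."""
    mse = lambda e: float(np.mean(np.abs(e) ** 2))
    return w_res * mse(residual) + w_ic * mse(ic_error) + w_bc * mse(bc_error)
```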
Procedia PDF Downloads 69
26175 Internet Optimization by Negotiating Traffic Times
Authors: Carlos Gonzalez
Abstract:
This paper describes a system to optimize the use of the internet by clients requiring the downloading of videos at peak hours. The system consists of a web server belonging to a provider of video contents, a provider of internet communications, and a software application running on a client's computer. The client, using the application software, communicates to the video provider a list of the client's future video demands. The video provider calculates which videos are going to be most in demand for download in the immediate future and proceeds to request from the internet provider the most optimal hours to do the downloading. The downloading times are sent to the application software, which uses the information on pre-established hours negotiated between the video provider and the internet provider to download those videos. The videos are saved in a special protected section of the user's hard disk, which is only accessed by the application software on the client's computer. When the client is ready to see a video, the application searches the list of currently existing videos in that area of the hard disk; if the video exists, it is used directly without the need for internet access. We found that the best way to optimize the download traffic of videos is by negotiation between the internet communication provider and the video content provider.
Keywords: internet optimization, video download, future demands, secure storage
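A minimal sketch of the negotiation outcome described above: given a relative link-load forecast per hour from the internet provider and the video provider's demand list, assign each video to the currently least-loaded hour. The data shapes and the per-slot load increment are assumptions for illustration, not the paper's protocol.

```python
def negotiate_download_slots(video_demand, hourly_load, slot_load=0.5):
    """Assign each requested video (most-demanded first) to the hour
    with the lowest current load, then charge that hour for the slot."""
    load = dict(hourly_load)              # hour -> relative link load
    assigned = {}
    for video in sorted(video_demand, key=video_demand.get, reverse=True):
        hour = min(load, key=load.get)    # most idle hour wins
        assigned[video] = hour
        load[hour] += slot_load
    return assigned
```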
Procedia PDF Downloads 135
26174 Data Management and Analytics for Intelligent Grid
Authors: G. Julius P. Roy, Prateek Saxena, Sanjeev Singh
Abstract:
Two decades ago, power distribution utilities would collect data from their customers no more often than once a month. The advent of the smart grid and Advanced Metering Infrastructure (AMI) has subsequently increased the sampling frequency, leading to a 1,000- to 10,000-fold increase in data quantity. This increase is notable, and it steered the coining of the term Big Data in utilities. The power distribution industry is one of the largest handlers of huge and complex data, both for keeping history and for turning the data into insight. As the majority of utilities around the globe adopt smart grid technologies at scale and focus primarily on the strategic interdependence and synergies of the big data coming from new information sources like AMI and intelligent SCADA, there is a rising need for new models of data management and a renewed focus on analytics to dissect data into descriptive, predictive, and prescriptive subsets. The goal of this paper is to bring load disaggregation into the smart energy toolkit for commercial usage.
Keywords: data management, analytics, energy data analytics, smart grid, smart utilities
Procedia PDF Downloads 778
26173 Green Thumb Engineering - Explainable Artificial Intelligence for Managing IoT Enabled Houseplants
Authors: Antti Nurminen, Avleen Malhi
Abstract:
Intelligent systems, which have seen significant progress and exceedingly wide application domains with machine learning as the core technology, are usually opaque, non-intuitive, and complex for human users. We use innovative IoT technology that monitors and analyzes moisture, humidity, luminosity, and temperature levels to assist end users in optimizing the environmental conditions for their houseplants. For plant health monitoring, we construct a system yielding the Normalized Difference Vegetation Index (NDVI), supported by visual validation by users. We run the system for a selected plant, basil, in varying environmental conditions to cater for typical home conditions, and bootstrap our AI with the acquired data. For end users, we implement a web-based user interface that provides both instructions and explanations.
Keywords: explainable artificial intelligence, intelligent agent, IoT, NDVI
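For reference, the NDVI used above for plant health monitoring is the normalized difference of near-infrared and red reflectance; a one-line sketch:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Healthy vegetation reflects strongly in NIR, giving values near +1."""
    return (nir - red) / (nir + red)
```

For example, reflectances of 0.6 (NIR) and 0.2 (red) give an NDVI of 0.5.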
Procedia PDF Downloads 161
26172 Influence of Fermentation Conditions on Humic Acids Production by Trichoderma viride Using an Oil Palm Empty Fruit Bunch as the Substrate
Authors: F. L. Motta, M. H. A. Santana
Abstract:
Humic Acids (HA) were produced by a Trichoderma viride strain under submerged fermentation in a medium based on the oil palm Empty Fruit Bunch (EFB), and the main variables of the process were optimized using response surface methodology. A temperature of 40°C and concentrations of 50 g/L EFB, 5.7 g/L potato peptone, and 0.11 g/L (NH4)2SO4 were the optimum levels of the variables that maximize HA production within the physicochemical and biological limits of the process. The optimized conditions led to an experimental HA concentration of 428.4±17.5 mg/L, which validated the prediction of 412.0 mg/L from the statistical model. This optimization increased the HA production previously reported in the literature about 7-fold. Additionally, the time profiles of HA production and fungal growth confirmed our previous findings that HA production preferably occurs during fungal sporulation. The present study demonstrated that T. viride successfully produced HA via the submerged fermentation of EFB, and the process parameters were successfully optimized using a statistics-based response surface model. To the best of our knowledge, the present work is the first report on the optimization of HA production from EFB by a biotechnological process, whose feasibility was only pointed out in previous works.
Keywords: empty fruit bunch, humic acids, submerged fermentation, Trichoderma viride
Procedia PDF Downloads 305
26171 Privacy Preserving Data Publishing Based on Sensitivity in Context of Big Data Using Hive
Authors: P. Srinivasa Rao, K. Venkatesh Sharma, G. Sadhya Devi, V. Nagesh
Abstract:
Privacy Preserving Data Publication is a major concern at present because the amount of data published through the internet is increasing day by day. This huge amount of data is termed Big Data on account of its size. This project deals with privacy preservation in the context of Big Data using a data warehousing solution called Hive. We implemented Nearest Similarity Based clustering (NSB) with bottom-up generalization to achieve (v,l)-anonymity. (v,l)-Anonymity deals with sensitivity vulnerabilities and ensures individual privacy. We also calculate the sensitivity levels by a simple comparison method using the index values, classifying the different levels of sensitivity. The experiments were carried out in the Hive environment to verify the efficiency of the algorithms with Big Data. This framework also supports the execution of existing algorithms without any changes. The model in the paper outperforms existing models.
Keywords: sensitivity, sensitive level, clustering, Privacy Preserving Data Publication (PPDP), bottom-up generalization, Big Data
Procedia PDF Downloads 293
26170 A Fuzzy Kernel K-Medoids Algorithm for Clustering Uncertain Data Objects
Authors: Behnam Tavakkol
Abstract:
Uncertain data mining algorithms use different ways to consider uncertainty in data, such as representing a data object as a sample of points or as a probability distribution. Fuzzy methods have long been used for clustering traditional (certain) data objects; they are used to produce non-crisp cluster labels. For uncertain data, however, besides some uncertain fuzzy k-medoids algorithms, not many other fuzzy clustering methods have been developed. In this work, we develop a fuzzy kernel k-medoids algorithm for clustering uncertain data objects. The developed fuzzy kernel k-medoids algorithm is superior to existing fuzzy k-medoids algorithms in clustering data sets with non-linearly separable clusters.
Keywords: clustering algorithm, fuzzy methods, kernel k-medoids, uncertain data
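The kernel trick underlying such an algorithm replaces Euclidean distance with the kernel-induced distance d^2(x,c) = k(x,x) - 2k(x,c) + k(c,c), which is what lets k-medoids separate non-linear clusters. Below is a sketch of the fuzzy membership computation under an RBF kernel for certain data points; the fuzzifier m, gamma, and the omission of the paper's uncertain-object handling are all simplifying assumptions.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel between two points."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def fuzzy_memberships(X, medoids, m=2.0, gamma=1.0):
    """Fuzzy membership of each point to each medoid using the
    kernel-induced distance d^2 = k(x,x) - 2k(x,c) + k(c,c)."""
    d2 = np.array([[rbf(x, x, gamma) - 2 * rbf(x, c, gamma) + rbf(c, c, gamma)
                    for c in medoids] for x in X]) + 1e-12  # avoid /0 at medoids
    inv = d2 ** (-1.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)  # each row sums to 1
```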
Procedia PDF Downloads 215
26169 Optimization of Cacao Fermentation in Davao Philippines Using Sustainable Method
Authors: Ian Marc G. Cabugsa, Kim Ryan Won, Kareem Mamac, Manuel Dee, Merlita Garcia
Abstract:
An optimized cacao fermentation technique was developed for the cacao farmers of Davao City, Philippines. Cacao samples with weights ranging from 150-250 kilograms were collected from various cacao farms in Davao City and Zamboanga City, Philippines. Different fermentation techniques were used, varying the design of the sweat box, pre-fermentation conditioning, the number of days of fermentation, and the number of turns. As the beans fermented, their temperature was regularly monitored using a digital thermometer. The resultant cacao beans were assessed by physical and chemical means. For the physical assessment, the bean cut test, bean count test, and sensory test were used. For the chemical assessment, theobromine, caffeine, and antioxidants (as quercetin equivalents) were quantified; theobromine and caffeine were analyzed by HPLC, while the antioxidants were analyzed spectrometrically. To arrive at the best fermentation procedure, the different assessments were given priority coefficients, wherein the physical tests (taste, cut, and bean count) were given priority over the results of the chemical tests. The result of the study was an optimized fermentation protocol that is readily adaptable and transferable to any cacao cooperative or group in Mindanao, or even in the Philippines as a whole.
Keywords: cacao, fermentation, HPLC, optimization, Philippines
Procedia PDF Downloads 448
26168 Democracy Bytes: Interrogating the Exploitation of Data Democracy by Radical Terrorist Organizations
Authors: Nirmala Gopal, Sheetal Bhoola, Audecious Mugwagwa
Abstract:
This paper discusses the continued infringement and exploitation of data by non-state actors for destructive purposes, with an emphasis on radical terrorist organizations. It discusses how terrorist organizations access and use data to foster their nefarious agendas. It further examines how cybersecurity, designed as a tool to curb data exploitation, has been ineffective in allaying global citizens' concerns about how their data can be kept safe and used for its intended purpose. The study interrogates several policies and data protection instruments, such as the Data Protection Act, cybersecurity policies, the Protection of Personal Information (PPI), and the General Data Protection Regulation (GDPR), to understand data use and storage in democratic states. The study outcomes point to the fact that international cybersecurity and cybercrime legislation, policies, and conventions have not curbed violations of data access and use by radical terrorist groups. The study recommends ways to enhance cybersecurity and reduce cyber risks using democratic principles.
Keywords: cybersecurity, data exploitation, terrorist organizations, data democracy
Procedia PDF Downloads 202
26167 Healthcare Data Mining Innovations
Authors: Eugenia Jilinguirian
Abstract:
In the healthcare industry, data mining is essential: it transforms the field by extracting useful information from large datasets. Data mining is the process of applying advanced analytical methods to large patient records and medical histories in order to identify patterns, correlations, and trends. Healthcare professionals can improve diagnostic accuracy, uncover hidden linkages, and predict disease outcomes by carefully examining these patterns. Additionally, data mining supports personalized medicine by tailoring treatment to the unique attributes of each patient. This proactive strategy helps allocate resources more efficiently, enhances patient care, and streamlines operations. However, to apply data mining effectively and protect private healthcare information, issues like data privacy and security must be carefully considered. Data mining continues to be vital in the search for more effective, efficient, and individualized healthcare solutions as technology evolves.
Keywords: data mining, healthcare, big data, individualised healthcare, healthcare solutions, database
Procedia PDF Downloads 64
26166 Summarizing Data Sets for Data Mining by Using Statistical Methods in Coastal Engineering
Authors: Yunus Doğan, Ahmet Durap
Abstract:
Coastal regions are among the areas most heavily used by both natural systems and the growing population. In coastal engineering, the most valuable data concerns wave behavior. The amount of this data becomes very large because observations take place over periods of hours, days, and months. In this study, statistical methods such as wave spectrum analysis and standard statistical methods have been used. The goal of this study is to discover profiles of different coastal areas using these statistical methods and thus to obtain an instance-based data set from the big data for analysis with data mining algorithms. In the experimental studies, six sample data sets on wave behavior, each obtained from 20-minute observations in Mersin Bay, Turkey, were converted to instance-based form, and different clustering techniques from data mining were used to discover similar coastal places. Moreover, this study discusses how this summarization approach can be used in other fields that collect big data, such as medicine.
Keywords: clustering algorithms, coastal engineering, data mining, data summarization, statistical methods
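The summarization step, reducing a long observation window to one instance, can be sketched as below. The chosen features (mean, standard deviation, and significant wave height Hs as the mean of the highest third of waves) are common coastal statistics, though the paper's exact feature set is not specified here.

```python
import statistics

def summarize_window(heights):
    """Condense one 20-minute record of wave heights into a single
    instance: mean, population standard deviation, and significant
    wave height Hs (mean of the highest third of the waves)."""
    ranked = sorted(heights, reverse=True)
    top_third = ranked[:max(1, len(ranked) // 3)]
    return {"mean": statistics.mean(heights),
            "std": statistics.pstdev(heights),
            "Hs": statistics.mean(top_third)}
```

Each window then becomes one row of the instance-based data set fed to the clustering algorithms.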
Procedia PDF Downloads 360
26165 Contrast Media Effects and Radiation Dose Assessment in Contrast Enhanced Computed Tomography
Authors: Buhari Samaila, Sabiu Abdullahi, Buhari Maidamma
Abstract:
Background: Contrast-enhanced computed tomography (CECT) is a technique that uses contrast media to improve image quality and diagnostic accuracy. It is a widely used imaging modality in medical diagnostics, offering high-resolution images for accurate diagnosis. However, concerns regarding the potential adverse effects of contrast media and radiation dose exposure have prompted ongoing investigation, so it is important to assess the effects of contrast media and radiation dose in CECT procedures. Objective: This study aims to assess the effects of contrast media and radiation dose in CECT procedures. Methods: A comprehensive review of the literature was conducted to identify studies related to contrast media effects and radiation dose assessment in CECT. Relevant data, including location, type of research, objective, method, findings, conclusion, authors, and year of publication, were extracted, analyzed, and reported. Results: The findings revealed that several studies have investigated the impacts of contrast media and radiation doses in CECT procedures, with iodinated contrast agents being the most commonly employed. Adverse effects associated with contrast media administration were reported, including allergic reactions, nephrotoxicity, and thyroid dysfunction, albeit at relatively low incidence rates. Additionally, radiation dose levels varied depending on the imaging protocol and anatomical region scanned. Efforts to minimize radiation exposure through optimization techniques were evident across studies. Conclusion: CECT remains an invaluable tool in medical imaging; however, careful consideration of contrast media effects and radiation dose exposure is imperative. Healthcare practitioners should weigh the diagnostic benefits against the potential risks, employing strategies to mitigate adverse effects and optimize radiation dose levels for patient safety and effective diagnosis. Further research is warranted to enhance the understanding and management of contrast media effects and radiation dose optimization in CECT procedures.
Keywords: CT, contrast media, radiation dose, effect of radiation
Procedia PDF Downloads 18
26164 Particle Filter State Estimation Algorithm Based on Improved Artificial Bee Colony Algorithm
Authors: Guangyuan Zhao, Nan Huang, Xuesong Han, Xu Huang
Abstract:
In order to solve the problem of sample impoverishment in the traditional particle filter algorithm and achieve accurate state estimation in a nonlinear system, a particle filter method based on an improved artificial bee colony (ABC) algorithm is proposed. The algorithm simulates the foraging and optimization process of bees and moves particles toward the high-likelihood region of the posterior probability to improve the rationality of the particle distribution. An opposition-based learning (OBL) strategy is introduced to optimize the initial population of the artificial bee colony algorithm. A convergence factor is introduced into the neighborhood search strategy to limit the search range and improve the convergence speed. Finally, the crossover and mutation operations of the genetic algorithm are introduced into the search mechanism of the onlooker (following) bees, which allows the algorithm to escape local extrema quickly and continue searching for the global extremum, improving its optimization ability. The simulation results show that the improved method can improve the estimation accuracy of particle filters, ensure the diversity of particles, and improve the rationality of the particle distribution.
Keywords: particle filter, impoverishment, state estimation, artificial bee colony algorithm
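The opposition-based learning initialization mentioned above can be sketched as follows: draw a random population, form each candidate's opposite lo + hi - x, evaluate both sets, and keep the fitter half. The bounds, fitness function, and fixed seed below are placeholders for illustration.

```python
import numpy as np

def obl_init(n, lo, hi, fitness, rng=None):
    """Opposition-based learning initialization: evaluate each random
    candidate and its opposite point, keep the n best (minimization)."""
    rng = rng or np.random.default_rng(0)
    pop = rng.uniform(lo, hi, size=(n, len(lo)))
    opp = lo + hi - pop                      # opposite points
    both = np.vstack([pop, opp])
    scores = np.array([fitness(x) for x in both])
    return both[np.argsort(scores)[:n]]      # fittest half survives
```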
Procedia PDF Downloads 149
26163 Optimal Design of InGaP/GaAs Heterojunction Solar Cell
Authors: Djaafar F., Hadri B., Bachir G.
Abstract:
We studied mainly the influence of temperature, thickness, molar fraction, and the doping of the various layers (emitter, base, BSF, and window) on the performance of a photovoltaic solar cell. In a first stage, we optimized the performance of the InGaP/GaAs dual-junction solar cell while varying its operating temperature from 275 K to 375 K in increments of 25 K, using the Silvaco TCAD virtual wafer fabrication tool. The optimization at 300 K led to the following results: Icc = 14.22 mA/cm2, Voc = 2.42 V, FF = 91.32%, η = 22.76%, which are close to those found in the literature. In a second stage, we varied the molar fraction of the different layers as well as their thickness and the doping of both emitters and bases, recording the result of each variation until obtaining the optimal efficiency of the proposed solar cell at 300 K, which was Icc = 14.35 mA/cm2, Voc = 2.47 V, FF = 91.34%, and η = 23.33% for an In(1-x)Ga(x)P molar fraction of x = 0.5. Eliminating the BSF layer on the back face of our cell enabled a remarkable improvement in the short-circuit current (Icc = 14.70 mA/cm2) but a decrease in the open-circuit voltage Voc and the efficiency η, which reached 1.46 V and 11.97%, respectively. We could therefore determine the critical parameters of the cell and optimize its various technological parameters to obtain the best performance for a dual-junction solar cell. This work opens new prospects in the photovoltaic field: such structures will simplify the manufacturing processes of the cells and reduce costs while producing high photovoltaic conversion efficiencies.
Keywords: modeling, simulation, multijunction, optimization, silvaco ATLAS
Procedia PDF Downloads 617
26162 Solution to Increase the Produced Power in Micro-Hydro Power Plant
Authors: Radu Pop, Adrian Bot, Vasile Rednic, Emil Bruj, Oana Raita, Liviu Vaida
Abstract:
Our research presents a study concerning the optimization of water flow capture for micro-hydro power plants in order to increase energy production. The fish ladder opening, where the water is captured, is fixed, while the water flow may vary with the river flow; this means that the fish ladder will receive different servitude flows, sometimes more than needed. We propose to demonstrate that the ‘winter intake’ of a micro-hydro power plant can be automated with an intelligent system capable of reading imposed setpoint data and adjusting the flow to the needed value. With this automation concept, we demonstrate that the performance of the micro-hydro power plant can increase, in some flow operating regimes, by approximately 10%.
Keywords: energy, micro-hydro, water intake, fish ladder
Procedia PDF Downloads 232
26161 Improve Closed Loop Performance and Control Signal Using Evolutionary Algorithms Based PID Controller
Authors: Mehdi Shahbazian, Alireza Aarabi, Mohsen Hadiyan
Abstract:
Proportional-Integral-Derivative (PID) controllers are the most widely used controllers in industry because of their simplicity and robustness. Different values of the PID parameters produce different step responses, so an increasing amount of literature is devoted to the proper tuning of PID controllers. The problem merits further investigation because traditional tuning methods can produce large control signals that may damage the system, whereas tuning methods based on evolutionary algorithms improve both the control signal and the closed-loop performance. In this paper, three tuning methods for PID controllers are studied: the traditional Ziegler-Nichols method and two evolutionary-algorithm-based methods, namely the genetic algorithm (GA) and particle swarm optimization (PSO). To examine the validity of the PSO and GA tuning methods, a comparative analysis on a DC motor plant is performed. Simulation results reveal that the evolutionary-algorithm-based tuning methods improve the control signal amplitude and the quality factors of the closed-loop system, such as rise time, integral absolute error (IAE), and maximum overshoot.
Keywords: evolutionary algorithm, genetic algorithm, particle swarm optimization, PID controller
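The PSO side of such a tuning loop can be sketched in a few lines; the first-order plant, gain bounds, and PSO coefficients below are illustrative assumptions standing in for the paper's DC-motor model:

```python
import numpy as np

def iae_cost(gains, dt=0.01, t_end=5.0):
    """Integral absolute error of a PID loop around the toy first-order
    plant x' = -x + u (a stand-in for the paper's DC-motor model)."""
    kp, ki, kd = gains
    x, integ, prev_e, iae = 0.0, 0.0, 1.0, 0.0
    for _ in range(int(t_end / dt)):
        e = 1.0 - x                       # unit-step setpoint
        integ += e * dt
        deriv = (e - prev_e) / dt
        u = kp * e + ki * integ + kd * deriv
        prev_e = e
        x += dt * (-x + u)                # explicit-Euler plant update
        iae += abs(e) * dt
        if not np.isfinite(x) or abs(x) > 1e6:
            return 1e9                    # penalize unstable gain sets
    return iae

def pso_tune(n_particles=20, iters=40, seed=0):
    """Plain global-best PSO over (Kp, Ki, Kd) in [0, 10]^3."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0.0, 10.0, (n_particles, 3))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([iae_cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 3))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 10.0)
        cost = np.array([iae_cost(p) for p in pos])
        better = cost < pbest_cost
        pbest[better], pbest_cost[better] = pos[better], cost[better]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, float(pbest_cost.min())
```

The same cost function can be handed to a GA instead, which is what makes a fair PSO-versus-GA comparison straightforward: only the search loop changes, not the closed-loop simulation.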
Procedia PDF Downloads 479
26160 A Digital Filter for Symmetrical Components Identification
Authors: Khaled M. El-Naggar
Abstract:
This paper presents a fast and efficient technique for monitoring and supervising power system disturbances generated by the dynamic performance of power systems or by faults. Monitoring power system quantities involves monitoring the fundamental voltage and current magnitudes and their frequencies, as well as their negative- and zero-sequence components, under different operating conditions. The proposed technique is based on the simulated annealing (SA) optimization technique. The method uses a digital set of measurements of the voltage or current waveforms at a power system bus to perform the estimation process digitally. The algorithm is tested using different simulated data to monitor the symmetrical components of power system waveforms. Different study cases are considered in this work, and the effects of the number of samples, the sampling frequency, and the sample window size are studied. Results are reported and discussed.
Keywords: estimation, faults, measurement, symmetrical components
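Once the fundamental phasors have been estimated (here via SA), the sequence components follow from the standard Fortescue transform; a minimal sketch, with the estimation step itself assumed already done:

```python
import cmath

def symmetrical_components(va, vb, vc):
    """Fortescue decomposition of three phase phasors into zero-,
    positive-, and negative-sequence components. The paper estimates
    the phasors themselves from sampled waveforms via simulated
    annealing; here the phasors are assumed already estimated."""
    a = cmath.exp(2j * cmath.pi / 3)   # 120-degree rotation operator
    v0 = (va + vb + vc) / 3            # zero sequence
    v1 = (va + a * vb + a * a * vc) / 3    # positive sequence
    v2 = (va + a * a * vb + a * vc) / 3    # negative sequence
    return v0, v1, v2
```

For a balanced positive-sequence set, v1 recovers the phase-A phasor while v0 and v2 vanish; an unbalance or fault shows up as a nonzero negative- or zero-sequence magnitude.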
Procedia PDF Downloads 464
26159 Optimization of Alkali Assisted Microwave Pretreatments of Sorghum Straw for Efficient Bioethanol Production
Authors: Bahiru Tsegaye, Chandrajit Balomajumder, Partha Roy
Abstract:
The limited supply of fossil fuels and their negative environmental consequences are driving researchers to find sustainable sources of energy. Lignocellulosic biomass like sorghum straw is considered a cheap, renewable, and abundantly available source of energy. However, the conversion of lignocellulosic biomass to bioenergy such as bioethanol is hindered by the recalcitrant nature of the lignin in the biomass. Therefore, removal of lignin is a vital step in converting lignocellulose to renewable energy. The aim of this study is to optimize microwave pretreatment conditions using Design-Expert software to remove lignin and to release the maximum possible polysaccharides from sorghum straw for efficient hydrolysis and fermentation. Sodium hydroxide concentrations between 0.5-1.5% v/v, pretreatment times from 5-25 minutes, and pretreatment temperatures from 120-200 °C were considered to depolymerize the sorghum straw. The effect of pretreatment was studied by analyzing the compositional changes before and after pretreatment following the National Renewable Energy Laboratory procedure. Analysis of variance (ANOVA) was used to test the significance of the model used for optimization. About 32.8%-48.27% hemicellulose solubilization, 53%-82.62% cellulose release, and 49.25%-78.29% lignin solubilization were observed during microwave pretreatment. Pretreatment for 10 minutes with an alkali concentration of 1.5% and a temperature of 140 °C released the maximum cellulose and lignin: at this optimal condition, 82.62% cellulose release and 78.29% lignin removal were achieved. Sorghum straw pretreated at the optimal condition was then subjected to enzymatic hydrolysis and fermentation. The efficiency of hydrolysis was measured by analyzing reducing sugars with the 3,5-dinitrosalicylic acid method; about 619 mg of reducing sugars per gram of sorghum straw were obtained after enzymatic hydrolysis. This study showed a significant amount of lignin removal and cellulose release at the optimal condition, which enhances the yield of reducing sugars as well as the ethanol yield, and demonstrates the potential of microwave pretreatment for enhancing bioethanol yield from sorghum straw.
Keywords: cellulose, hydrolysis, lignocellulose, optimization
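A response-surface study over three factors of this kind is typically run on a central composite design; below is a minimal sketch generating a face-centred design for the stated ranges (the design type and run layout are assumptions — Design-Expert offers several):

```python
import itertools

def ccd_face_centred(factors):
    """Face-centred central composite design (axial points on the faces,
    alpha = 1): 2^k factorial corners, 2k axial points, one centre run.
    `factors` maps name -> (low, high); coded levels are -1, 0, +1."""
    names = list(factors)
    def decode(coded):
        return {n: factors[n][0] + (c + 1) / 2 * (factors[n][1] - factors[n][0])
                for n, c in zip(names, coded)}
    runs = [decode(c) for c in itertools.product((-1, 1), repeat=len(names))]
    for i in range(len(names)):            # axial (star) points, one factor at a time
        for s in (-1, 1):
            coded = [0] * len(names)
            coded[i] = s
            runs.append(decode(coded))
    runs.append(decode([0] * len(names)))  # centre point
    return runs

# factor ranges from the abstract: NaOH %, time (min), temperature (deg C)
design = ccd_face_centred({"NaOH": (0.5, 1.5), "time": (5, 25), "temp": (120, 200)})
```

Fitting a quadratic model to the responses measured at these runs, followed by ANOVA, is then the step the abstract describes.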
Procedia PDF Downloads 269
26158 Access to Health Data in Medical Records in Indonesia in Terms of Personal Data Protection Principles: The Limitation and Its Implication
Authors: Anny Retnowati, Elisabeth Sundari
Abstract:
This research aims to elaborate the meaning of personal data protection principles for patient access to health data in medical records in Indonesia, and its implications. The method is normative legal research, examining Indonesian health law regarding the patient's right to access their health data in medical records. The data are analysed qualitatively using the interpretation method to elaborate on how personal data protection principles limit patients' access to their data in medical records. The results show that patients only have the right to obtain copies of their health data in medical records; there is no right to inspect them directly at any time. Indonesian health law thus limits the principle of patients' right to broad access to their health data in medical records. This restriction implies a reduction of personal data protection as part of human rights. This research contributes to showing that limiting personal data protection may violate human rights.
Keywords: access, health data, medical records, personal data, protection
Procedia PDF Downloads 90
26157 Conceptualizing the Knowledge to Manage and Utilize Data Assets in the Context of Digitization: Case Studies of Multinational Industrial Enterprises
Authors: Martin Böhmer, Agatha Dabrowski, Boris Otto
Abstract:
The trend of digitization significantly changes the role of data for enterprises. Data turn from an enabler into an intangible organizational asset that requires management and qualifies as a tradeable good. The idea of a networked economy has gained momentum in the data domain as collaborative approaches to data management emerge. Traditional organizational knowledge consequently needs to be extended by comprehensive knowledge about data. This knowledge is vital for organizations to ensure that data quality requirements are met and that data can be effectively utilized and sovereignly governed. As this specific knowledge has so far received little attention from academics, the aim of the research presented in this paper is to conceptualize it by proposing a “data knowledge model”. Relevant model entities have been identified based on a design science research (DSR) approach that iteratively integrates insights from various industry case studies and literature research.
Keywords: data management, digitization, industry 4.0, knowledge engineering, metamodel
Procedia PDF Downloads 355
26156 Bi-objective Network Optimization in Disaster Relief Logistics
Authors: Katharina Eberhardt, Florian Klaus Kaiser, Frank Schultmann
Abstract:
Last-mile distribution is one of the most critical parts of a disaster relief operation. Various uncertainties, such as infrastructure conditions, resource availability, and fluctuating beneficiary demand, render last-mile distribution challenging in disaster relief operations. The need to balance critical performance criteria like response time, meeting demand and cost-effectiveness further complicates the task. The occurrence of disasters cannot be controlled, and the magnitude is often challenging to assess. In summary, these uncertainties create a need for additional flexibility, agility, and preparedness in logistics operations. As a result, strategic planning and efficient network design are critical for an effective and efficient response. Furthermore, the increasing frequency of disasters and the rising cost of logistical operations amplify the need to provide robust and resilient solutions in this area. Therefore, we formulate a scenario-based bi-objective optimization model that integrates pre-positioning, allocation, and distribution of relief supplies extending the general form of a covering location problem. The proposed model aims to minimize underlying logistics costs while maximizing demand coverage. Using a set of disruption scenarios, the model allows decision-makers to identify optimal network solutions to address the risk of disruptions. We provide an empirical case study of the public authorities’ emergency food storage strategy in Germany to illustrate the potential applicability of the model and provide implications for decision-makers in a real-world setting. Also, we conduct a sensitivity analysis focusing on the impact of varying stockpile capacities, single-site outages, and limited transportation capacities on the objective value. The results show that the stockpiling strategy needs to be consistent with the optimal number of depots and inventory based on minimizing costs and maximizing demand satisfaction. 
The strategy has potential for optimization, as network coverage is insufficient and relies on very high transportation and personnel capacity levels. As such, the model provides decision support for public authorities to determine an efficient stockpiling strategy and distribution network, and provides recommendations for increased resilience. However, certain factors have yet to be considered in this study and should be addressed in future work, such as additional network constraints and heuristic algorithms.
Keywords: humanitarian logistics, bi-objective optimization, pre-positioning, last-mile distribution, decision support, disaster relief networks
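On a toy instance, the cost-versus-coverage trade-off the model navigates can be made concrete by enumerating the Pareto front of a plain covering-location problem; the data and function below are illustrative assumptions, omitting the paper's scenarios, pre-positioning, and allocation decisions:

```python
from itertools import combinations

def pareto_depot_choices(open_cost, covers, demand):
    """Enumerate every depot subset of a toy covering-location instance
    and keep the Pareto front over (total opening cost, covered demand).
    covers[d] is the set of demand points depot d can reach in time."""
    n = len(open_cost)
    candidates = []
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            cost = sum(open_cost[d] for d in subset)
            served = set().union(*(covers[d] for d in subset)) if subset else set()
            candidates.append((cost, sum(demand[p] for p in served), subset))
    # keep solutions not dominated in both objectives by another solution
    front = [c for c in candidates
             if not any(o[0] <= c[0] and o[1] >= c[1] and o[:2] != c[:2]
                        for o in candidates)]
    return sorted(front)
```

For instance, three depots with opening costs [4, 3, 5], coverage sets {0,1}, {1,2}, {0,2,3} and demands [2, 3, 4, 1] yield the front (0, 0), (3, 7), (7, 9), (8, 10), from which a decision-maker picks the preferred cost-coverage compromise; scenario-based disruptions would be layered on top in the full model.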
Procedia PDF Downloads 79
26155 The Magnification of Early Detect Nutrition Case through Local Potential Utilization in Urban Region, Indonesia
Authors: Oktia Woro Kasmini Handayani, Sri Ratna Rahayu, Efa Nugroho, Bertakalswa Hermawati
Abstract:
Indonesia, as a developing country, faces the double burden of malnutrition. The programs implemented so far have not improved nutritional status, so utilizing local potential needs to be considered. The objective of this research was to determine the effectiveness of a scaled-up model of early detection through the utilization of local potential in an urban region: Semarang, Central Java, Indonesia. The research used an experimental design with a combined quantitative-qualitative approach. The population was all toddlers under five within the research region, with a sample of 216 toddlers determined by purposive sampling. Quantitative data were analysed using the effectiveness criteria of Sugiono; qualitative data were analysed using NVivo. Utilizing local potential to improve nutritional status increased the number of nutrition cases found by 225% (very effective), the number of cases treated by 175% (very effective), the number of cases counselled by 200% (effective), and the number of cases showing improvement by 75% (effective). Local potential needs to be utilized in efforts to improve nutrition programs, for instance through community empowerment, particularly with health care providers and higher health education institutions as partners.
Keywords: early detection, nutrition status, local potential, health cadre
Procedia PDF Downloads 272
26154 Intelligent Control of Doubly Fed Induction Generator Wind Turbine for Smart Grid
Authors: Amal A. Hassan, Faten H. Fahmy, Abd El-Shafy A. Nafeh, Hosam K. M. Youssef
Abstract:
Due to the growing penetration of wind energy into the power grid, it is very important to study its interactions with the power system and to provide good control techniques in order to deliver high-quality power. In this paper, an intelligent control methodology is proposed for optimizing the controller parameters of a doubly fed induction generator (DFIG) based wind turbine generation system (WTGS). The genetic algorithm (GA) and particle swarm optimization (PSO) are employed and compared for the adaptive tuning of the parameters of the proposed multiple proportional-integral (PI) controllers of the back-to-back converters of the DFIG-based WTGS. For this purpose, the dynamic model of the WTGS with the DFIG and its associated controllers is presented. Furthermore, the simulation of the system is performed using MATLAB/SIMULINK and the SimPowerSystems toolbox to illustrate the performance of the optimized controllers. Finally, this work is validated on a 33-bus radial test system to show the interaction between wind distributed generation (DG) systems and the distribution network.
Keywords: DFIG wind turbine, intelligent control, distributed generation, particle swarm optimization, genetic algorithm
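The GA half of such a tuning comparison can be sketched on a scalar surrogate; the first-order plant, gain bounds, and GA operators below are illustrative assumptions, with the full WTGS model in SIMULINK taking the plant's place in the paper:

```python
import random

def pi_cost(kp, ki, dt=0.01, steps=500):
    """IAE of one PI loop around the toy plant x' = -x + u."""
    x = integ = iae = 0.0
    for _ in range(steps):
        e = 1.0 - x                    # unit-step reference
        integ += e * dt
        x += dt * (-x + kp * e + ki * integ)
        iae += abs(e) * dt
        if abs(x) > 1e6:
            return 1e9                 # penalize unstable gains
    return iae

def ga_tune(pop_size=30, gens=40, seed=0):
    """Truncation selection, blend crossover, Gaussian mutation on (Kp, Ki)."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda g: pi_cost(*g))
        parents = pop[: pop_size // 2]              # keep the best half (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            pa, pb = rng.sample(parents, 2)
            w = rng.random()                        # blend (arithmetic) crossover
            child = [w * pa[i] + (1 - w) * pb[i] for i in range(2)]
            if rng.random() < 0.2:                  # Gaussian mutation, clipped
                j = rng.randrange(2)
                child[j] = min(10.0, max(0.0, child[j] + rng.gauss(0, 1)))
            children.append(tuple(child))
        pop = parents + children
    return min(pop, key=lambda g: pi_cost(*g))
```

In the paper's setting the cost would be evaluated by simulating the full DFIG model per candidate, and one chromosome would carry the gains of all PI loops of the back-to-back converters at once.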
Procedia PDF Downloads 265