Search results for: data transfer optimization
26861 Stochastic Optimization of a Vendor-Managed Inventory Problem in a Two-Echelon Supply Chain
Authors: Bita Payami-Shabestari, Dariush Eslami
Abstract:
The purpose of this paper is to develop a multi-product economic production quantity model under a vendor-managed inventory policy with restrictions including limited warehouse space, budget, number of orders, average shortage time, and maximum permissible shortage. Since the costs cannot be predicted with certainty, the data are assumed to behave under an uncertain environment. The problem is first formulated as a bi-objective multi-product economic production quantity model. It is then solved with three multi-objective decision-making (MODM) methods, which are compared on the optimal values of the two objective functions and on central processing unit (CPU) time using statistical analysis and multi-attribute decision-making (MADM). The results demonstrate that the augmented ε-constraint method performs better than global criteria and goal programming in terms of the optimal values of the two objective functions and CPU time. Sensitivity analysis is done to illustrate the effect of parameter variations on the optimal solution. The contribution of this research is the use of random cost data in developing a multi-product economic production quantity model under a vendor-managed inventory policy with several constraints.
Keywords: economic production quantity, random cost, supply chain management, vendor-managed inventory
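As background to the model above, the classic single-product EPQ lot-size formula (without the paper's stochastic costs or VMI constraints) can be sketched in a few lines; all numbers below are illustrative, not taken from the paper:

```python
import math

def epq(demand, setup_cost, holding_cost, production_rate):
    """Classic single-product EPQ lot size.

    demand and production_rate must share the same time unit,
    with production_rate > demand so inventory can accumulate.
    """
    rho = demand / production_rate  # fraction of capacity consumed by demand
    return math.sqrt(2 * demand * setup_cost / (holding_cost * (1 - rho)))

# Illustrative numbers only (not from the paper):
q_star = epq(demand=12000, setup_cost=400, holding_cost=2.5, production_rate=48000)
```

The bi-objective stochastic model of the paper layers random costs and several constraints on top of this basic trade-off between setup and holding costs.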
Procedia PDF Downloads 129
26860 Assessment of Modern RANS Models for the C3X Vane Film Cooling Prediction
Authors: Mikhail Gritskevich, Sebastian Hohenstein
Abstract:
The paper presents the results of a detailed assessment of several modern Reynolds-Averaged Navier-Stokes (RANS) turbulence models for the prediction of C3X vane film cooling at various injection regimes. Three models are considered, namely the Shear Stress Transport (SST) model, a modification of the SST model accounting for streamline curvature (SST-CC), and the Explicit Algebraic Reynolds Stress Model (EARSM). It is shown that all the considered models face a problem in predicting the adiabatic effectiveness in the vicinity of the cooling holes; however, accounting for the Reynolds stress anisotropy within the EARSM model noticeably increases the solution accuracy. On the other hand, further downstream all the models provide reasonable agreement with the experimental data for the adiabatic effectiveness, and among the considered models the most accurate results are obtained with the use of EARSM.
Keywords: discrete holes film cooling, Reynolds-Averaged Navier-Stokes (RANS), Reynolds stress tensor anisotropy, turbulent heat transfer
Procedia PDF Downloads 420
26859 Nanobiosensor System for Aptamer Based Pathogen Detection in Environmental Waters
Authors: Nimet Yildirim Tirgil, Ahmed Busnaina, April Z. Gu
Abstract:
Environmental waters are monitored worldwide to protect people from infectious diseases primarily caused by enteric pathogens. Escherichia coli (E. coli) has long served as a good indicator of potential enteric pathogens in waters; thus, a rapid and simple detection method for E. coli is very important for predicting pathogen contamination. In this study, to the best of our knowledge for the first time, we developed a rapid, direct and reusable single-walled carbon nanotube (SWCNT) based biosensor system for sensitive and selective E. coli detection in water samples. We used a novel flexible biosensor device fabricated by a high-rate nanoscale offset printing process using directed assembly and transfer of SWCNTs. By simple directed assembly and non-covalent functionalization, an aptamer-based SWCNT biosensor system (the aptamer is the biorecognition element that specifically distinguishes the E. coli O157:H7 strain from other pathogens) was designed and further evaluated for environmental applications with simple and cost-effective steps. The two gold electrode terminals and the SWCNT bridge between them allow continuous resistance monitoring for E. coli detection. The detection procedure is based on a competitive mode: a known concentration of aptamer and E. coli cells were mixed and, after a certain time, filtered, and the remaining free aptamers were injected into the system. Through hybridization of the free aptamers with their SWCNT surface-immobilized probe DNA (complementary DNA for the E. coli aptamer), we can monitor the resistance difference, which is proportional to the amount of E. coli. Thus, we can detect E. coli without injecting it directly onto the sensing surface, protecting the electrode surface from aggregation of target bacteria or other pollutants that may come from real wastewater samples. After optimization experiments, the linear detection range was determined to be from 2 cfu/ml to 10⁵ cfu/ml with an R² value higher than 0.98.
The system was regenerated successfully with 5% SDS solution over 100 times without any significant deterioration of sensor performance. The developed system had high specificity towards E. coli (less than 20% signal with other pathogens), and it could be applied to real water samples with 86 to 101% recovery and 3 to 18% CV values (n=3).
Keywords: aptamer, E. coli, environmental detection, nanobiosensor, SWCNTs
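The reported linear range and R² can be illustrated with an ordinary least-squares fit of the sensor signal against log-concentration; the calibration points below are hypothetical, not the authors' measurements:

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x, plus R² (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical calibration: log10(cfu/ml) vs. normalised resistance change
conc = [math.log10(c) for c in (2, 10, 1e2, 1e3, 1e4, 1e5)]
signal = [0.05, 0.18, 0.35, 0.52, 0.70, 0.88]
a, b, r2 = fit_line(conc, signal)
```

A fit of this shape over the 2 to 10⁵ cfu/ml range would yield the kind of R² > 0.98 linearity the abstract reports.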
Procedia PDF Downloads 197
26858 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh
Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila
Abstract:
Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. Therefore, it is important for an organization to ensure that the data they use is of high quality. This is where the concept of data mesh comes in. Data mesh is a decentralized organizational and architectural approach to data management that can help organizations improve the quality of data. The concept of data mesh was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing the reliance on centralized data teams and allowing domain experts to take charge of their data. This paper discusses how a set of elements, including data mesh, are tools capable of increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata and thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity in responsibilities. With data mesh, domain experts are responsible for managing their own data, which can help provide clarity in roles and responsibilities and improve data quality. Additionally, data mesh can also contribute to a new form of organization that is more agile and adaptable.
By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can help improve overall performance through better insights enabled by better reporting and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps in identifying and addressing data quality problems quickly, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization. This can lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, like data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts mentioned above are illustrated by experience feedback from AEKIDEN, an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing their experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.
Keywords: data culture, data-driven organization, data mesh, data quality for business success
Procedia PDF Downloads 135
26857 Assessing Carbon Stock and Sequestration of Reforestation Species on Old Mining Sites in Morocco Using the DNDC Model
Authors: Nabil Elkhatri, Mohamed Louay Metougui, Ngonidzashe Chirinda
Abstract:
Mining activities have left a legacy of degraded landscapes, prompting urgent efforts for ecological restoration. Reforestation holds promise as a potent tool to rehabilitate these old mining sites, with the potential to sequester carbon and contribute to climate change mitigation. This study focuses on evaluating the carbon stock and sequestration potential of reforestation species in the context of Morocco's mining areas, employing the DeNitrification-DeComposition (DNDC) model. The research is grounded in recognizing the need to connect theoretical models with practical implementation, ensuring that reforestation efforts are informed by accurate and context-specific data. Field data collection encompasses growth patterns, biomass accumulation, and carbon sequestration rates, establishing an empirical foundation for the study's analyses. By integrating the collected data with the DNDC model, the study aims to provide a comprehensive understanding of carbon dynamics within reforested ecosystems on old mining sites. The major findings reveal varying sequestration rates among different reforestation species, indicating the potential for species-specific optimization of reforestation strategies to enhance carbon capture. This research's significance lies in its potential to contribute to sustainable land management practices and climate change mitigation strategies. By quantifying the carbon stock and sequestration potential of reforestation species, the study serves as a valuable resource for policymakers, land managers, and practitioners involved in ecological restoration and carbon management. Ultimately, the study aligns with global objectives to rejuvenate degraded landscapes while addressing pressing climate challenges.
Keywords: carbon stock, carbon sequestration, DNDC model, ecological restoration, mining sites, Morocco, reforestation, sustainable land management
Procedia PDF Downloads 76
26856 Optimal 3D Deployment and Path Planning of Multiple UAVs for Maximum Coverage and Autonomy
Authors: Indu Chandran, Shubham Sharma, Rohan Mehta, Vipin Kizheppatt
Abstract:
Unmanned aerial vehicles are increasingly being explored as the most promising solution to disaster monitoring, assessment, and recovery. Current relief operations heavily rely on intelligent robot swarms to capture the damage caused, provide timely rescue, and create road maps for the victims. To perform these time-critical missions, efficient path planning that ensures quick coverage of the area is vital. This study aims to develop a technically balanced approach that provides maximum coverage of the affected area in minimum time using the optimal number of UAVs. A coverage trajectory is designed through area decomposition and task assignment. To perform an efficient and autonomous coverage mission, a solution to a TSP-based optimization problem using meta-heuristic approaches is designed to allocate waypoints to UAVs of different flight capacities. The study exploits multi-agent simulations like PX4-SITL and QGroundControl through the ROS framework and visualizes the dynamics of UAV deployment along different search paths in a 3D Gazebo environment. Through detailed theoretical analysis and simulation tests, we illustrate the optimality and efficiency of the proposed methodologies.
Keywords: area coverage, coverage path planning, heuristic algorithm, mission monitoring, optimization, task assignment, unmanned aerial vehicles
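A minimal sketch of the waypoint-ordering idea behind such coverage planning, using a nearest-neighbour TSP heuristic in place of the paper's meta-heuristics (the 2-D waypoints and single-vehicle setting are illustrative assumptions):

```python
import math

def nearest_neighbour_tour(waypoints, start=0):
    """Greedy TSP heuristic: always fly to the closest unvisited waypoint."""
    unvisited = set(range(len(waypoints))) - {start}
    tour = [start]
    while unvisited:
        last = waypoints[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, waypoints[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# Toy 2-D waypoints from a decomposed search area (illustrative only)
pts = [(0, 0), (0, 1), (5, 5), (0, 2), (5, 6)]
order = nearest_neighbour_tour(pts)
```

Meta-heuristics such as genetic algorithms or simulated annealing would then refine tours like this one and split them across UAVs of different flight capacities.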
Procedia PDF Downloads 215
26855 Big Data Analysis with RHadoop
Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim
Abstract:
It is almost impossible to store or analyze big data, which increases exponentially, with traditional technologies. Hadoop is a new technology that makes this possible. The R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop. With RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis with actual data of different sizes. Experimental results showed that our RHadoop system became much faster as the number of data nodes increased. We also compared the performance of RHadoop with the lm function and the biglm package based on bigmemory. The results showed that RHadoop was faster than the other packages owing to parallel processing, with the number of map tasks increasing as the size of the data increases.
Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop
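The reason regression parallelizes so well across data nodes is that each node only needs to emit partial sums, which a reduce step combines into the normal equations. A self-contained sketch of that idea (simple linear regression shown; RHadoop's actual rmr2 interface is not reproduced here):

```python
def map_partials(chunk):
    """Map step: per-node sufficient statistics for y = a + b*x."""
    n = len(chunk)
    sx = sum(x for x, _ in chunk)
    sy = sum(y for _, y in chunk)
    sxx = sum(x * x for x, _ in chunk)
    sxy = sum(x * y for x, y in chunk)
    return n, sx, sy, sxx, sxy

def reduce_fit(partials):
    """Reduce step: combine statistics and solve the normal equations."""
    n = sum(p[0] for p in partials); sx = sum(p[1] for p in partials)
    sy = sum(p[2] for p in partials); sxx = sum(p[3] for p in partials)
    sxy = sum(p[4] for p in partials)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

data = [(float(x), 3.0 + 2.0 * x) for x in range(100)]  # exact line, no noise
chunks = [data[i::4] for i in range(4)]                 # 4 simulated data nodes
a, b = reduce_fit([map_partials(c) for c in chunks])
```

Because the partial sums add exactly, the distributed fit matches the serial one, which is why adding data nodes speeds things up without changing the answer.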
Procedia PDF Downloads 437
26854 A Mutually Exclusive Task Generation Method Based on Data Augmentation
Authors: Haojie Wang, Xun Li, Rui Yin
Abstract:
To solve memorization overfitting in the meta-learning MAML algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutually exclusive (mutex) task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution in the initial dataset. Because generating mutex tasks for all data produces a large number of invalid data and, in the worst case, leads to exponential growth of computation, this paper also proposes a key-data extraction method that extracts only part of the data to generate the mutex task. The experiments show that the method of generating mutually exclusive tasks can effectively solve memorization overfitting in the meta-learning MAML algorithm.
Keywords: data augmentation, mutex task generation, meta-learning, text classification
Procedia PDF Downloads 94
26853 3D Remote Sensing Images Parallax Refining Based On HTML5
Authors: Qian Pei, Hengjian Tong, Weitao Chen, Hai Wang, Yanrong Feng
Abstract:
Horizontal parallax is the foundation of stereoscopic viewing. However, the human eye feels uncomfortable and diplopia occurs if the horizontal parallax is larger than the eye separation. Therefore, we need to refine the parallax before conducting stereoscopic observation. Although some scholars have been devoted to online remote sensing image refining, the main work of image refining is completed on the server side, so there is a significant delay when multiple users access the server at the same time. The emergence of HTML5 technology in recent years makes it possible to develop rich browser web applications. The authors complete the image parallax refining on the browser side based on HTML5, while the server side only needs to transfer image data and a parallax file to the browser side according to the browser's request. In this way, we can greatly reduce the server CPU load, allow a large number of users to access the server in parallel, and respond to users' requests quickly.
Keywords: 3D remote sensing images, parallax, online refining, rich browser web application, HTML5
Procedia PDF Downloads 461
26852 Transformer Fault Diagnostic Predicting Model Using Support Vector Machine with Gradient Descent Optimization
Authors: R. O. Osaseri, A. R. Usiobaifo
Abstract:
The power transformer, which is responsible for voltage transformation, is of great relevance in the power system, and the oil-immersed transformer is widely used all over the world. Prompt and proper maintenance of the transformer is of utmost importance. The dissolved gas content in power transformer oil is of enormous importance in detecting incipient faults of the transformer. There is a need for accurate prediction of incipient faults in transformer oil in order to facilitate prompt maintenance, reduce cost, and minimize error. Fault prediction and diagnostics have been the focus of many researchers, and many previous works have reported the use of artificial intelligence to predict incipient transformer faults. In this study, a machine learning technique was employed using gradient descent algorithms and a Support Vector Machine (SVM) to predict incipient fault diagnoses of transformers. The method focuses on creating a system that improves its performance based on previous results and historical data. The system design approach has two phases: a training phase and a testing phase. The gradient descent algorithm is trained with a training dataset, while the learned algorithm is applied to a set of new data. These two datasets are used to prove the accuracy of the proposed model. In this study, a transformer fault diagnostic model based on SVM and gradient descent algorithms is presented with satisfactory diagnostic capability, predicting incipient transformer faults with a higher success rate than existing diagnostic methods.
Keywords: diagnostic model, gradient descent, machine learning, support vector machine (SVM), transformer fault
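A toy sketch of the training idea, a linear SVM fitted by stochastic (sub)gradient descent on the hinge loss; the two "gas ratio" features and all sample values below are hypothetical stand-ins, not the paper's dissolved-gas data:

```python
import random

def train_linear_svm(X, y, lr=0.01, lam=0.01, epochs=200, seed=0):
    """Linear SVM trained with stochastic subgradient descent on the hinge loss."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    b = 0.0
    idx = list(range(len(X)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            if margin < 1:  # point inside the margin: hinge gradient applies
                w = [wj - lr * (lam * wj - y[i] * xj) for wj, xj in zip(w, X[i])]
                b += lr * y[i]
            else:           # outside the margin: only regularisation shrinks w
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Hypothetical two-feature gas-ratio samples: +1 = incipient fault, -1 = healthy
X = [[2.5, 3.0], [3.0, 2.5], [2.8, 3.2], [0.2, 0.4], [0.5, 0.1], [0.3, 0.3]]
y = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(X, y)
```

In the study's two-phase design, a model like this is fitted on the training dataset and then scored on held-out data to prove accuracy; a kernel SVM would follow the same pattern with a different decision function.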
Procedia PDF Downloads 322
26851 Crude Oil Electrostatic Mathematical Modelling on an Existing Industrial Plant
Authors: Fatemeh Yazdanmehr, Iulian Nistor
Abstract:
The scope of the current study is the prediction of water separation in a two-stage industrial crude oil desalting plant. This research focused on developing the desalting operation in an existing production unit of an Iranian heavy oil field with 75 MBPD capacity. Because of operational issues, such as oil dehydration at high temperatures, optimization of the desalter operational parameters was essential. The desalting process is modeled mathematically based on the population balance method, and existing operational data are used for tuning and validating the accuracy of the model. The inlet oil temperature to the desalter was decreased from 110°C to 80°C, and the desalter electrical field was increased from 0.75 kV to 2.5 kV. The proposed desalter conditions also meet the water-in-oil specification. Under these conditions, oil recovery increases by 574 BBL/D and gas flaring decreases by 2.8 MMSCF/D. Depending on the oil price, the additional oil production can increase annual income by about $15 MM and reduce greenhouse gas emissions caused by gas flaring.
Keywords: desalter, demulsification, modelling, water-oil separation, crude oil emulsion
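The income figure can be sanity-checked with back-of-the-envelope arithmetic; the oil price below is an assumption for illustration, not a number from the paper:

```python
extra_bbl_per_day = 574   # additional oil recovery reported in the abstract
oil_price_usd = 70.0      # assumed price per barrel (hypothetical)
annual_income = extra_bbl_per_day * oil_price_usd * 365
# about 14.7 million USD per year, in line with the quoted "about $15 MM"
```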
Procedia PDF Downloads 76
26850 Monomial Form Approach to Rectangular Surface Modeling
Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong
Abstract:
Geometric modeling plays an important role in the construction and manufacturing of curve, surface and solid models. Their algorithms are critically important not only in the automobile, ship and aircraft manufacturing business, but are also absolutely necessary in a wide variety of modern applications, e.g., robotics, optimization, computer vision, data analytics and visualization. The calculation and display of geometric objects can be accomplished by six techniques: polynomial basis, recursive, iterative, coefficient matrix, polar form approach and pyramidal algorithms. In this research, the coefficient matrix (simply called the monomial form approach) will be used to model polynomial rectangular patches, i.e., Said-Ball, Wang-Ball, DP, Dejdumrong and NB1 surfaces. Some examples of the monomial forms for these surface models are illustrated in many aspects, e.g., construction, derivatives, model transformation, degree elevation and degree reduction.
Keywords: monomial forms, rectangular surfaces, CAGD curves, monomial matrix applications
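The coefficient-matrix (monomial) form amounts to evaluating S(u, v) = U·M·Vᵀ, i.e. a double sum of coefficients against powers of the parameters. A minimal sketch with an illustrative bilinear patch (not one of the Said-Ball/Wang-Ball bases named above):

```python
def eval_monomial_surface(coeff, u, v):
    """Evaluate S(u, v) = sum_ij coeff[i][j] * u**i * v**j (the U.M.V^T form)."""
    return sum(cij * u ** i * v ** j
               for i, row in enumerate(coeff)
               for j, cij in enumerate(row))

# Toy bilinear patch: S(u, v) = 1 + 2u + 3v + 4uv (coefficients illustrative)
M = [[1, 3],
     [2, 4]]
corner = eval_monomial_surface(M, 1.0, 1.0)
```

Conversion between the named bases then reduces to multiplying M by the appropriate basis-change matrices, which is what makes the monomial form convenient for degree elevation and reduction.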
Procedia PDF Downloads 146
26849 Deciphering Electrochemical and Optical Properties of Folic Acid for the Applications of Tissue Engineering and Biofuel Cell
Authors: Sharda Nara, Bansi Dhar Malhotra
Abstract:
Investigation of vitamins as electron transfer mediators could significantly assist in merging the areas of tissue engineering and electronics required for implantable therapeutic devices. The present study reports that molecules of folic acid released by Providencia rettgeri via a fermentation route under the anoxic conditions of a microbial fuel cell (MFC) exhibit characteristic electrochemical and optical properties, as indicated by absorption spectroscopy, photoluminescence (PL), and cyclic voltammetry studies. Absorption spectroscopy depicted an absorption peak at 263 nm with a small bulge around 293 nm on day two of bacterial culture, whereas an additional peak was observed at 365 nm on the twentieth day. Furthermore, the PL spectra indicated that maximum emission occurred at wavelengths of 420, 425, 440, and 445 nm when excited at 310, 325, 350, and 365 nm, respectively. The change of emission spectra with varying excitation wavelength might indicate the presence of tunable optical bands in the folic acid molecules, co-related with the redox activity of the molecules. The cyclic voltammetry studies revealed that oxidation and reduction occurred at 0.25 V and 0.12 V, respectively, indicating the electrochemical behavior of folic acid. It could be inferred that the released folic acid molecules in an MFC might undergo inter- as well as intramolecular electron transfer, forming different intermediate states while transferring electrons to the electrode surface. Synchronization of the electrochemical and optical properties of folic acid molecules could be potentially promising for designing electroactive scaffolds and biocompatible conductive surfaces for the applications of tissue engineering and biofuel cells, respectively.
Keywords: biofuel cell, electroactivity, folic acid, tissue engineering
Procedia PDF Downloads 131
26848 Efficient Positioning of Data Aggregation Point for Wireless Sensor Network
Authors: Sifat Rahman Ahona, Rifat Tasnim, Naima Hassan
Abstract:
Data aggregation is a helpful technique for reducing the data communication overhead in a wireless sensor network. One of the important tasks of data aggregation is the positioning of the aggregator points. A lot of work has been done on data aggregation, but efficient positioning of the aggregator points has not received much focus. In this paper, the authors focus on the positioning, or placement, of the aggregation points in a wireless sensor network. The authors propose an algorithm to select the aggregator positions for a scenario where aggregator nodes are more powerful than sensor nodes.
Keywords: aggregation point, data communication, data aggregation, wireless sensor network
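The abstract does not reproduce the proposed algorithm; as a minimal stand-in for the placement criterion, one could pick among candidate positions the one with the least total distance to the sensors (a 1-median rule, used here as a proxy for communication cost):

```python
import math

def best_aggregation_point(sensors, candidates):
    """Choose the candidate position with minimum total distance to all
    sensors; total Euclidean distance stands in for communication energy."""
    def total_dist(c):
        return sum(math.dist(c, s) for s in sensors)
    return min(candidates, key=total_dist)

sensors = [(0, 0), (0, 2), (2, 0), (2, 2)]
candidates = [(0, 0), (1, 1), (3, 3)]   # e.g. positions of powerful nodes
best = best_aggregation_point(sensors, candidates)
```

A real placement algorithm would also weigh residual energy, link quality and hop counts, but the selection loop has this shape.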
Procedia PDF Downloads 158
26847 Spatial Econometric Approaches for Count Data: An Overview and New Directions
Authors: Paula Simões, Isabel Natário
Abstract:
This paper reviews a number of theoretical aspects for implementing an explicit spatial perspective in econometrics for modelling non-continuous data, in general, and count data, in particular. It provides an overview of the several spatial econometric approaches that are available to model data that are collected with reference to location in space, from the classical spatial econometrics approaches to the recent developments on spatial econometrics to model count data, in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework, necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures for different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modeling and analysis of spatial data, in order to look for new possible directions on the processing of count data in a spatial hierarchical Bayesian econometric context.
Keywords: spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data
Procedia PDF Downloads 593
26846 Optimization of Lercanidipine Nanocrystals Using Design of Experiments Approach
Authors: Dolly Gadhiya, Jayvadan Patel, Mihir Raval
Abstract:
Lercanidipine hydrochloride is a calcium channel blocker used for treating angina pectoris and hypertension. Lercanidipine is a BCS Class II drug with poor aqueous solubility, and its absolute bioavailability is very low mainly for this reason. The design and formulation of nanocrystals by the media milling method was the main focus of this study. Preliminary optimization was carried out with a one-factor-at-a-time (OFAT) approach, in which parameters such as the size of the milling beads, amount of zirconium beads, type of stabilizer, concentration of stabilizer, concentration of drug, stirring speed and milling time were optimized on the basis of particle size, polydispersity index and zeta potential. From the OFAT model, different levels of these parameters were selected for a Plackett-Burman design (PBD). The Plackett-Burman design, with 13 runs involving 6 independent variables, was carried out at higher and lower levels. Statistical analysis of the PBD showed that the concentration of stabilizer, concentration of drug and stirring speed have a significant impact on particle size, PDI, zeta potential and saturation solubility. These experimental designs for the preparation of nanocrystals were applied successfully, showing an increase in the aqueous solubility and dissolution rate of lercanidipine hydrochloride.
Keywords: Lercanidipine hydrochloride, nanocrystals, OFAT, Plackett-Burman
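For readers unfamiliar with Plackett-Burman screening, the standard 12-run design (the paper's 13th run is presumably a center point; that is an assumption here) can be generated from the classic cyclic generator row; any 6 of the 11 columns then carry the 6 factors:

```python
# First row of the standard 12-run Plackett-Burman design (Plackett & Burman, 1946)
GEN = [1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1]

def plackett_burman_12():
    """Build the 12-run design: 11 cyclic shifts of GEN plus an all -1 row."""
    rows = [GEN[-k:] + GEN[:-k] for k in range(11)]
    rows.append([-1] * 11)
    return rows

design = plackett_burman_12()
# For 6 factors, use any 6 of the 11 columns; the rest act as dummy columns
six_factor = [row[:6] for row in design]
```

The design's columns are balanced and mutually orthogonal, which is what lets main effects of 6 factors be screened in only 12 runs.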
Procedia PDF Downloads 20626845 A NoSQL Based Approach for Real-Time Managing of Robotics's Data
Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir
Abstract:
This paper deals with the continual progression of data, for which new data management solutions have emerged: NoSQL databases. They span several areas, such as personalization, profile management, big data in real time, content management, catalogs, customer views, mobile applications, the Internet of Things, digital communication and fraud detection. Nowadays, these database management systems are proliferating. They store data very well, and with the trend of big data a new challenge arises: storage demands new structures and methods for managing enterprise data. New intelligent machines in the e-learning sector thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which we focus our tests. The implementation of NoSQL for robotics wrestles all the data robots acquire into usable form, because with ordinary approaches to robotics we face severe limits in managing and finding exact information in real time. Our proposed approach was demonstrated by experimental studies and a running example used as a use case.
Keywords: NoSQL databases, database management systems, robotics, big data
Procedia PDF Downloads 35426844 Recovery of Fried Soybean Oil Using Bentonite as an Adsorbent: Optimization, Isotherm and Kinetics Studies
Authors: Prakash Kumar Nayak, Avinash Kumar, Uma Dash, Kalpana Rayaguru
Abstract:
Soybean oil is one of the most widely consumed cooking oils worldwide. Deep-fat frying of foods at higher temperatures adds a unique flavour, golden brown colour and crispy texture to foods, but it also brings various changes to the oil, such as hydrolysis, oxidation, hydrogenation and thermal alteration. The peroxide value (PV) is one of the most important factors affecting the quality of deep-fat fried oil. Using bentonite as an adsorbent, the PV can be reduced, thereby improving the quality of the soybean oil. In this study, operating parameters such as heating time of the oil (10, 15, 20, 25 and 30 h), contact time (5, 10, 15, 20, 25 h) and concentration of adsorbent (0.25, 0.5, 0.75, 1.0 and 1.25 g/100 ml of oil) were optimized by response surface methodology (RSM), considering the percentage reduction of PV as the response. Adsorption data were analysed by fitting with the Langmuir and Freundlich isotherm models. The results show that the Langmuir model gives the best fit compared to the Freundlich model. The adsorption process was also found to follow a pseudo-second-order kinetic model.
Keywords: bentonite, Langmuir isotherm, peroxide value, RSM, soybean oil
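Langmuir isotherm fitting of the kind mentioned above is often done via the linearised form C/q = C/qm + 1/(qm·K), a straight line in C whose slope and intercept give qm and K. A sketch with synthetic data (parameters chosen for illustration, not the study's measurements):

```python
def langmuir_fit(C, q):
    """Fit the linearised Langmuir isotherm C/q = C/qm + 1/(qm*K)
    with ordinary least squares; returns (qm, K)."""
    x = C
    y = [c / qi for c, qi in zip(C, q)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    intercept = my - slope * mx
    qm = 1 / slope
    K = slope / intercept
    return qm, K

# Synthetic equilibrium data generated from qm = 50, K = 0.2 (illustrative)
C = [1.0, 2.0, 5.0, 10.0, 20.0]
q = [50 * 0.2 * c / (1 + 0.2 * c) for c in C]
qm, K = langmuir_fit(C, q)
```

Comparing the R² of this line against the corresponding Freundlich linearisation (log q vs. log C) is one common way the "best fit" conclusion is reached.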
Procedia PDF Downloads 37526843 Experimental and Finite Element Analysis for Mechanics of Soil-Tool Interaction
Authors: A. Armin, R. Fotouhi, W. Szyszkowski
Abstract:
In this paper, a 3-D finite element (FE) investigation of soil-blade interaction is described. The effects of the blade's shape and rake angle are examined both numerically and experimentally. The soil is considered an elastic-plastic granular material with a non-associated Drucker-Prager material model. Contact elements with different properties are used to mimic the soil-blade sliding and soil-soil cutting phenomena. A separation criterion is presented, and a procedure to evaluate the forces acting on the blade is given and discussed in detail. Experimental results were derived from tests using the soil bin facility and instruments at the University of Saskatchewan. During motion of the blade, load cells collect data and send them to a computer. The force signals measured by the load cells were noisy and needed to be filtered. The FE results are compared with the experimental results for verification. This technique can be used in blade shape optimization and the design of blades with more complicated shapes.
Keywords: finite element analysis, experimental results, blade force, soil-blade contact modeling
Procedia PDF Downloads 32026842 Design and Implementation of an AI-Enabled Task Assistance and Management System
Authors: Arun Prasad Jaganathan
Abstract:
In today's dynamic industrial world, traditional task allocation methods often fall short in adapting to evolving operational conditions. This paper introduces an AI-enabled task assistance and management system designed to overcome the limitations of conventional approaches. By using artificial intelligence (AI) and machine learning (ML), the system intelligently interprets user instructions, analyzes tasks, and allocates resources based on real-time data and environmental factors. Additionally, geolocation tracking enables proactive identification of potential delays, ensuring timely interventions. With its transparent reporting mechanisms, the system provides stakeholders with clear insights into task progress, fostering accountability and informed decision-making. The paper presents a comprehensive overview of the system architecture, algorithm, and implementation, highlighting its potential to revolutionize task management across diverse industries.
Keywords: artificial intelligence, machine learning, task allocation, operational efficiency, resource optimization
Procedia PDF Downloads 5926841 Luminescent and Conductive Cathode Buffer Layer for Enhanced Power Conversion Efficiency of Bulk-Heterojunction Solar Cells
Authors: Swati Bishnoi, D. Haranath, Vinay Gupta
Abstract:
In this work, we demonstrate that the power conversion efficiency (PCE) of organic solar cells (OSCs) can be improved significantly by using ZnO doped with aluminum (Al) and europium (Eu) as the cathode buffer layer (CBL). The ZnO:Al,Eu nanoparticle layer has broadband absorption in the ultraviolet (300-400 nm) region. The Al doping contributes to the enhancement in conductivity, whereas the Eu doping significantly improves emission in the visible region. Moreover, this emission overlaps significantly with the absorption range of the polymer poly[N-9′-heptadecanyl-2,7-carbazole-alt-5,5-(4′,7′-di-2-thienyl-2′,1′,3′-benzothiadiazole)] (PCDTBT) and results in enhanced absorption by the active layer and hence a high photocurrent. The PCE increased to 6.8% for the ZnO:Al,Eu CBL as compared to 5.9% for pristine ZnO, in the inverted device configuration ITO/CBL/active layer/MoOx/Al. The active layer comprises a blend of the PCDTBT donor and the [6-6]-phenyl C71 butyric acid methyl ester (PC71BM) acceptor. In the reference device, pristine ZnO has been used as the CBL, whereas in the other one ZnO:Al,Eu has been used. The role of the luminescent CBL is to down-shift the UV light into the visible range that overlaps with the absorption of the PCDTBT polymer, resulting in an energy transfer from ZnO:Al,Eu to the PCDTBT polymer; the enhanced absorption by the active layer is revealed by transient spectroscopy. This enhancement increased the short-circuit current, which contributes to the increased PCE in the device employing the ZnO:Al,Eu CBL. Thus, the luminescent ZnO:Al,Eu nanoparticle CBL has great potential in organic solar cells.
Keywords: cathode buffer layer, energy transfer, organic solar cell, power conversion efficiency
Procedia PDF Downloads 256
26840 Oil Producing Wells Using a Technique of Gas Lift on Prosper Software
Authors: Nikhil Yadav, Shubham Verma
Abstract:
Gas lift is a common technique used to optimize oil production in wells, and Prosper software is a powerful tool for modeling and optimizing gas lift systems in oil wells. This review paper examines the effectiveness of Prosper software in optimizing gas lift systems in oil-producing wells. The literature review identified several studies that demonstrated the use of Prosper software to adjust injection rate, depth, and valve characteristics to optimize gas lift system performance. The results showed that Prosper software can significantly improve production rates and reduce operating costs in oil-producing wells. However, the accuracy of the model depends on the accuracy of the input data, and the cost of Prosper software can be high. Therefore, further research is needed to improve the accuracy of the model and to evaluate the cost-effectiveness of using Prosper software in gas lift system optimization.
Keywords: gas lift, prosper software, injection rate, operating costs, oil-producing wells
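The core trade-off behind gas lift optimization can be sketched numerically: liquid production first rises with the gas injection rate, then falls as excess gas adds friction, so there is an optimal injection rate. The quadratic performance curve and its coefficients below are illustrative assumptions, not output from Prosper.

```python
# Hypothetical sketch of the gas lift performance curve: production rises with
# gas injection rate, then declines as excess gas adds friction, so an optimal
# injection rate exists. The quadratic model and its coefficients are
# illustrative assumptions, not results from the Prosper software.

def production_rate(q_gas):
    """Illustrative performance curve: bbl/day vs injection rate in MMscf/day."""
    return 500 + 400 * q_gas - 100 * q_gas ** 2  # concave: peak at q_gas = 2.0

def best_injection_rate(candidates):
    """Pick the candidate injection rate that maximizes production."""
    return max(candidates, key=production_rate)

rates = [i * 0.25 for i in range(17)]  # scan 0.0 .. 4.0 MMscf/day
optimum = best_injection_rate(rates)
```

Tools like Prosper fit such curves from well and fluid data instead of assuming them, but the scan-and-pick step mirrors how an optimal injection rate is read off the curve.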
Procedia PDF Downloads 88
26839 The Spectroscopic, Molecular Structure and Electrostatic Potential, Polarizability, Hyperpolarizability, and HOMO–LUMO Analysis of Monomeric and Dimeric Structures of N-(2-Methylphenyl)-2-Nitrobenzenesulfonamide
Authors: A. Didaoui, N. Benhalima, M. Elkeurti, A. Chouaih, F. Hamzaoui
Abstract:
The monomer and dimer structures of the title molecule have been obtained from density functional theory (DFT) calculations using the B3LYP method with the 6-31G(d,p) basis set. The optimized geometrical parameters obtained by the B3LYP/6-31G(d,p) method show good agreement with experimental X-ray data. The polarizability and first-order hyperpolarizability of the title molecule were calculated and interpreted. The intermolecular N–H•••O hydrogen bonds are discussed in the dimer structure of the molecule. The vibrational wave numbers and their assignments were examined theoretically using the Gaussian 03 set of quantum chemistry codes. The frontier molecular orbital energies predicted at the B3LYP/6-31G(d,p) level show that charge transfer occurs within the molecule. The frontier molecular orbital calculations clearly show the inverse relationship between the HOMO–LUMO gap and the total static hyperpolarizability. The results also show that the N-(2-Methylphenyl)-2-nitrobenzenesulfonamide molecule may exhibit nonlinear optical (NLO) behavior, as indicated by its non-zero hyperpolarizability values.
Keywords: DFT, Gaussian 03, NLO, N-(2-Methylphenyl)-2-nitrobenzenesulfonamide, polarizability
Procedia PDF Downloads 325
26838 Credit Risk Evaluation Using Genetic Programming
Authors: Ines Gasmi, Salima Smiti, Makram Soui, Khaled Ghedira
Abstract:
Credit risk is considered one of the most important issues for financial institutions, as it provokes great losses for banks. To this end, numerous methods for credit risk evaluation have been proposed. Many evaluation methods are black-box models that cannot adequately reveal the information hidden in the data. However, several works have focused on building transparent rule-based models. For credit risk assessment, generated rules must be not only highly accurate but also highly interpretable. In this paper, we aim to build an accurate and transparent credit risk evaluation model that produces a set of classification rules. In fact, we treat credit risk evaluation as an optimization problem addressed with a genetic programming (GP) algorithm, where the goal is to maximize the accuracy of the generated rules. We evaluate our proposed approach on the German and Australian credit datasets. We compared our findings with some existing works; the results show that the proposed GP approach outperforms the other models.
Keywords: credit risk assessment, rule generation, genetic programming, feature selection
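The idea of evolving interpretable classification rules can be sketched with a toy genetic search. The dataset, the single-threshold rule representation, and the fitness loop below are illustrative assumptions; the paper's GP works on the German and Australian datasets with a much richer rule language.

```python
# Minimal, hypothetical sketch of evolving an interpretable classification rule
# with a genetic-style search, in the spirit of the paper's GP approach. The
# toy dataset and the single-threshold rule representation are illustrative
# assumptions; the authors' actual rule language is richer.
import random

# Toy applicants: (income, debt_ratio) -> 1 = good credit, 0 = bad credit.
DATA = [((55, 0.2), 1), ((60, 0.3), 1), ((48, 0.25), 1), ((70, 0.6), 1),
        ((20, 0.8), 0), ((25, 0.7), 0), ((30, 0.9), 0), ((18, 0.5), 0)]

def accuracy(rule):
    """Fitness: fraction of applicants the rule (feature, threshold) classifies
    correctly, predicting 'good' when the feature value exceeds the threshold."""
    feat, thr = rule
    return sum((x[feat] > thr) == bool(y) for x, y in DATA) / len(DATA)

def evolve(generations=40, pop_size=30, seed=7):
    """Truncation selection plus Gaussian mutation of the threshold; the best
    individual is always retained, so fitness is monotone non-decreasing."""
    rng = random.Random(seed)
    pop = [(rng.randrange(2), rng.uniform(0, 100)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=accuracy, reverse=True)
        parents = pop[: pop_size // 2]
        children = [(f, thr + rng.gauss(0, 5)) for f, thr in parents]
        pop = parents + children
    return max(pop, key=accuracy)

best_rule = evolve()
```

The evolved rule stays human-readable ("good credit if feature value exceeds threshold"), which is the interpretability the paper argues for over black-box scorers.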
Procedia PDF Downloads 353
26837 Real-Time Generative Architecture for Mesh and Texture
Abstract:
In the evolving landscape of physics-based machine learning (PBML), particularly within fluid dynamics and its applications in electromechanical engineering, robot vision, and robot learning, achieving precision and alignment with researchers' specific needs presents a formidable challenge. In response, this work proposes a methodology that integrates neural transformation with a modified smoothed particle hydrodynamics model for generating transformed 3D fluid simulations. This approach is useful for nanoscale science, where the unique and complex behaviors of viscoelastic media demand accurate neurally-transformed simulations for materials understanding and manipulation. In electromechanical engineering, the method enhances the design and functionality of fluid-operated systems, particularly microfluidic devices, contributing to advancements in nanomaterial design, drug delivery systems, and more. The proposed approach also aligns with the principles of PBML, offering advantages such as multi-fluid stylization and consistent particle attribute transfer. This capability is valuable in various fields where the interaction of multiple fluid components is significant. Moreover, the application of neurally-transformed hydrodynamical models extends to manufacturing processes, such as the production of microelectromechanical systems, enhancing efficiency and cost-effectiveness. The system's ability to perform neural transfer on 3D fluid scenes using a deep learning algorithm alongside physical models further adds a layer of flexibility, allowing researchers to tailor simulations to specific needs across scientific and engineering disciplines.
Keywords: physics-based machine learning, robot vision, robot learning, hydrodynamics
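The smoothed particle hydrodynamics model underlying the approach can be illustrated by its density-summation step. The 1-D cubic spline kernel and uniform particle layout below are textbook SPH ingredients used for illustration, not the authors' modified formulation.

```python
# Hypothetical sketch of the density-summation step at the heart of smoothed
# particle hydrodynamics (SPH), the physical model the work builds on. The 1-D
# cubic spline kernel and uniform particle layout are illustrative assumptions.

def cubic_spline_kernel(r, h):
    """Standard 1-D cubic spline smoothing kernel with support radius 2h."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1-D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q ** 2 + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

def density(positions, masses, h):
    """SPH density at each particle: rho_i = sum_j m_j * W(x_i - x_j, h)."""
    return [sum(m_j * cubic_spline_kernel(x_i - x_j, h)
                for x_j, m_j in zip(positions, masses))
            for x_i in positions]

# Uniform line of particles with spacing dx = h: interior densities come out
# close to the true value m / dx, while edge particles show the usual deficit.
xs = [i * 0.1 for i in range(21)]
rhos = density(xs, [0.1] * 21, h=0.1)
```

A neural transformation of such a simulation would operate on exactly these per-particle attributes (density, position, and so on), which is what makes consistent particle attribute transfer possible.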
Procedia PDF Downloads 66
26836 Modeling Activity Pattern Using XGBoost for Mining Smart Card Data
Authors: Eui-Jin Kim, Hasik Lee, Su-Jin Park, Dong-Kyu Kim
Abstract:
Smart-card data are expected to provide information on activity patterns as an alternative to conventional person trip surveys. The focus of this study is to propose a method that uses person trip surveys as training data to supplement smart-card data, which do not contain the purpose of each trip. We selected only features that are also available from smart-card data, such as spatiotemporal information on the trip and geographic information system (GIS) data near the stations, to train the model on the survey data. XGBoost, a state-of-the-art tree-based ensemble classifier, was used to train data from multiple sources. This classifier uses a more regularized model formalization to control over-fitting and shows very fast execution time with good performance. The validation results showed that the proposed method efficiently estimated the trip purpose. The GIS data of the station and the duration of stay at the destination were significant features in modeling trip purpose.
Keywords: activity pattern, data fusion, smart-card, XGboost
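The boosted-tree idea behind XGBoost can be sketched in miniature. The pure-Python gradient-boosting loop below fits regression stumps to logistic residuals on a single toy feature (duration of stay at the destination, which the study found significant); the data, labels, and hyperparameters are illustrative assumptions, not the study's model.

```python
# Hypothetical, minimal gradient-boosting sketch in the spirit of the study's
# XGBoost classifier, written in pure Python with one feature (duration of
# stay at the destination, in minutes) and two trip purposes (0 = commute,
# 1 = leisure). The toy data and labels are illustrative assumptions.
import math

DATA = [(15, 0), (20, 0), (30, 0), (25, 0), (180, 1), (240, 1), (150, 1), (200, 1)]

def fit_stump(xs, residuals):
    """Regression stump: best split point plus mean residual on each side."""
    best = None
    for split in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= split]
        right = [r for x, r in zip(xs, residuals) if x > split]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - (lmean if x <= split else rmean)) ** 2
                  for x, r in zip(xs, residuals))
        if best is None or err < best[0]:
            best = (err, split, lmean, rmean)
    return best[1:]

def boost(data, rounds=20, lr=0.5):
    """Gradient boosting for logistic loss: each stump fits the residuals
    y - p of the current ensemble, then is added with a learning rate."""
    xs = [x for x, _ in data]
    ys = [y for _, y in data]
    scores = [0.0] * len(data)
    stumps = []
    for _ in range(rounds):
        probs = [1 / (1 + math.exp(-s)) for s in scores]
        residuals = [y - p for y, p in zip(ys, probs)]
        split, lmean, rmean = fit_stump(xs, residuals)
        stumps.append((split, lmean, rmean))
        scores = [s + lr * (lmean if x <= split else rmean)
                  for s, x in zip(scores, xs)]

    def predict(x):
        score = sum(lr * (l if x <= split else r) for split, l, r in stumps)
        return 1 if score > 0 else 0
    return predict

predict_purpose = boost(DATA)
```

XGBoost adds regularization, second-order gradients, and deeper trees on top of this residual-fitting loop, which is what the abstract refers to as its regularized model formalization.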
Procedia PDF Downloads 246
26835 Optimizing and Evaluating Performance Quality Control of the Production Process of Disposable Essentials Using Approach Vague Goal Programming
Authors: Hadi Gholizadeh, Ali Tajdin
Abstract:
To have effective production planning, it is necessary to control the quality of processes. This paper aims at improving the performance of the disposable essentials process using statistical quality control and goal programming in a vague environment; vagueness expresses the uncertainty that arises because there is always measurement error in the real world. Therefore, in this study, the conditions are examined in a vague environment, that is, a distance-based environment. The disposable essentials process at Kach Company was studied. Statistical control tools were used to characterize the existing process for four factor responses: the average weight, height, crater diameter, and volume of the disposable glasses. Goal programming was then utilized to find the combination of optimal factor settings in a vague environment, which accounts for the uncertainty of the initial information when some of the model parameters are vague; a fuzzy regression model is also used to predict the responses of the four described factors. Optimization results show that the process capability index values for the disposable glasses' average weight, height, crater diameter, and volume were improved. This increases the quality of the products and reduces waste, which will lower the cost of the finished product and ultimately bring customer satisfaction; this satisfaction will mean increased sales.
Keywords: goal programming, quality control, vague environment, disposable glasses' optimization, fuzzy regression
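The goal-programming step can be sketched as minimizing the weighted deviations of several responses from their targets. The linear response models, targets, and weights below are illustrative assumptions, not the fitted fuzzy regression models from the study.

```python
# Hypothetical goal-programming sketch: choose a process setting that minimizes
# the weighted deviations of several quality responses (weight, height, volume)
# from their targets, the core idea behind the paper's approach. The linear
# response models, targets, and weights below are illustrative assumptions.

TARGETS = {"weight_g": 5.0, "height_mm": 95.0, "volume_ml": 220.0}
WEIGHTS = {"weight_g": 1.0, "height_mm": 0.5, "volume_ml": 0.2}

def responses(setting):
    """Illustrative linear models mapping a machine setting to the responses."""
    return {"weight_g": 4.0 + 0.5 * setting,
            "height_mm": 90.0 + 2.0 * setting,
            "volume_ml": 200.0 + 8.0 * setting}

def goal_deviation(setting):
    """Weighted sum of absolute deviations from all goals (to be minimized)."""
    r = responses(setting)
    return sum(w * abs(r[k] - TARGETS[k]) for k, w in WEIGHTS.items())

# Scan candidate settings and keep the one closest to all goals at once.
candidates = [i * 0.1 for i in range(51)]  # settings 0.0 .. 5.0
best_setting = min(candidates, key=goal_deviation)
```

No single setting hits all three targets exactly, so the optimizer trades them off according to the weights; fuzzy or vague formulations replace the crisp absolute deviations with distance-based memberships.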
Procedia PDF Downloads 223
26834 Multidimensional Modeling of Solidification Process of Multi-Crystalline Silicon under Magnetic Field for Solar Cell Technology
Authors: Mouhamadou Diop, Mohamed I. Hassan
Abstract:
Molten metallic flow in metallurgical plants is highly turbulent and presents a complex coupling with heat transfer, phase transfer, chemical reaction, momentum transport, etc. Molten silicon flow has a significant effect on the directional solidification of multicrystalline silicon by affecting the temperature field and the emerging crystallization interface, as well as the transport of species and impurities during the casting process. Owing to the complexity and limits of reliable measuring techniques, computational models of fluid flow are useful tools to study and quantify these problems. The overall objective of this study is to investigate the potential of a traveling magnetic field for efficient operating control of the molten metal flow. A multidimensional numerical model will be developed for the calculation of the Lorentz force, the molten metal flow, and the related phenomena. The numerical model is implemented in a laboratory-scale silicon crystallization furnace. The model will be used to study the effects of the magnetic force applied to the molten flow and their interdependencies. In this paper, coupled and decoupled, steady and unsteady models of the molten flow and the crystallization interface will be compared. This study will allow us to retrieve the optimal traveling magnetic field parameter range for crystallization furnaces and the optimal numerical simulation strategy for industrial application.
Keywords: multidimensional, numerical simulation, solidification, multicrystalline, traveling magnetic field
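The electromagnetic stirring the study relies on comes from the Lorentz force density f = J × B: the traveling magnetic field induces currents J in the melt, and their cross product with the flux density B yields a body force on the molten silicon. The numeric field values below are illustrative assumptions, not furnace parameters from the study.

```python
# Hypothetical sketch of the Lorentz force density f = J x B that drives the
# electromagnetic stirring discussed in the abstract: a traveling magnetic
# field induces currents J in the melt, and their cross product with the flux
# density B yields a body force. The numeric values are illustrative only.

def cross(a, b):
    """Cross product of two 3-D vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# Induced current density (A/m^2) and magnetic flux density (T) at one point.
J = (0.0, 1.0e4, 0.0)
B = (0.0, 0.0, 0.05)
f = cross(J, B)  # Lorentz force density, N/m^3
```

In the full model this force density enters the momentum equation at every grid point, which is how the traveling field steers the melt flow and, through it, the crystallization interface.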
Procedia PDF Downloads 245
26833 Review of the Model-Based Supply Chain Management Research in the Construction Industry
Authors: Aspasia Koutsokosta, Stefanos Katsavounis
Abstract:
This paper reviews the model-based qualitative and quantitative Operations Management research in the context of Construction Supply Chain Management (CSCM). The construction industry has been traditionally blamed for low productivity, cost and time overruns, waste, high fragmentation, and adversarial relationships. It has been slower than other industries to employ the Supply Chain Management (SCM) concept and to develop models that support decision-making and planning. However, over the last decade there has been a distinct shift from a project-based to a supply-based approach to construction management. CSCM has emerged as a promising management tool for construction operations and improves the performance of construction projects in terms of cost, time, and quality. Modeling the Construction Supply Chain (CSC) offers the means to reap the benefits of SCM, make informed decisions, and gain competitive advantage. Different modeling approaches and methodologies have been applied in the multi-disciplinary and heterogeneous research field of CSCM. The literature review reveals that a considerable percentage of CSC modeling accommodates conceptual or process models which discuss general management frameworks and do not relate to acknowledged soft OR methods. We particularly focus on the model-based quantitative research and categorize the CSCM models depending on their scope, mathematical formulation, structure, objectives, solution approach, software used, and decision level. Although over the last few years there has clearly been an increase in research papers on quantitative CSC models, we identify that the relevant literature is very fragmented, with limited applications of simulation, mathematical programming, and simulation-based optimization. Most applications are project-specific or study only parts of the supply system.
Thus, some complex interdependencies within construction are neglected and the implementation of integrated supply chain management is hindered. We conclude this paper by giving future research directions and emphasizing the need to develop robust mathematical optimization models for the CSC. We stress that CSC modeling needs a multi-dimensional, system-wide, and long-term perspective. Finally, prior applications of SCM to other industries have to be taken into account in order to model CSCs, but not without the consequential reform of generic concepts to match the unique characteristics of the construction industry.
Keywords: construction supply chain management, modeling, operations research, optimization, simulation
Procedia PDF Downloads 503
26832 Deliberate Learning and Practice: Enhancing Situated Learning Approach in Professional Communication Course
Authors: Susan Lee
Abstract:
Situated learning principles are adopted in the design of the module, Professional Communication, in its iteration of tasks and assignments to create a learning environment that simulates workplace reality. Situated learning succeeds when students are able to transfer and apply their skills beyond the classroom, in their personal lives and the workplace. The learning process should help students recognize the relevance of and opportunities for application. In the module's learning component on negotiation, cases are created based on scenarios inspired by industry practices. The cases simulate scenarios that students on the course may encounter when they enter the workforce and take on executive roles in the real estate sector. Engaging with the cases has enhanced students' learning experience as they apply interpersonal communication skills in the negotiation contexts of executives. Through the process of case analysis, role-playing, and peer feedback, students are placed in an experiential learning space to think and act in a deliberate manner, not only as students but as the professionals they will graduate to be. The immersive skills practice enables students to continuously and purposefully apply a range of verbal and non-verbal communication skills as they stage their negotiations. The theme in students' feedback reflects their awareness of the authentic workplace experiences offered through visceral role-playing. Students also note relevant opportunities for the future transfer of the skills acquired. This indicates that students recognize the possibility of encountering similar negotiation episodes in the real world and realize they possess the negotiation tools and communication skills to deliberately apply them when these opportunities arise outside the classroom.
Keywords: deliberate practice, interpersonal communication skills, role-play, situated learning
Procedia PDF Downloads 214