Search results for: grasshopper optimization algorithm

3485 Automatic Content Curation of Visual Heritage

Authors: Delphine Ribes Lemay, Valentine Bernasconi, André Andrade, Lara Défayes, Mathieu Salzmann, Frédéric Kaplan, Nicolas Henchoz

Abstract:

Digitization and preservation of large heritage collections induce high maintenance costs to keep up with technical standards and ensure sustainable access. Creating impactful usage is instrumental to justify the resources for long-term preservation. The Museum für Gestaltung of Zurich holds one of the biggest poster collections in the world, of which 52,000 posters were digitised. In the process of building a digital installation to valorize the collection, one objective was to develop an algorithm capable of predicting the next poster to show according to the ones already displayed. The work presented here describes the steps to build an algorithm able to automatically create sequences of posters reflecting associations performed by curators and professional designers. This challenge shares similarities with the domain of song playlist algorithms. Recently, artificial intelligence techniques, and more specifically deep-learning algorithms, have been used to facilitate their generation. Promising results were obtained with Recurrent Neural Networks (RNN) trained on manually generated playlists and paired with clusters of features extracted from songs. We used the same principles to create the proposed algorithm, but applied to a more challenging medium: posters. First, a convolutional autoencoder was trained to extract features of the posters, with the 52,000 digital posters used as the training set. The poster features were then clustered. Next, an RNN learned to predict the next cluster according to the previous ones. The RNN training set was composed of poster sequences extracted from a collection of books from the Gestaltung Museum of Zurich dedicated to displaying posters. Finally, within the predicted cluster, the poster closest to the previously displayed poster is selected; the mean squared distance between poster features is used to compute this proximity. To validate the predictive model, we compared sequences of 15 posters produced by our model to randomly and manually generated sequences. The manual sequences were created by a professional graphic designer. We asked 21 participants working as professional graphic designers to sort the sequences from the one with the strongest graphic line to the one with the weakest and to motivate their answer with a short description. The sequences produced by the designer were ranked first 60%, second 25% and third 15% of the time. The sequences produced by our predictive model were ranked first 25%, second 45% and third 30% of the time. The sequences produced randomly were ranked first 15%, second 29%, and third 55% of the time. Compared to designer sequences, and as reported by participants, model and random sequences lacked thematic continuity. According to the results, the proposed model is able to generate better poster sequencing than random sampling, and our algorithm sometimes even outperforms a professional designer. As a next step, the proposed algorithm should include a possibility to create sequences according to a selected theme. To conclude, this work shows the potential of artificial intelligence techniques to learn from existing content and provide a tool to curate large sets of data, with a permanent renewal of the presented content.
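
The final selection step lends itself to a compact sketch. The Python fragment below is a minimal illustration under assumed shapes (an autoencoder feature matrix, per-poster cluster labels, and an already-predicted next cluster; all names are hypothetical, not the authors' code): it picks the unshown poster in the predicted cluster that minimizes the mean squared distance to the previously displayed poster.

```python
import numpy as np

def next_poster(features, clusters, predicted_cluster, prev_idx, shown):
    """Pick the unshown poster in the predicted cluster closest to the last one.

    features: (n_posters, d) array from the convolutional autoencoder
    clusters: (n_posters,) cluster label per poster
    predicted_cluster: label output by the RNN for the next step
    prev_idx: index of the previously displayed poster
    shown: set of already-displayed poster indices
    """
    candidates = [i for i in np.where(clusters == predicted_cluster)[0]
                  if i not in shown]
    if not candidates:
        raise ValueError("predicted cluster has no unshown posters")
    # mean squared distance between feature vectors, as in the abstract
    msd = [np.mean((features[i] - features[prev_idx]) ** 2) for i in candidates]
    return candidates[int(np.argmin(msd))]

# toy usage with random stand-in data
rng = np.random.default_rng(0)
feats = rng.random((100, 32))
labels = rng.integers(0, 8, size=100)
print(next_poster(feats, labels, predicted_cluster=3, prev_idx=0, shown={0}))
```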

Keywords: artificial intelligence, digital humanities, serendipity, design research

Procedia PDF Downloads 184
3484 Aggregating Buyers and Sellers for E-Commerce: How Demand and Supply Meet in Fairs

Authors: Pierluigi Gallo, Francesco Randazzo, Ignazio Gallo

Abstract:

In recent years, many new and interesting models of successful online business have been developed. Many of these are based on competition between users, such as online auctions, where the product price is not fixed and tends to rise. Other models, including group buying, are based on cooperation between users and are characterized by a dynamic product price that tends to go down. There is not yet a business model in which both sellers and buyers are grouped in order to negotiate on a specific product or service. The present study investigates a new extension of the group-buying model, called the fair, which allows aggregation of demand and supply for price optimization in a cooperative manner. Additionally, our system aggregates products and destinations for shipping optimization. We introduced the following new input parameters in order to implement a double-sided aggregation: (a) price-quantity curves provided by the seller; (b) waiting time, that is, the longer buyers wait, the greater the discount they get; (c) payment time, which determines whether the buyer pays before, during or after receiving the product; (d) the distance between the place where products are available and the place of shipment, provided in advance by the buyer or dynamically suggested by the system. To analyze the proposed model, we implemented a system prototype and a simulator that allows studying the effects of changing some input parameters. We analyzed the dynamic price model in fairs having a single seller and a combination of selected sellers. The results are very encouraging and motivate further investigation on this topic.
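
A minimal sketch of how two of these inputs could interact, assuming a piecewise-linear price-quantity curve and a simple linear waiting-time discount (both the curve and the discount rate are hypothetical, not taken from the paper):

```python
import numpy as np

# Hypothetical seller price-quantity curve: unit price drops as aggregated demand grows.
quantities = np.array([1, 10, 50, 200])        # units ordered by the whole fair
unit_prices = np.array([20.0, 18.0, 15.0, 11.0])

def unit_price(total_quantity, waited_days, discount_per_day=0.002, max_discount=0.15):
    """Interpolate the price-quantity curve, then apply a waiting-time discount."""
    base = np.interp(total_quantity, quantities, unit_prices)
    discount = min(discount_per_day * waited_days, max_discount)
    return base * (1.0 - discount)

print(unit_price(total_quantity=75, waited_days=30))  # bigger groups + patience => lower price
```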

Keywords: auction, aggregation, fair, group buying, social buying

Procedia PDF Downloads 294
3483 Embedded Digital Image System

Authors: Dawei Li, Cheng Liu, Yiteng Liu

Abstract:

This paper introduces an embedded digital image system for a Chinese space-environment vertical-exploration sounding rocket. In order to record the flight status of the sounding rocket as well as the payloads, an onboard embedded image processing system based on the ADV212, a JPEG2000 compression chip, is designed in this paper. Since the sounding rocket is not designed to be recovered, all image data should be transmitted to the ground station before re-entry, while the downlink band used for image transmission is only about 600 kbps. At the same compression ratio, the JPEG2000 standard algorithm achieves better image quality than other algorithms, so JPEG2000 image compression is applied under this condition of limited downlink bandwidth. The embedded image system supports lossless to 200:1 real-time compression, with two cameras to monitor nose ejection and motor separation, and two cameras to monitor boom deployment. The video decoder, an ADV7182, receives the PAL signal from the camera and outputs an ITU-R BT.656 signal to the ADV212. The ADV7182 switches between the four input video channels according to the programmed sequence. Two SRAMs are used for ping-pong operation and one 512 Mb SDRAM for buffering high frame-rate images. The whole image system has the characteristics of low power dissipation, low cost, small size and high reliability, which makes it rather suitable for this sounding rocket application.

Keywords: ADV212, image system, JPEG2000, sounding rocket

Procedia PDF Downloads 421
3482 Ultrasound-Assisted Extraction of Carotenoids from Tangerine Peel Using Ostrich Oil as a Green Solvent and Optimization of the Process by Response Surface Methodology

Authors: Fariba Tadayon, Nika Gharahgolooyan, Ateke Tadayon, Mostafa Jafarian

Abstract:

Carotenoid pigments are a diverse group of lipophilic compounds that generate the yellow to red colors of many plants, foods and flowers. A well-known type of carotenoid, which is pro-vitamin A, is β-carotene. Due to the color of citrus fruit peel, the peel can be a good source of different carotenoids. Ostrich oil is a valuable raw material in many branches of industry, medicine, cosmetics and nutrition, and this animal-based oil can be considered an alternative, green solvent. Following this study, citrus peel waste can be recycled by a simple method, and the extracted carotenoids can enhance the properties of ostrich oil. In this work, a simple and efficient method for the extraction of carotenoids from tangerine peel was designed. Ultrasound-assisted extraction (UAE) showed a significant effect on the extraction rate by increasing the mass transfer rate. Ostrich oil can be used as a green solvent in many studies to eliminate petroleum-based solvents. Since tangerine peel is a complex source of different carotenoids, separation and determination were performed by high-performance liquid chromatography (HPLC). In addition, the ability of ostrich oil and sunflower oil to extract carotenoids from tangerine peel and carrot was compared. The highest yields of β-carotene extracted from tangerine peel using sunflower oil and ostrich oil were 75.741 and 88.110 mg/L, respectively. Optimization of the process was achieved by response surface methodology (RSM), and the optimal extraction conditions were a tangerine peel powder particle size of 0.180 mm, an ultrasonic intensity of 19 W/cm², and a sonication time of 30 minutes.
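
Response surface methodology of this kind typically fits a second-order polynomial to a designed set of experimental runs and then locates its optimum. A minimal two-factor sketch with made-up data (the design points and yields below are illustrative, not the paper's measurements):

```python
import numpy as np

# Hypothetical runs: (particle size mm, sonication min) -> beta-carotene yield (mg/L)
X = np.array([[0.125, 15], [0.125, 45], [0.250, 15], [0.250, 45],
              [0.180, 30], [0.125, 30], [0.250, 30], [0.180, 15], [0.180, 45]])
y = np.array([70.1, 74.3, 69.0, 72.2, 88.1, 78.5, 77.0, 80.2, 82.9])

# Second-order model terms: 1, x1, x2, x1^2, x2^2, x1*x2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Locate the optimum of the fitted surface on a dense grid of the factor ranges.
s, t = np.meshgrid(np.linspace(0.125, 0.25, 101), np.linspace(15, 45, 101))
pred = (coef[0] + coef[1] * s + coef[2] * t + coef[3] * s ** 2
        + coef[4] * t ** 2 + coef[5] * s * t)
i = np.unravel_index(np.argmax(pred), pred.shape)
print(f"predicted optimum: size={s[i]:.3f} mm, time={t[i]:.0f} min, yield={pred[i]:.1f} mg/L")
```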

Keywords: β-carotene, carotenoids, citrus peel, ostrich oil, response surface methodology, ultrasound-assisted extraction

Procedia PDF Downloads 316
3481 Integrating Optuna and Synthetic Data Generation for Optimized Medical Transcript Classification Using BioBERT

Authors: Sachi Nandan Mohanty, Shreya Sinha, Sweeti Sah, Shweta Sharma

Abstract:

The advancement of natural language processing has majorly influenced the field of medical transcript classification, providing a robust framework for enhancing the accuracy of clinical data processing, with enormous potential to transform healthcare and improve people's livelihoods. This research focuses on improving the accuracy of medical transcript categorization using Bidirectional Encoder Representations from Transformers (BERT) and its specialized variants, including BioBERT, ClinicalBERT, SciBERT, and BlueBERT. The experimental work employs Optuna, an optimization framework, for hyperparameter tuning to identify the most effective variant, concluding that BioBERT yields the best performance. Furthermore, various optimizers, including Adam, RMSprop, and layer-wise adaptive large batch optimization (LAMB), were evaluated alongside BERT's default AdamW optimizer. The findings show that the LAMB optimizer performs as well as AdamW. Synthetic data generation techniques from Gretel were utilized to augment the dataset, expanding it from 5,000 to 10,000 rows. Subsequent evaluations demonstrated that the model maintained its performance with synthetic data, with the LAMB optimizer showing marginally better results. The enhanced dataset and optimized model configurations improved classification accuracy, showcasing the efficacy of the BioBERT variant and the LAMB optimizer, with accuracies of up to 98.2% and 90.8% for the original and combined datasets, respectively.
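
The Optuna side of such a pipeline is compact. Below is a hedged sketch of a study that searches learning rate, batch size, and epoch count; the `fine_tune_and_score` stub stands in for actual BioBERT fine-tuning, and its name, the toy scoring surface, and the search ranges are assumptions, not the paper's code.

```python
import optuna

def fine_tune_and_score(learning_rate, batch_size, epochs):
    """Placeholder: fine-tune BioBERT on the transcripts, return validation accuracy."""
    # A real implementation would build a transformers Trainer here.
    return 0.9 - abs(learning_rate - 3e-5) * 1e3 + 0.001 * epochs  # toy surface

def objective(trial):
    lr = trial.suggest_float("learning_rate", 1e-5, 5e-5, log=True)
    bs = trial.suggest_categorical("batch_size", [8, 16, 32])
    ep = trial.suggest_int("epochs", 2, 5)
    return fine_tune_and_score(lr, bs, ep)

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=25)
print(study.best_params, study.best_value)
```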

Keywords: BioBERT, clinical data, healthcare AI, transformer models

Procedia PDF Downloads 0
3480 Non-intrusive Hand Control of Drone Using an Inexpensive and Streamlined Convolutional Neural Network Approach

Authors: Evan Lowhorn, Rocio Alba-Flores

Abstract:

The purpose of this work is to develop a method for classifying hand signals and using the output in a drone control algorithm. To achieve this, methods based on Convolutional Neural Networks (CNNs) were applied. CNNs are a subset of deep learning that allows grid-like inputs to be processed and passed through a neural network to be trained for classification. This type of neural network allows for classification via imaging, which is less intrusive than previous methods using biosensors, such as EMG sensors. Classification CNNs operate purely on the pixel values in an image; therefore, they can be used without additional exteroceptive sensors. A development bench was constructed using a desktop computer connected to a high-definition webcam mounted on a scissor arm. This allowed the camera to be pointed downwards at the desk to provide a constant solid background for the dataset and a clear detection area for the user. A MATLAB script was created to automate dataset image capture at the development bench and save the images to the desktop. This allowed the user to create their own dataset of 12,000 images within three hours. These images were evenly distributed among seven classes. The defined classes include forward, backward, left, right, idle, and land; the drone also has a popular flip function, which was included as an additional class. To simplify control, the corresponding hand signals chosen were the numerical hand signs for one through five for movements, a fist for land, and the universal "ok" sign for the flip command. Transfer learning with PyTorch (Python) was performed using a pre-trained 18-layer residual learning network (ResNet-18) to retrain the network for custom classification. An algorithm was created to interpret the classification and send encoded messages to a Ryze Tello drone over its 2.4 GHz Wi-Fi connection. The drone's movements were performed in half-meter distance increments at a constant speed. When combined with the drone control algorithm, the classification performed as desired with negligible latency when compared to the delay in the drone's movement commands.
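
The transfer-learning step follows the standard PyTorch recipe: load a pre-trained ResNet-18 and replace its final fully connected layer with a seven-class head. A minimal sketch (the hyperparameters and the random stand-in batch are assumptions, not the authors' settings):

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 7  # forward, backward, left, right, idle, land, flip

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # new classification head

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One optimization step on a batch of (N, 3, 224, 224) hand-sign images."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# smoke test with random tensors standing in for the webcam dataset
print(train_step(torch.randn(4, 3, 224, 224), torch.randint(0, NUM_CLASSES, (4,))))
```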

Keywords: classification, computer vision, convolutional neural networks, drone control

Procedia PDF Downloads 210
3479 Accelerated Structural Reliability Analysis under Earthquake-Induced Tsunamis by Advanced Stochastic Simulation

Authors: Sai Hung Cheung, Zhe Shao

Abstract:

The recent earthquake-induced tsunamis in Padang, 2004 and Tohoku, 2011 brought huge losses of life and property. Maintaining vertical evacuation systems is the most crucial strategy to effectively reduce casualties during a tsunami event. Thus, it is of great interest to quantify the risk to structural dynamic systems due to earthquake-induced tsunamis. Despite continuous advancement in computational simulation of tsunamis and wave-structure interaction modeling, it remains computationally challenging to evaluate the reliability (or its complement, the failure probability) of a structural dynamic system when uncertainties related to the system and its modeling are taken into account. The failure of the structure in a tsunami-wave-structural system is defined as any response quantity of the system exceeding specified thresholds during the time when the structure is subjected to dynamic wave impact due to earthquake-induced tsunamis. In this paper, an approach based on a novel integration of the Subset Simulation algorithm and a recently proposed moving least squares response surface approach for stochastic sampling is proposed. The effectiveness of the proposed approach is discussed by comparing its results with those obtained from the Subset Simulation algorithm without the response surface approach.
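
Subset Simulation estimates a rare failure probability as a product of larger conditional probabilities, re-seeding the sampler from each intermediate level. The sketch below is a deliberately simplified, generic version on a toy limit state (full-vector Metropolis resampling instead of the usual component-wise scheme; all numbers are illustrative and unrelated to the tsunami model):

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    """Toy response: 'failure' when the sum of standard-normal inputs is large."""
    return x.sum(axis=-1)

def subset_simulation(dim=2, y_fail=5.0, n=2000, p0=0.1, step=1.0, max_levels=20):
    x = rng.standard_normal((n, dim))
    y = g(x)
    prob = 1.0
    for _ in range(max_levels):
        n_seed = int(p0 * n)
        order = np.argsort(y)[::-1]
        y_level = min(y[order[n_seed - 1]], y_fail)   # intermediate threshold
        prob *= np.mean(y >= y_level)                 # conditional level probability
        if y_level >= y_fail:                         # reached the true failure level
            return prob
        seeds = x[order[:n_seed]]
        samples = []
        for s in seeds:                               # short Metropolis chain per seed,
            cur = s.copy()                            # conditioned on staying above the level
            for _ in range(n // n_seed):
                cand = cur + step * rng.standard_normal(dim)
                accept = np.exp(0.5 * (cur @ cur - cand @ cand))  # N(0, I) prior ratio
                if rng.random() < min(1.0, accept) and g(cand) >= y_level:
                    cur = cand
                samples.append(cur.copy())
        x = np.array(samples[:n])
        y = g(x)
    return prob

est = subset_simulation()
# Exact reference for the toy case: P(N(0,1)+N(0,1) >= 5) = 1 - Phi(5/sqrt(2)) ~ 2e-4
print(f"estimated failure probability: {est:.2e}")
```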

Keywords: response surface model, subset simulation, structural reliability, Tsunami risk

Procedia PDF Downloads 383
3478 Service Flow in Multilayer Networks: A Method for Evaluating the Layout of Urban Medical Resources

Authors: Guanglin Song

Abstract:

(Objective) Situated within the context of China's tiered medical treatment system, this study aims to analyze the spatial causes of urban healthcare access difficulties from the perspective of the configuration of healthcare facilities. (Methods) A social network analysis approach is employed to construct a healthcare demand and supply flow network between major residential clusters and the various tiers of hospitals in the city. (Conclusion) The findings reveal that: (1) there exists an overall maldistribution and over-concentration of healthcare resources in the study area, characterized by structural imbalance; (2) the low rate of primary care utilization in the study area is a key factor contributing to congestion at higher-tier hospitals, as excessive reliance on these institutions by neighboring communities exacerbates the problem; (3) gradual optimization of the healthcare facility layout in the study area, encompassing the holistic, local, and individual institutional levels, can enhance systemic efficiency and resource balance. (Prospects) This research proposes a method for evaluating urban healthcare resource distribution structures based on service flows within hierarchical networks. It offers spatially targeted optimization suggestions for promoting the implementation of the tiered healthcare system and alleviating challenges related to accessibility and congestion in seeking medical care, and it provides new ideas for researchers and healthcare managers in countries and cities around the world facing similar challenges.
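
A demand-supply flow network of this kind is straightforward to assemble with standard network tooling. A hedged sketch using networkx (the clusters, hospital tiers, and flow weights are invented placeholders, not the study's data):

```python
import networkx as nx

G = nx.DiGraph()

# Hypothetical patient flows (weights) from residential clusters to hospital tiers.
flows = [
    ("cluster_A", "tertiary_1", 900), ("cluster_A", "primary_1", 100),
    ("cluster_B", "tertiary_1", 700), ("cluster_B", "primary_2", 250),
    ("cluster_C", "tertiary_1", 800), ("cluster_C", "secondary_1", 300),
]
for src, dst, w in flows:
    G.add_edge(src, dst, weight=w)

# Weighted in-degree flags over-concentration: hospitals absorbing outsized demand.
load = dict(G.in_degree(weight="weight"))
print(max(load, key=load.get), load)  # tertiary_1 dominates -> structural imbalance
```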

Keywords: flow of public services, urban networks, healthcare facilities, spatial planning

Procedia PDF Downloads 68
3477 Modeling Continuous Flow in a Curved Channel Using Smoothed Particle Hydrodynamics

Authors: Indri Mahadiraka Rumamby, R. R. Dwinanti Rika Marthanty, Jessica Sjah

Abstract:

Smoothed particle hydrodynamics (SPH) was originally created to simulate nonaxisymmetric phenomena in astrophysics. However, this method still has several shortcomings, namely the high computational cost required to model values at high resolution and problems with boundary conditions. The difficulty of modeling boundary conditions occurs because the SPH method is affected by particle deficiency, as the integral of the kernel function is truncated at the boundaries. This research aims to answer whether SPH modeling, with a focus on boundary layer interactions and continuous flow, can produce quantifiably accurate values at low computational cost. This research combines algorithms and code for the meandering-river main program, the continuous-flow algorithm, and the solid-fluid algorithm, with the aim of obtaining quantitatively accurate results on solid-fluid interactions with continuous flow in a meandering channel using the SPH method. This study uses the Fortran programming language to implement the SPH numerical method; the model takes the form of a U-shaped meandering open channel in 3D, where the channel walls are soil particles, and uses a continuous flow with a limited number of particles.
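
At the heart of any SPH code is the smoothing kernel, and its truncation near walls is exactly the particle-deficiency problem mentioned above. A short sketch of the standard cubic spline kernel (Monaghan's form with the 3D normalization); the study itself uses Fortran, so this Python version is only illustrative:

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Cubic spline smoothing kernel W(r, h) in 3D; compact support of radius 2h."""
    sigma = 1.0 / (np.pi * h ** 3)          # 3D normalization constant
    q = np.asarray(r) / h
    w = np.where(q <= 1.0, 1.0 - 1.5 * q ** 2 + 0.75 * q ** 3,
         np.where(q <= 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
    return sigma * w

# Near a boundary, part of the support sphere lies outside the fluid, so summing
# the kernel over neighbours under-estimates field values (particle deficiency).
r = np.linspace(0.0, 2.5, 6)
print(cubic_spline_kernel(r, h=1.0))
```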

Keywords: smoothed particle hydrodynamics, computational fluid dynamics, numerical simulation, fluid mechanics

Procedia PDF Downloads 132
3476 Remote Assessment and Change Detection of Green LAI of Cotton Crop Using Different Vegetation Indices

Authors: Ganesh B. Shinde, Vijaya B. Musande

Abstract:

Timely identification of the cotton crop has significant advantages for food, economic, and environmental planning. Despite these advantages, the accurate detection of cotton crop regions using a supervised learning procedure is a challenging problem in remote sensing. Classifiers applied directly to the image have played a major role here, but their results are not very satisfactory. To further improve effectiveness, a variety of vegetation indices have been proposed in the literature; the challenge is then to find the vegetation indices best suited to cotton crop identification with the proposed methodology. Accordingly, fuzzy c-means clustering is combined with a neural network trained by the Levenberg-Marquardt algorithm for cotton crop classification. To evaluate the proposed method, five LISS-III satellite images were used, and the experimentation was done with six vegetation indices: Simple Ratio, Normalized Difference Vegetation Index, Enhanced Vegetation Index, Green Atmospherically Resistant Vegetation Index, Wide-Dynamic Range Vegetation Index, and Green Chlorophyll Index. Along with these indices, the Green Leaf Area Index was also considered for investigation. From the research outcome, the Green Atmospherically Resistant Vegetation Index outperformed all other indices, reaching an average accuracy of 95.21%.
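
Two of the indices compared are quick to compute from band arrays. A sketch with numpy, where the band arrays are placeholders and GARI is written in its usual Gitelson et al. form, which should be checked against the paper's exact definition:

```python
import numpy as np

eps = 1e-9  # guard against division by zero on water/shadow pixels

def ndvi(nir, red):
    return (nir - red) / (nir + red + eps)

def gari(nir, green, blue, red, gamma=1.0):
    """Green Atmospherically Resistant Vegetation Index (Gitelson et al. form)."""
    corr = green - gamma * (blue - red)     # atmospherically corrected green band
    return (nir - corr) / (nir + corr + eps)

# toy 2x2 reflectance "bands" standing in for satellite rasters
nir = np.array([[0.5, 0.6], [0.4, 0.7]]); red = np.array([[0.1, 0.2], [0.1, 0.3]])
green = np.array([[0.2, 0.25], [0.15, 0.3]]); blue = np.array([[0.1, 0.12], [0.08, 0.15]])
print(ndvi(nir, red))
print(gari(nir, green, blue, red))
```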

Keywords: Fuzzy C-Means clustering (FCM), neural network, Levenberg-Marquardt (LM) algorithm, vegetation indices

Procedia PDF Downloads 318
3475 Evolving Credit Scoring Models using Genetic Programming and Language Integrated Query Expression Trees

Authors: Alexandru-Ion Marinescu

Abstract:

There exists a plethora of methods in the scientific literature which tackle the well-established task of credit score evaluation. In its most abstract form, a credit scoring algorithm takes as input several credit applicant properties, such as age, marital status, employment status, loan duration, etc., and must output a binary response variable (i.e., "GOOD" or "BAD") stating whether the client is susceptible to payment return delays. Data imbalance is a common occurrence among financial institution databases, with the majority classified as "GOOD" clients (clients that respect the loan return calendar) alongside a small percentage of "BAD" clients. But it is the "BAD" clients we are interested in, since accurately predicting their behavior is crucial in preventing unwanted loss for loan providers. We add to this whole context the constraint that the algorithm must yield an actual, tractable mathematical formula, which is friendlier towards financial analysts. To this end, we have turned to genetic algorithms and genetic programming, aiming to evolve actual mathematical expressions using specially tailored mutation and crossover operators. As far as data representation is concerned, we employ a very flexible mechanism, LINQ expression trees, readily available in the C# programming language, enabling us to construct executable pieces of code at runtime. As the title implies, they model trees, with intermediate nodes being operators (addition, subtraction, multiplication, division) or mathematical functions (sin, cos, abs, round, etc.) and leaf nodes storing either constants or variables. There is a one-to-one correspondence between the client properties and the formula variables. The mutation and crossover operators work on a flattened version of the tree, obtained via a pre-order traversal. A consequence of our chosen technique is that we can identify and discard client properties which do not take part in the final score evaluation, effectively acting as a dimensionality reduction scheme. We compare ourselves with state-of-the-art approaches, such as support vector machines, Bayesian networks, and extreme learning machines, to name a few. The data sets we benchmark against amount to a total of eight, among which we mention the well-known Australian credit and German credit data sets, and the performance indicators are the following: percentage correctly classified, area under curve, partial Gini index, H-measure, Brier score and Kolmogorov-Smirnov statistic. Finally, we obtain encouraging results, which, although placing us in the lower half of the hierarchy, drive us to further refine the algorithm.
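
The tree representation and subtree crossover translate to any language; the paper uses C# LINQ expression trees, but a minimal Python analogue conveys the mechanics (the node encoding, operator set, and parameters below are illustrative only, not the authors' implementation):

```python
import copy
import random

random.seed(3)
N_VARS = 4  # one variable per client property

def random_tree(depth):
    """Grow a random expression tree: [op, left, right] or a leaf."""
    if depth == 0 or random.random() < 0.3:
        return ["var", random.randrange(N_VARS)] if random.random() < 0.5 \
               else ["const", random.uniform(-1.0, 1.0)]
    return [random.choice(["add", "sub", "mul", "div"]),
            random_tree(depth - 1), random_tree(depth - 1)]

def evaluate(n, x):
    if n[0] == "var":
        return x[n[1]]
    if n[0] == "const":
        return n[1]
    a, b = evaluate(n[1], x), evaluate(n[2], x)
    return {"add": a + b, "sub": a - b, "mul": a * b,
            "div": a / b if abs(b) > 1e-9 else 1.0}[n[0]]  # protected division

def flatten(n, slots, parent=None, idx=None):
    """Pre-order traversal collecting (parent, child-index) slots, as in the paper."""
    slots.append((parent, idx, n))
    if n[0] not in ("var", "const"):
        flatten(n[1], slots, n, 1)
        flatten(n[2], slots, n, 2)

def crossover(p1, p2):
    """Swap random subtrees between deep copies of two parents."""
    c1, c2 = copy.deepcopy(p1), copy.deepcopy(p2)
    s1, s2 = [], []
    flatten(c1, s1); flatten(c2, s2)
    a, b = random.choice(s1[1:] or s1), random.choice(s2[1:] or s2)
    if a[0] is not None and b[0] is not None:
        a[0][a[1]], b[0][b[1]] = b[2], a[2]
    return c1, c2

o1, o2 = crossover(random_tree(3), random_tree(3))
print(evaluate(o1, [0.5, 1.2, -0.3, 0.9]))
```

Variables that never appear in the best-evolved tree correspond to client properties that can be discarded, which is the dimensionality-reduction effect the abstract describes.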

Keywords: expression trees, financial credit scoring, genetic algorithm, genetic programming, symbolic evolution

Procedia PDF Downloads 117
3474 Aerodynamic Analysis by Computational Fluid Dynamics in Building: Case Study

Authors: Javier Navarro Garcia, Narciso Vazquez Carretero

Abstract:

Eurocode 1, part 1-4, wind actions, includes in its article 1.5 the possibility of using numerical calculation methods to obtain information on the loads acting on a building. On the other hand, analysis using computational fluid dynamics (CFD) in aerospace, aeronautical, and industrial applications is already in widespread use. The application of techniques based on CFD analysis to the building, to study its aerodynamic behavior, now opens a whole alternative field of possibilities for civil engineering and architecture: optimization of the results with respect to those obtained by applying the regulations, the possibility of obtaining information on pressures and speeds at any point of the model for each moment, the analysis of turbulence, and the possibility of modeling any geometry or configuration. The present work compares the results obtained on a building, with respect to its aerodynamic behavior, from a mathematical model based on CFD analysis with the results obtained by applying Eurocode 1, part 1-4, wind actions. It is verified that the results obtained by CFD techniques represent an optimization of the wind action acting on the building with respect to the wind action obtained by applying Eurocode 1, part 1-4, wind actions. To carry out this verification, a 45 m high truncated-pyramid building with a square base has been taken. The CFD mathematical model, based on finite volumes, has been calculated with the commercial application FLUENT, using a scale-resolving simulation (SRS), large eddy simulation (LES) turbulence model for an atmospheric boundary layer wind with a turbulent component in the direction of the flow.

Keywords: aerodynamic, CFD, computational fluid dynamics, computational mechanics

Procedia PDF Downloads 137
3473 Microwave Heating and Catalytic Activity of Iron/Carbon Materials for H₂ Production from the Decomposition of Plastic Wastes

Authors: Peng Zhang, Cai Liang

Abstract:

Non-biodegradable plastic wastes have caused severe environmental and ecological contamination. Numerous technologies, such as pyrolysis, incineration, and landfilling, have already been employed for the treatment of plastic waste. Compared with conventional methods, microwave heating has displayed unique advantages for the rapid production of hydrogen from plastic wastes. Understanding the interaction between microwave radiation and materials would promote the optimization of several parameters of the microwave reaction system. In this work, various carbon materials have been investigated to reveal their microwave heating performance and the ensuing catalytic activity. Results showed that the diversity in heating characteristics was mainly due to the dielectric properties and the individual microstructures. Furthermore, the gaps and steps on the surface of the carbon materials lead to distortion of the electromagnetic field, which correspondingly induces plasma discharging. The intensity and location of the local plasma were also studied. For high-yield H₂ production, iron nanoparticles were selected as the active sites, and a series of iron/carbon bifunctional catalysts were synthesized. Apart from their high catalytic activity, the iron particles, with a size close to the microwave skin depth, convert microwave irradiation into heat, intensifying the decomposition of the plastics. Under microwave radiation, iron supported on activated carbon at 10 wt.% loading exhibited the best catalytic activity for H₂ production. Specifically, the plastics were rapidly heated up and converted into H₂ with a hydrogen efficiency of 85%. This work provides a deeper understanding of microwave reaction systems and an optimization route for plastic treatment.

Keywords: plastic waste, recycling, hydrogen, microwave

Procedia PDF Downloads 71
3472 Sintering of YNbO3:Eu3+ Compound: Correlation between Luminescence and Spark Plasma Sintering Effect

Authors: Veronique Jubera, Ka-Young Kim, U-Chan Chung, Amelie Veillere, Jean-Marc Heintz

Abstract:

Emitting materials and all-solid-state lasers are widely used in the field of optical applications and materials science, as excitation sources and for instrumental measurements, medical applications, metal shaping, etc. Recently, promising optical efficiencies were recorded on ceramics, which offer a cheaper and faster way to obtain crystallized materials. The choice and optimization of the sintering process are the key points in fabricating transparent ceramics. This includes tight control over the preparation of the powder, with the choice of an adequate synthesis and pre-heat-treatment, the reproducibility of the sintering cycle, and the polishing and post-annealing of the ceramic. Densification is the main factor needed to reach satisfying transparency, and many technologies are now available. The symmetry of the unit cell plays a crucial role in the diffusion rate of the material; therefore, cubic-symmetry compounds, which have an isotropic refractive index, are preferred. The cubic Y3NbO7 matrix is an interesting host which can accept a high concentration of rare earth doping elements, and it has been demonstrated that SPS is an efficient way to sinter this material. The minimization of diffusion losses requires a fine ceramic microstructure, generally with grains of less than one hundred nanometers; in this case, grain growth is not an obstacle to transparency. The ceramic properties are then isotropic, which frees the shaping step from having to orient the ceramics, as is necessary for compounds of lower symmetry. After optimization of the synthesis route, several SPS parameters, such as heating rate, holding, dwell time and pressure, were adjusted in order to increase the densification of the Eu3+ doped Y3NbO7 pellets. The luminescence data, coupled with X-ray diffraction analysis and electron diffraction microscopy, highlight the existence of several distorted environments of the doping element in the studied defective fluorite-type host lattice. Indeed, the fast and high crystallization rate obtained puts in evidence a lack of miscibility in the phase diagram, the final composition of the pellet being driven by the ratio between the niobium and yttrium elements. By following the luminescence properties, we demonstrate a direct impact of the SPS process on this material.

Keywords: emission, niobate of rare earth, Spark plasma sintering, lack of miscibility

Procedia PDF Downloads 268
3471 Lip Localization Technique for Myanmar Consonants Recognition Based on Lip Movements

Authors: Thein Thein, Kalyar Myo San

Abstract:

Lip reading systems are one of the supportive technologies for hearing-impaired people, elderly people, and non-native speakers. For normal-hearing persons in noisy environments, or in conditions where the audio signal is not available, lip reading techniques can be used to increase the understanding of spoken language. Hearing-impaired persons have used lip reading techniques as important tools to find out what was said by other people without hearing their voice. Thus, visual speech information is important and has become an active research area. Using visual information from lip movements can improve the accuracy and robustness of a speech recognition system, and the need for lip reading systems is ever increasing for every language. However, the recognition of lip movement is a difficult task because the region of interest (ROI) is nonlinear and noisy. Therefore, this paper proposes a method to detect the accurate lip shape and to localize lip movement towards automatic lip tracking, using a combination of Otsu's global thresholding technique and the Moore neighborhood tracing algorithm. The proposed method achieves accurate lip localization and tracking, which is useful for speech recognition. In this work, experiments are carried out on automatically localizing the lip shape for Myanmar consonants, using only the visual information from lip movements, which is useful for visual speech processing of the Myanmar language.
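
Both building blocks are easy to prototype. The sketch below applies OpenCV's Otsu threshold to a synthetic stand-in for a mouth-region crop, then runs a simplified Moore-neighbor boundary trace; a production version would add Jacob's stopping criterion and select the lip region rather than the first foreground pixel.

```python
import cv2
import numpy as np

# 8 Moore neighbours in clockwise order: N, NE, E, SE, S, SW, W, NW
DIRS = [(-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1)]

def moore_trace(mask, max_steps=100000):
    """Simplified Moore-neighbour boundary tracing on a binary mask."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return []
    start = (int(ys[0]), int(xs[0]))  # first foreground pixel in raster order
    boundary = [start]
    p, d = start, 2                   # raster order guarantees we "entered" moving east
    for _ in range(max_steps):
        found = False
        for k in range(8):            # clockwise scan starting just past the backtrack
            j = (d + 5 + k) % 8
            q = (p[0] + DIRS[j][0], p[1] + DIRS[j][1])
            if 0 <= q[0] < mask.shape[0] and 0 <= q[1] < mask.shape[1] and mask[q]:
                p, d, found = q, j, True
                break
        if not found or p == start:   # isolated pixel, or the contour closed
            break
        boundary.append(p)
    return boundary

frame = np.zeros((64, 64), np.uint8)
cv2.ellipse(frame, (32, 32), (20, 10), 0, 0, 360, 200, -1)  # bright lip-like blob
_, binary = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contour = moore_trace(binary > 0)
print(len(contour), contour[:5])
```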

Keywords: lip reading, lip localization, lip tracking, Moore neighborhood tracing algorithm

Procedia PDF Downloads 352
3470 Optimizing Recycling and Reuse Strategies for Circular Construction Materials with Life Cycle Assessment

Authors: Zhongnan Ye, Xiaoyi Liu, Shu-Chien Hsu

Abstract:

Rapid urbanization has led to a significant increase in construction and demolition waste (C&D waste), underscoring the need for sustainable waste management strategies in the construction industry. Aiming to enhance the sustainability of urban construction practices, this study develops an optimization model to suggest optimal recycling and reuse strategies for C&D waste, including concrete and steel. By employing Life Cycle Assessment (LCA), the model evaluates the environmental impacts of the adopted construction materials throughout their lifecycle. The model optimizes the quantity of materials to recycle or reuse, the selection of specific recycling and reuse processes, and logistics decisions related to the transportation and storage of recycled materials, with the objective of minimizing the overall environmental impact, quantified in terms of carbon emissions, energy consumption, and associated costs, while adhering to a range of constraints. These constraints include capacity limitations, quality standards for recycled materials, compliance with environmental regulations, budgetary limits, and temporal considerations such as project deadlines and material availability. The strategies are expected to be both cost-effective and environmentally beneficial, promoting a circular economy within the construction sector, aligning with global sustainability goals, and providing a scalable framework for managing construction waste in densely populated urban environments. The model helps reduce the carbon footprint of construction projects, conserve valuable resources, and support the industry's transition towards a more sustainable future.
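
A stripped-down instance of such a model can be posed as a linear program. The sketch below, with invented LCA carbon factors, waste arisings, and plant capacity, minimizes total carbon over recycle-versus-landfill decisions for concrete and steel:

```python
from scipy.optimize import linprog

# Decision variables (tonnes): [concrete_recycled, concrete_landfilled,
#                               steel_recycled,    steel_landfilled]
# Hypothetical LCA carbon factors (kg CO2e per tonne processed).
c = [12.0, 35.0, 80.0, 150.0]

# Mass balance: recycled + landfilled must equal the waste arising per material.
A_eq = [[1, 1, 0, 0],
        [0, 0, 1, 1]]
b_eq = [500.0, 120.0]          # tonnes of concrete and steel C&D waste

# Recycling plant capacity limits the total recycled tonnage.
A_ub = [[1, 0, 1, 0]]
b_ub = [400.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4)
print(res.x, res.fun)          # optimal tonnages and minimum total carbon
```

The full model in the paper layers quality, regulatory, budget, and scheduling constraints on top of this skeleton, but the structure (material balances plus capacity bounds under an impact-minimizing objective) is the same.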

Keywords: circular construction, construction and demolition waste, material recycling, optimization modeling

Procedia PDF Downloads 57
3469 Isolation, Characterization and Optimization of Alkalophilic and Thermotolerant Lipase from Bacillus subtilis Strain

Authors: Indu Bhushan Sharma, Rashmi Saraswat

Abstract:

The thermotolerant, solvent stable and alkalophilic lipase producing bacterial strain was isolated from the water sample of the foothills of Trikuta Mountain in Kakryal (Reasi district) in Jammu and Kashmir, India. The lipase-producing microorganisms were screened using tributyrin agar plates. The selected microbe was optimized for maximum lipase production by subjecting to various carbon and nitrogen sources, incubation period and inoculum size. The selected strain was identified as Bacillus subtilis strain kakrayal_1 (BSK_1) using 16S rRNA sequence analysis. Effect of pH, temperature, metal ions, detergents and organic solvents were studied on lipase activity. Lipase was found to be stable over a pH range of 6.0 to 9.0 and exhibited maximum activity at pH 8. Lipolytic activity was highest at 37°C and the enzyme activity remained at 60°C for 24hrs, hence, established as thermo-tolerant. Production of lipase was significantly induced by vegetable oil and the best nitrogen source was found to be peptone. The isolated Bacillus lipase was stimulated by pre-treatment with Mn2+, Ca2+, K+, Zn2+, and Fe2+. Lipase was stable in detergents such as triton X 100, tween 20 and Tween 80. The 100% ethyl acetate enhanced lipase activity whereas, lipase activity were found to be stable in Hexane. The optimization resulted in 4 fold increase in lipase production. Bacillus lipases are ‘generally recognized as safe’ (GRAS) and are industrially interesting. The inducible alkaline, thermo-tolerant lipase exhibited the ability to be stable in detergents and organic solvents. This could be further researched as a potential biocatalyst for industrial applications such as biotransformation, detergent formulation, bioremediation and organic synthesis.

Keywords: bacillus, lipase, thermotolerant, alkalophilic

Procedia PDF Downloads 255
3468 Delineation of Green Infrastructure Buffer Areas with a Simulated Annealing: Consideration of Ecosystem Services Trade-Offs in the Objective Function

Authors: Andres Manuel Garcia Lamparte, Rocio Losada Iglesias, Marcos Boullón Magan, David Miranda Barros

Abstract:

The biodiversity strategy of the European Union for 2030 mentions climate change as one of the key factors in biodiversity loss and considers green infrastructure one of the solutions to this problem. In this line, the European Commission has developed a green infrastructure strategy which commits member states to consider green infrastructure in their territorial planning. This green infrastructure is aimed at granting the provision of a wide number of ecosystem services to support biodiversity and human well-being by countering the effects of climate change. Yet, there are not many tools available to delimit green infrastructure. The available ones consider the potential of the territory to provide ecosystem services. However, these methods usually aggregate several maps of ecosystem service potential without considering possible trade-offs, which can lead to excluding areas with a high potential for providing ecosystem services that have many trade-offs with other ecosystem services. In order to tackle this problem, a methodology is proposed to consider ecosystem service trade-offs in the objective function of a simulated annealing algorithm aimed at delimiting green infrastructure multifunctional buffer areas. To this end, the provision potential maps of the regulating ecosystem services considered to delimit the multifunctional buffer areas are clustered into groups, so that ecosystem services that create trade-offs are excluded within each group. The normalized provision potential maps of the ecosystem services in each group are added to obtain a potential map per group, which is normalized again. The potential maps for each group are then combined into a raster map that shows the highest provision potential value in each cell. The combined map is then used in the objective function of the simulated annealing algorithm. The algorithm is run both using the proposed methodology and considering the ecosystem services individually. The results are analyzed with spatial statistics and landscape metrics to check the number of ecosystem services that the delimited areas produce, as well as their regularity and compactness. It has been observed that the proposed methodology increases the number of ecosystem services produced by the delimited areas, improving their multifunctionality and increasing their effectiveness in preventing climate change impacts.
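
The core loop of such a delimitation can be sketched in a few dozen lines. Below, two hypothetical trade-off-free group maps are combined by taking the per-cell maximum, and simulated annealing selects a fixed number of cells balancing provision potential against a crude compactness bonus; the maps, weights, and cooling schedule are all illustrative, not the paper's settings.

```python
import math
import random
import numpy as np

random.seed(1)
SIZE, N_CELLS = 50, 60
rng = np.random.default_rng(42)

# Hypothetical normalized provision-potential maps for two groups of
# ecosystem services with no internal trade-offs.
group_a, group_b = rng.random((SIZE, SIZE)), rng.random((SIZE, SIZE))
combined = np.maximum(group_a, group_b)      # highest potential per cell

def objective(cells):
    sel = set(cells)
    potential = sum(combined[c] for c in cells)
    adjacency = sum(((r + 1, c) in sel) + ((r, c + 1) in sel) for r, c in cells)
    return potential + 0.1 * adjacency       # compactness bonus

def anneal(steps=20000, t=1.0, cooling=0.9995):
    grid = [(r, c) for r in range(SIZE) for c in range(SIZE)]
    cells = random.sample(grid, N_CELLS)
    member, f = set(cells), objective(cells)
    best, best_f = list(cells), f
    for _ in range(steps):
        i, new = random.randrange(N_CELLS), random.choice(grid)
        if new in member:
            continue
        old = cells[i]
        cells[i] = new; member.add(new); member.discard(old)
        f_new = objective(cells)
        if f_new >= f or random.random() < math.exp((f_new - f) / t):
            f = f_new
            if f > best_f:
                best, best_f = list(cells), f
        else:                                # reject: undo the move
            cells[i] = old; member.discard(new); member.add(old)
        t *= cooling
    return best, best_f

cells, score = anneal()
print(f"selected {len(cells)} cells, objective = {score:.2f}")
```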

Keywords: ecosystem services trade-offs, green infrastructure delineation, multifunctional buffer areas, climate change

Procedia PDF Downloads 174
3467 Real-Time Inventory Management and Operational Efficiency in Manufacturing

Authors: Tom Wanyama

Abstract:

We have developed a weight-based parts inventory monitoring system utilizing the Industrial Internet of Things (IIoT) to enhance operational efficiencies in manufacturing. The system addresses various challenges, including eliminating downtimes caused by stock-outs, preventing human errors in parts delivery and product assembly, and minimizing motion waste by reducing unnecessary worker movements. The system incorporates custom QR codes for simplified inventory tracking and retrieval processes. The generated data serves a dual purpose by enabling real-time optimization of parts flow within manufacturing facilities and facilitating retroactive optimization of stock levels for informed decision-making in inventory management. The pilot implementation at SEPT Learning Factory successfully eradicated data entry errors, optimized parts delivery, and minimized workstation downtimes, resulting in a remarkable increase of over 10% in overall equipment efficiency across all workstations. Leveraging the IIoT features, the system seamlessly integrates information into the process control system, contributing to the enhancement of product quality. This approach underscores the importance of effective tracking of parts inventory in manufacturing to achieve transparency, improved inventory control, and overall profitability. In the broader context, our inventory monitoring system aligns with the evolving focus on optimizing supply chains and maintaining well-managed warehouses to ensure maximum efficiency in the manufacturing industry.
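
The core of weight-based monitoring is converting a load-cell reading into a part count. A tiny hedged sketch of that conversion (the tare, unit weight, and tolerance values are invented, not the pilot system's calibration):

```python
def part_count(gross_weight_g, tare_g, unit_weight_g, tolerance=0.25):
    """Estimate how many parts sit in a bin from a load-cell reading."""
    net = gross_weight_g - tare_g
    count = round(net / unit_weight_g)
    # Flag readings too far from an integer multiple of the unit weight,
    # e.g. a foreign object in the bin or a part from the wrong QR-coded bin.
    if abs(net - count * unit_weight_g) > tolerance * unit_weight_g:
        raise ValueError("reading inconsistent with unit weight")
    return max(count, 0)

print(part_count(gross_weight_g=1530.0, tare_g=250.0, unit_weight_g=12.8))  # -> 100
```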

Keywords: industrial Internet of things, industrial systems integration, inventory monitoring, inventory control in manufacturing

Procedia PDF Downloads 36
3466 Large Core Silica Few-Mode Optical Fibers with Reduced Differential Mode Delay and Enhanced Mode Effective Area over 'C'-Band

Authors: Anton V. Bourdine, Vladimir A. Burdin, Oleg R. Delmukhametov

Abstract:

This work presents a fast and simple method for the design of large-core silica optical fibers with differential mode delay (DMD) management. Results are reported concerning refractive index profile optimization for a 42 µm core 16-LP-mode optical fiber for next-generation optical networks. Here, a special refractive index profile form provides a reduction of the total DMD over the whole set of modes while achieving the desired enhanced mode effective area. A method is proposed for simulating the 'real manufactured' few-mode optical fiber (FMF) core geometry, which differs from the desired optimized structure by non-symmetrical core ellipticity and refractive index profile deviations, including local fluctuations. Analysis of the optimized FMF with inserted geometry distortions, performed with a previously developed modification of a rigorous mixed finite-element method, showed strong DMD degradation that requires additional higher-order mode management. In addition, this work presents a method for designing a precision spatial positioning scheme for mode division multiplexer channels at the FMF core end, which provides one potential solution to the described DMD degradation problem caused by 'distorted' core geometry arising from optical fiber manufacturing techniques.

Keywords: differential mode delay, few-mode optical fibers, nonlinear Shannon limit, optical fiber non-circularity, ‘real manufactured’ optical fiber core geometry simulation, refractive index profile optimization

Procedia PDF Downloads 157
3465 The Benefits of End-To-End Integrated Planning from the Mine to Client Supply for Minimizing Penalties

Authors: G. Martino, F. Silva, E. Marchal

Abstract:

Control over the characteristics of the delivered iron ore blend is one of the most important aspects of the mining business. The iron ore price is a function of its composition, which is the outcome of the beneficiation process, so end-to-end integrated planning of mine operations can reduce the risk of penalties on the iron ore price. In a standard iron mining company, the production chain is composed of mining, ore beneficiation, and client supply. When mine planning and client supply decisions are made in an uncoordinated way, the beneficiation plant struggles to deliver the best possible blend. Technological improvements in several fields have allowed bridging the gap between departments and boosting integrated decision-making processes. Clusterization and classification algorithms over historical production data generate reasonable predictions for the quality and volume of iron ore produced for each pile of run-of-mine (ROM) processed. Mathematical modeling can use those deterministic relations to propose iron ore blends that better fit specifications within a delivery schedule. Additionally, a model capable of representing the whole production chain can clearly compare the overall impact of different decisions in the process. This study shows how flexibilization combined with a planning optimization model between the mine and the ore beneficiation processes can reduce the risk of out-of-specification deliveries. The model's capabilities are illustrated on a hypothetical iron ore mine with a magnetic separation process. Finally, this study shows ways of reducing cost or increasing profit by optimizing process indicators across the production chain and integrating the different plannings with the sales decisions.
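
The clusterization step can be illustrated with scikit-learn: group historical ROM piles by assay, then use the cluster mean as a deterministic quality forecast for a new pile feeding the blending model. The assay features and figures below are invented placeholders:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical historical records: one row per processed ROM pile, with
# post-beneficiation assays [Fe %, SiO2 %, P %].
rng = np.random.default_rng(7)
history = np.vstack([
    rng.normal([65.0, 3.0, 0.04], [0.8, 0.4, 0.01], size=(40, 3)),
    rng.normal([62.0, 6.0, 0.08], [0.8, 0.5, 0.01], size=(40, 3)),
])

# Cluster past piles into quality groups; a new pile is assigned to the nearest
# group, whose mean assay serves as the forecast used by the blending optimizer.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(history)
new_pile = np.array([[64.2, 3.8, 0.05]])
group = model.predict(new_pile)[0]
forecast = history[model.labels_ == group].mean(axis=0)
print(f"pile assigned to group {group}, forecast [Fe, SiO2, P] = {forecast}")
```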

Keywords: clusterization and classification algorithms, integrated planning, mathematical modeling, optimization, penalty minimization

Procedia PDF Downloads 123
3464 The Optimization of the Parameters for Eco-Friendly Leaching of Precious Metals from Waste Catalyst

Authors: Silindile Gumede, Amir Hossein Mohammadi, Mbuyu Germain Ntunka

Abstract:

Goal 12 of the 17 Sustainable Development Goals (SDGs) encourages sustainable consumption and production patterns. This necessitates achieving the environmentally safe management of chemicals and all wastes throughout their life cycle and the proper disposal of pollutants and toxic waste. Fluid catalytic cracking (FCC) catalysts are widely used in refineries to convert heavy feedstocks to lighter ones. During the refining processes, the catalysts are deactivated and discarded as hazardous toxic solid waste. Spent catalysts (SCs) contain high-cost metals, and the recovery of metals from SCs is a tactical plan for supplying part of the demand for these substances and minimizing the environmental impacts. Leaching followed by solvent extraction has been found to be the most efficient method to recover valuable metals with high purity from spent catalysts. However, the use of inorganic acids during the leaching process causes a secondary environmental issue; therefore, it is necessary to explore alternative leaching agents that are efficient, economical, and environmentally friendly. In this study, the waste catalyst was collected from a domestic refinery and was characterised using XRD, ICP, XRF, and SEM. Response surface methodology (RSM) and a Box-Behnken design were used to model and optimize the influence of the parameters affecting the acidic leaching process. The parameters selected in this investigation were the acid concentration, temperature, and leaching time. From the characterisation results, it was found that the spent catalyst contains high concentrations of vanadium (V) and nickel (Ni); hence, this study focuses on the leaching of Ni and V using a biodegradable acid to eliminate the formation of secondary pollution.

Keywords: eco-friendly leaching, optimization, metal recovery, leaching

Procedia PDF Downloads 68
3463 Random Forest Classification for Population Segmentation

Authors: Regina Chua

Abstract:

To reduce the costs of re-fielding a large survey, a Random Forest classifier was applied to measure the accuracy of classifying individuals into their assigned segments with the fewest possible questions. Given a long survey, one needed to determine the most predictive ten or fewer questions that would accurately assign new individuals to custom segments. Furthermore, the solution needed to be quick in its classification and usable in non-Python environments. In this paper, a supervised Random Forest classifier was modeled on a dataset with 7,000 individuals, 60 questions, and 254 features. The Random Forest consisted of an ensemble of individual decision trees whose combined prediction attains robust precision and recall scores compared to a single tree. A random 70-30 stratified sampling was used for training the algorithm, and accuracy trade-offs at different depths for each segment were identified. Ultimately, the Random Forest classifier performed at 87% accuracy at a depth of 10, with 20 instead of 254 features and 10 instead of 60 questions. With an acceptable accuracy in prioritizing feature selection, new tools were developed for non-Python environments: a worksheet with a formulaic version of the algorithm and an embedded function to predict the segment of an individual in real time. Random Forest was determined to be an optimal classification model by its feature selection, performance, processing speed, and flexible application in other environments.
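
The workflow (stratified split, fit, prune to the most important features, refit) maps directly onto scikit-learn. A sketch on synthetic stand-in data with the study's shapes; the hyperparameters are assumptions, not the paper's:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the survey: 7,000 respondents, 254 features, 5 segments.
X, y = make_classification(n_samples=7000, n_features=254, n_informative=30,
                           n_classes=5, random_state=0)

# 70-30 stratified split, as in the study.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y,
                                          random_state=0)

rf = RandomForestClassifier(n_estimators=300, max_depth=10, random_state=0)
rf.fit(X_tr, y_tr)
print("full-feature accuracy:", accuracy_score(y_te, rf.predict(X_te)))

# Keep only the 20 most important features and refit.
top20 = np.argsort(rf.feature_importances_)[::-1][:20]
rf20 = RandomForestClassifier(n_estimators=300, max_depth=10, random_state=0)
rf20.fit(X_tr[:, top20], y_tr)
print("20-feature accuracy:", accuracy_score(y_te, rf20.predict(X_te[:, top20])))
```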

Keywords: machine learning, supervised learning, data science, random forest, classification, prediction, predictive modeling

Procedia PDF Downloads 94
3462 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms

Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao

Abstract:

Earth's environment and its evolution can be observed through satellite images in near real time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format; the data are then pre-processed, fed into the proposed algorithm, and the obtained result is analyzed. Some of the algorithms used in satellite imagery classification are U-Net, Random Forest, DeepLabv3, CNN, ANN, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification, with the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates, in cascade or in parallel, to determine the scale of segments.
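
Running a DeepLabv3 segmentation head over a satellite tile takes only a few lines with torchvision. A hedged sketch (the seven-class count follows the DeepGlobe land cover description of urban, agriculture, rangeland, forest, water, barren, and unknown; the random tensor stands in for a normalized tile):

```python
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

NUM_CLASSES = 7  # DeepGlobe land cover classes (assumed from the dataset description)

model = deeplabv3_resnet50(weights=None, num_classes=NUM_CLASSES)
model.eval()

# One 3-channel satellite tile, resized to 512x512 and normalized upstream.
tile = torch.randn(1, 3, 512, 512)
with torch.no_grad():
    logits = model(tile)["out"]     # (1, NUM_CLASSES, 512, 512)
pred = logits.argmax(dim=1)         # per-pixel land cover class map
print(pred.shape, pred.unique())
```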

Keywords: area calculation, atrous convolution, DeepGlobe land cover classification, DeepLabv3, land cover classification, ResNet-50

Procedia PDF Downloads 140
3461 Open-Loop Vector Control of Induction Motor with Space Vector Pulse Width Modulation Technique

Authors: Karchung, S. Ruangsinchaiwanich

Abstract:

This paper presents an open-loop vector control method for an induction motor with the space vector pulse width modulation (SVPWM) technique. Normally, closed-loop speed control is preferred and is believed to be more accurate. However, it requires a position sensor to track the rotor position, which is not desirable for certain workspace applications. This paper exhibits the performance of a three-phase induction motor with the simplest control algorithm, without the use of a position sensor or an estimation block to estimate the rotor position for sensorless control. The motor stator currents are measured and transformed to the synchronously rotating (d-q-axis) frame by use of the Clarke and Park transformations. The actual control happens in this frame, where the measured currents are compared with the reference currents. The error signal is fed to a conventional PI controller, and the corrected d-q voltage is generated. The controller outputs are transformed back to three-phase voltages and are fed to the SVPWM block, which generates the PWM signal for the voltage source inverter. The open-loop vector control model, along with the SVPWM algorithm, is modeled in MATLAB/Simulink and is experimentally validated on a TMS320F28335 DSP board.
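
The Clarke and Park transformations at the heart of this scheme are a few lines each. A Python sketch of the amplitude-invariant forms (the control itself runs in Simulink/DSP code; this shows only the coordinate math, and the 10 A test currents are arbitrary):

```python
import math

def clarke(ia, ib, ic):
    """Amplitude-invariant Clarke transform: three phase currents -> alpha/beta."""
    alpha = (2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    beta = (ib - ic) / math.sqrt(3.0)
    return alpha, beta

def park(alpha, beta, theta):
    """Park transform: stationary alpha/beta frame -> rotating d-q frame at theta."""
    d = alpha * math.cos(theta) + beta * math.sin(theta)
    q = -alpha * math.sin(theta) + beta * math.cos(theta)
    return d, q

# Balanced 10 A sinusoidal currents at angle theta map to constant d-q values.
theta = 0.7
ia = 10 * math.cos(theta)
ib = 10 * math.cos(theta - 2 * math.pi / 3)
ic = 10 * math.cos(theta + 2 * math.pi / 3)
print(park(*clarke(ia, ib, ic), theta))  # ~ (10.0, 0.0)
```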

Keywords: electric drive, induction motor, open-loop vector control, space vector pulse width modulation technique

Procedia PDF Downloads 147
3460 Study on Acoustic Source Detection Performance Improvement of Microphone Array Installed on Drones Using Blind Source Separation

Authors: Youngsun Moon, Yeong-Ju Go, Jong-Soo Choi

Abstract:

Most drones that currently fly surveillance/reconnaissance missions are equipped with optical equipment, but a microphone array can also be used to estimate the location of an acoustic source, providing additional information in the absence of optical equipment. The purpose of this study is to estimate the Direction of Arrival (DOA) of an acoustic source, based on Time Difference of Arrival (TDOA) estimation, from the drone. The problem is that it is impossible to measure the target acoustic source clearly because of the drone noise. The way to overcome this problem is to separate the drone noise and the target acoustic source using Blind Source Separation (BSS) based on Independent Component Analysis (ICA). ICA can be performed assuming that the drone noise and the target acoustic source are independent and that each signal is non-Gaussian. To maximize the non-Gaussianity of each signal, we use negentropy and kurtosis, based on probability theory. As a result, we can improve the TDOA and DOA estimation of the target source in the noisy environment. We simulated the performance of the DOA algorithm with the BSS algorithm applied and validated the simulation through experiments in an anechoic wind tunnel.
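
The separation-then-TDOA chain can be prototyped with scikit-learn's FastICA (which maximizes non-Gaussianity, negentropy-based by default) and a cross-correlation lag estimate. The two synthetic sources and mixing gains below are invented stand-ins for rotor noise and a target:

```python
import numpy as np
from sklearn.decomposition import FastICA

fs = 48_000
t = np.arange(fs) / fs
rng = np.random.default_rng(0)

# Two hypothetical non-Gaussian sources: broadband rotor noise and a tonal target.
rotor = rng.laplace(size=fs)
target = np.sign(np.sin(2 * np.pi * 440 * t))

# Two microphones record different linear mixtures of the sources.
mixtures = np.c_[0.9 * rotor + 0.4 * target,
                 0.5 * rotor + 0.8 * target]

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(mixtures)   # columns ~ sources, up to scale and order

def tdoa(x, y, fs):
    """Time difference of arrival from the peak of the cross-correlation."""
    corr = np.correlate(x, y, mode="full")
    return (np.argmax(corr) - (len(y) - 1)) / fs

# Demo: delay one copy of a separated channel by 20 samples and recover the lag.
s = recovered[:, 1]
print(tdoa(np.roll(s, 20), s, fs) * fs)   # ~ 20 samples
# DOA then follows from theta = arcsin(c * tau / d) for mic spacing d, sound speed c.
```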

Keywords: aeroacoustics, acoustic source detection, time difference of arrival, direction of arrival, blind source separation, independent component analysis, drone

Procedia PDF Downloads 162
3459 Obtaining High-Dimensional Configuration Space for Robotic Systems Operating in a Common Environment

Authors: U. Yerlikaya, R. T. Balkan

Abstract:

In this research, a method is developed to obtain a high-dimensional configuration space for path planning problems. In typical cases, path planning problems are solved directly in the 3-dimensional (3D) workspace. However, this is inefficient in handling robots with various geometrical and mechanical restrictions. To overcome these difficulties, path planning may be formalized and solved in a new space called the configuration space. The number of dimensions of the configuration space equals the number of degrees of freedom of the system of interest. The method can be applied in two ways. In the first, the point clouds of all the bodies of the system and their interactions are used. The second is performed using the clearance function of simulation software, where the minimum distances between the surfaces of bodies are measured simultaneously. A double-turret system is considered in the scope of this study, and the 4D configuration space of the double-turret system is obtained in these two ways. As a result, the difference between the two methods is around 1%, depending on the density of the point cloud, and the disparity steadily decreases as the point cloud density increases. At the end of the study, in order to verify the 4D configuration space obtained, the 4D path planning problem was realized as 2D + 2D, and a sample path planning was carried out using the A* algorithm. The accuracy of the configuration space was then proven using the obtained paths on the simulation model of the double-turret system.
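
The clearance-function route amounts to sampling a 4D joint grid and marking each configuration as free or colliding. In the sketch below, the toy `clearance` function is a stand-in for the simulator's minimum-distance query between the two turret bodies, and the grid resolution and joint ranges are illustrative:

```python
import itertools
import numpy as np

# Hypothetical clearance (mm) between the two turret bodies for a pair of
# (azimuth, elevation) angles per turret; in practice this comes from
# point-cloud distance queries or the simulator's clearance tool.
def clearance(az1, el1, az2, el2):
    return 50.0 - 40.0 * np.exp(-((az1 - az2) ** 2 + (el1 + el2 - 90) ** 2) / 500.0)

res = 16                                             # grid resolution per joint
axes = [np.linspace(-180, 180, res), np.linspace(0, 90, res)] * 2
cspace = np.zeros((res, res, res, res), dtype=bool)  # True = collision-free

for i, j, k, l in itertools.product(range(res), repeat=4):
    q = (axes[0][i], axes[1][j], axes[2][k], axes[3][l])
    cspace[i, j, k, l] = clearance(*q) > 0.0         # free iff positive clearance

print("free fraction:", cspace.mean())
# A 4D A* (or a 2D + 2D decomposition, as in the paper) can then search cspace.
```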

Keywords: A* algorithm, autonomous turrets, high-dimensional C-space, manifold C-space, point clouds

Procedia PDF Downloads 139
3458 Building Biodiversity Conservation Plans Robust to Human Land Use Uncertainty

Authors: Yingxiao Ye, Christopher Doehring, Angelos Georghiou, Hugh Robinson, Phebe Vayanos

Abstract:

Human development is a threat to biodiversity, and conservation organizations (COs) are purchasing land to protect areas for biodiversity preservation. However, COs have limited budgets and thus face hard prioritization decisions that are confounded by uncertainty in future human land use. This research proposes a data-driven sequential planning model to help COs choose land parcels that minimize the uncertain human impact on biodiversity. The proposed model is robust to uncertain development, and the sequential decision-making process is adaptive, allowing land purchase decisions to adapt to human land use as it unfolds. A cellular automata model is leveraged to simulate land use development based on climate data, land characteristics, and the development threat index from the NASA Socioeconomic Data and Applications Center; this simulation is used to model uncertainty in the problem. This research leverages state-of-the-art techniques from the robust optimization literature to propose a computationally tractable reformulation of the model, which can be solved routinely by off-the-shelf solvers like Gurobi or CPLEX. Numerical results based on real data on the jaguar in Central and South America show that the proposed method reduces conservation loss by 19.46% on average compared to standard approaches such as MARXAN, used in practice for biodiversity conservation. Our method may thus better guide the decision process in land acquisition and thereby allow conservation organizations to maximize the impact of limited resources.

Keywords: data-driven robust optimization, biodiversity conservation, uncertainty simulation, adaptive sequential planning

Procedia PDF Downloads 210
3457 The Optimization of TICSI in the Convergence Mechanism of Urban Water Management

Authors: M. Macchiaroli, L. Dolores, V. Pellecchia

Abstract:

With the recent Resolution n. 580/2019/R/idr, the Italian Regulatory Authority for Energy, Networks, and Environment (ARERA) for Urban Water Management has introduced, for water managements characterized by persistent critical issues regarding the planning and organization of the service and the implementation of the interventions necessary for improving infrastructure and management quality, a new mechanism for determining tariffs: the regulatory scheme of Convergence. The aim of this regulatory scheme is to overcome the Water Service Divide, in order to improve the stability of local institutional structures, technical quality and contractual quality, as well as to guarantee transparency for the users of the service. The Convergence scheme presupposes the identification of the cost items to be considered in the tariff in parametric terms, distinguishing three possible cases according to the type of historical data available to the manager. The study focuses, in particular, on operations that have neither data on tariff revenues nor data on operating costs. In this case, the manager's Constraint on Revenues (VRG) is estimated on the basis of a reference benchmark and becomes the starting point for defining the structure of the tariff classes, in compliance with the TICSI provisions (Integrated Text for tariff classes, ARERA Resolution n. 665/2017/R/idr). The proposed model implements recent studies on optimization models for the definition of tariff classes in compliance with the constraints dictated by TICSI in the application of the Convergence mechanism, offering a support tool for managers and the local water regulatory authority in the decision-making process.

Keywords: decision-making process, economic evaluation of projects, optimizing tools, urban water management, water tariff

Procedia PDF Downloads 119
3456 Supercomputer Simulation of Magnetic Multilayers Films

Authors: Vitalii Yu. Kapitan, Aleksandr V. Perzhu, Konstantin V. Nefedev

Abstract:

The necessity of studying magnetic multilayer structures is explained by the prospects of their practical application as a technological base for creating new storage media. Magnetic multilayer films have many unique features that contribute to increasing the density of information recording and the speed of storage devices. Multilayer structures are structures of alternating magnetic and nonmagnetic layers. In the frame of the classical Heisenberg model, lattice spin systems with direct short- and long-range exchange interactions were investigated by Monte Carlo methods. The thermodynamic characteristics of multilayer structures, such as the temperature behavior of magnetization, energy, and heat capacity, were investigated, as were the processes of magnetization reversal of multilayer structures in external magnetic fields. The developed software is based on the new, promising programming language Rust, an experimental language developed by Mozilla and positioned as an alternative to C and C++. For the Monte Carlo simulation, the Metropolis algorithm and its parallel implementation using MPI, as well as the Wang-Landau algorithm, were used. We are planning to study magnetic multilayer films with asymmetric Dzyaloshinskii-Moriya (DM) interaction, interface effects, and skyrmion textures. This work was supported by the state task of the Ministry of Education and Science of Russia # 3.7383.2017/8.9.
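
The Metropolis update for classical Heisenberg spins is short enough to sketch. The authors' code is in Rust; the Python fragment below shows the same single-site move on a 2D layer with nearest-neighbor exchange only (the lattice size, J, and T values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
L, J, T = 16, 1.0, 1.0                  # lattice size, exchange constant, temperature

def random_unit_vectors(n):
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

spins = random_unit_vectors(L * L).reshape(L, L, 3)   # Heisenberg spins: unit 3-vectors

def site_field(s, i, j):
    """Sum of the four nearest-neighbour spins (periodic boundaries)."""
    return (s[(i + 1) % L, j] + s[(i - 1) % L, j]
            + s[i, (j + 1) % L] + s[i, (j - 1) % L])

def metropolis_sweep(s):
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        new = random_unit_vectors(1)[0]
        h = site_field(s, i, j)
        dE = -J * np.dot(new - s[i, j], h)   # energy change of the single-site move
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j] = new

for _ in range(100):
    metropolis_sweep(spins)
print("magnetization:", np.linalg.norm(spins.mean(axis=(0, 1))))
```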

Keywords: Monte Carlo methods, Heisenberg model, multilayer structures, magnetic skyrmion

Procedia PDF Downloads 166