Search results for: cuckoo search optimization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4778


1418 Particle Size Analysis of Itagunmodi Southwestern Nigeria Alluvial Gold Ore Sample by Gaudin Schumann Method

Authors: Olaniyi Awe, Adelana R. Adetunji, Abraham Adeleke

Abstract:

Mining of alluvial gold ore by artisanal miners has been going on for decades at Itagunmodi, Southwestern Nigeria. In order to optimize the traditional panning gravity separation method commonly used in the area, a mineral particle size analysis study is critical. This study analyzed alluvial gold ore samples collected at five identified locations in the area with a view to determining the ore particle size distributions. A measured 500 g of as-received alluvial gold ore sample was introduced into the uppermost sieve of an electrical sieve shaker consisting of sieves arranged in order of decreasing nominal apertures of 5600 μm, 3350 μm, 2800 μm, 355 μm, 250 μm, 125 μm and 90 μm, and operated for 20 minutes. The amount of material retained on each sieve was measured and tabulated for analysis. A screen analysis graph using the Gaudin-Schumann method was drawn for each of the screen tests on the alluvial samples. The study showed that the percentages of the fine particle size -125+90 μm fraction were 45.00%, 36.00%, 39.60%, 43.00% and 36.80% for the selected samples. These primary ore characteristic results provide reference data for alluvial gold ore processing method selection, process performance measurement and optimization.
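As a hedged illustration (not the authors' procedure, and with hypothetical retained masses), sieve data of the kind described above can be turned into a Gaudin-Schumann (Gates-Gaudin-Schuhmann) plot by computing cumulative percent passing and fitting P(x) = 100·(x/k)^m on a log-log scale:

```python
# Hedged sketch: hypothetical sieve masses, not the study's measurements.
import numpy as np

apertures_um = np.array([5600, 3350, 2800, 355, 250, 125, 90], dtype=float)  # sieve apertures, coarse to fine
retained_g = np.array([20.0, 35.0, 60.0, 90.0, 70.0, 50.0, 100.0])           # hypothetical masses retained (g)
pan_g = 75.0                                                                  # hypothetical mass passing 90 um

total = retained_g.sum() + pan_g
cum_passing = 100.0 * (total - np.cumsum(retained_g)) / total   # cumulative % passing each aperture

# Gates-Gaudin-Schuhmann: log10(P) = m*log10(x) + c, so fit a line in log-log space
m, c = np.polyfit(np.log10(apertures_um), np.log10(cum_passing), 1)
k = 10 ** ((2.0 - c) / m)   # size modulus: the size at which P = 100%

for x, p in zip(apertures_um, cum_passing):
    print(f"passing {x:6.0f} um: {p:5.1f} %")
print(f"distribution modulus m = {m:.2f}, size modulus k = {k:.0f} um")
```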

Keywords: alluvial gold ore, sieve shaker, particle size, Gaudin Schumann

Procedia PDF Downloads 29
1417 Intrusion Detection in Computer Networks Using a Hybrid Model of Firefly and Differential Evolution Algorithms

Authors: Mohammad Besharatloo

Abstract:

Intrusion detection is an important research topic in network security because of the growing use of computer network services. Intrusion detection aims to detect unauthorized use or abuse of networks and systems by intruders; an intrusion detection system is therefore an efficient tool to control users' access through predefined regulations. Since the data used in intrusion detection systems are high-dimensional, a proper representation is required to expose the basic structure of the data, and redundant features must be eliminated to create the best representative subset. In the proposed method, a hybrid model of the differential evolution and firefly algorithms is employed to choose the best subset of features. In addition, a decision tree and a support vector machine (SVM) are adopted to assess the quality of the selected features. First, the sorted population is divided into two sub-populations, and the two optimization algorithms are applied to these sub-populations, respectively. The sub-populations are then merged to form the population of the next iteration. The performance of the proposed method is evaluated on the KDD Cup99 dataset. The simulation results show that the proposed method outperforms the other methods in this context.
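A minimal sketch of the hybrid scheme described above, assuming a toy fitness function in place of the paper's SVM/decision-tree evaluation: a population of binary feature masks is split, one half is updated with a differential-evolution-style step, the other with a firefly-style attraction step, and the halves are merged for the next iteration.

```python
# Hedged sketch: the fitness function is a stand-in, not the paper's classifier-based evaluation.
import numpy as np

rng = np.random.default_rng(0)
n_features, pop_size, n_iters = 41, 20, 30                              # KDD Cup99 records have 41 features
informative = set(rng.choice(n_features, 10, replace=False).tolist())   # hypothetical "useful" features

def fitness(mask):
    # stand-in for the SVM / decision-tree evaluation of a feature subset
    chosen = set(np.flatnonzero(mask).tolist())
    return len(chosen & informative) - 0.2 * len(chosen)

pop = rng.integers(0, 2, size=(pop_size, n_features))
for _ in range(n_iters):
    scores = np.array([fitness(ind) for ind in pop])
    order = np.argsort(-scores)
    best = pop[order[0]]
    half_a, half_b = pop[order[: pop_size // 2]], pop[order[pop_size // 2:]]

    # DE-style step on one sub-population: combine each mask with two random peers, then binarize
    r1, r2 = rng.permutation(len(half_a)), rng.permutation(len(half_a))
    donor = np.clip(half_a + 0.8 * (half_a[r1] - half_a[r2]), 0, 1)
    half_a = (rng.random(half_a.shape) < donor).astype(int)

    # firefly-style step on the other sub-population: dimmer masks copy bits from the brightest
    flip = (half_b != best) & (rng.random(half_b.shape) < 0.3)
    half_b = np.where(flip, best, half_b)

    pop = np.vstack([half_a, half_b])            # merge into the next iteration's population

best_mask = max(pop, key=fitness)
print("selected features:", np.flatnonzero(best_mask).tolist())
```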

Keywords: intrusion detection system, differential evolution, firefly algorithm, support vector machine, decision tree

Procedia PDF Downloads 71
1416 Drippers Scaling Inhibition of the Localized Irrigation System by Green Inhibitors Based on Plant Extracts

Authors: Driouiche Ali, Karmal Ilham

Abstract:

The Agadir region is characterized by a dry climate, ranging from arid, attenuated by oceanic influences, to hyper-arid. The water mobilized in the agricultural sector of greater Agadir is 95% of underground origin and comes from the Chtouka water table; the rest comes from the surface waters of the Youssef Ben Tachfine dam. These waters are intended for the irrigation of 26,880 hectares of modern agriculture. More than 120 boreholes and wells are currently exploited; their depth varies between 10 m and 200 m, and the unit flow rates of the boreholes are 5 to 50 l/s. A drop in the level of the water table of about 1.5 m/year, on average, has been observed during the last five years. Farmers are thus called upon to improve irrigation methods, and localized (drip) irrigation is adopted to allow rational use of water. The importance of this irrigation system lies in the fact that water is applied directly to the root zone and in its compatibility with fertilization. However, this irrigation system faces a thorny problem, the clogging of pipes and drippers, which leads to a lack of uniformity of irrigation over time. This so-called scaling phenomenon, whose consequences are harmful (cleaning or replacement of pipes), leads to considerable unproductive expenditure. The objective of this work is the search for green inhibitors likely to prevent this scaling phenomenon. This study requires a better knowledge of these waters, their physico-chemical characteristics and their scaling power. Thus, using the "LCGE" controlled degassing technique, we first evaluated, on pure calco-carbonic water at 30°F, the scale-inhibiting power of some plant extracts available in our region of Souss-Massa. We then carried out a comparative study of the efficacy of these green inhibitors. Finally, the action of the most effective green inhibitor on real agricultural waters was studied.

Keywords: green inhibitors, localized irrigation, plant extracts, scaling inhibition

Procedia PDF Downloads 70
1415 Conception of a Regulated, Dynamic and Intelligent Sewerage in Ostrevent

Authors: Rabaa Tlili Yaakoubi, Hind Nakouri, Olivier Blanpain

Abstract:

Current tools for real-time management of sewer systems are based on two software components: weather forecasting software and hydraulic simulation software. The former is an important source of imprecision and uncertainty, while the latter imposes long decision time steps because of its computation time. As a result, the obtained results generally differ from those expected. The central idea of the CARDIO project is to change the basic paradigm by approaching the problem from the automatic-control side rather than the hydrological side. The objective is to make it possible to run a large number of simulations in a very short time (a few seconds), replacing weather forecasts by directly using real-time measured rainfall data. The aim is to reach a system where decisions are made from reliable data and where errors are corrected continuously. A first set of control laws was designed and tested with rainfalls of different return periods; the gains obtained in discharged volume vary from 40 to 100%. A new algorithm was then developed to optimize calculation time and thus overcome the combinatorial problem encountered in the first approach. Finally, this new algorithm was tested with a 16-year rainfall series. The obtained gains are 60% in the total volume discharged to the natural environment and 80% in the number of discharge events.
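For illustration only (this is not the CARDIO control law), a minimal feedback rule for a single storage basin shows the kind of real-time decision the abstract refers to; all parameters and inflow values are hypothetical.

```python
# Minimal illustrative sketch: a proportional rule that sets basin outflow from measured inflow
# so as to limit spills to the natural environment. All values are hypothetical.
capacity_m3, max_outflow_m3s, dt_s = 5000.0, 2.0, 300.0
volume, spilled = 0.0, 0.0

measured_inflow_m3s = [0.5, 1.5, 3.0, 4.0, 3.5, 2.0, 1.0, 0.5]   # hypothetical real-time rainfall-driven inflow

for q_in in measured_inflow_m3s:
    # release faster as the basin fills, capped by the gate capacity
    q_out = min(max_outflow_m3s, q_in + 0.5 * (volume / capacity_m3) * max_outflow_m3s)
    volume += (q_in - q_out) * dt_s
    if volume > capacity_m3:          # anything above capacity overflows
        spilled += volume - capacity_m3
        volume = capacity_m3
    volume = max(volume, 0.0)

print(f"stored volume: {volume:.0f} m3, spilled volume: {spilled:.0f} m3")
```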

Keywords: RTC, paradigm, optimization, automation

Procedia PDF Downloads 270
1414 Benefits of Occupational Therapy for Children with Intellectual Disabilities in the Aspects of Vocational Activities and Instrumental Activities of Daily Life

Authors: Shakhawath Hossain, Tazkia Tahsin

Abstract:

Introduction/Background: Intellectual disability is characterized by significant limitations both in intellectual functioning and in adaptive behavior, which covers many everyday social and practical skills. Vocational education is a multi-professional approach provided to individuals of working age with health-related impairments, limitations, or restrictions in work functioning, whose primary aim is to optimize work participation. Instrumental activities of daily living are activities that support daily life within the home and community, such as community mobility, financial management, meal preparation and clean-up, and shopping. Material and Method: Electronic searches of the Medline, PubMed, Google Scholar and OTseeker literature were carried out using the key terms intellectual disability, vocational rehabilitation, instrumental activities of daily living and occupational therapy, together with a thorough manual search for relevant literature. Results: Thirteen articles, both qualitative and quantitative and all mixed methods in design, are included in this review. After receiving occupational therapy services, children showed significant improvement in various areas such as sensory issues, cognitive abilities, perceptual skills, visual skills, motor planning, and group participation. After vocational and instrumental activities of daily living training, children with intellectual disabilities were able to participate in their daily activities and to work as employees in different companies or organizations. Conclusion: Persons with intellectual disability are an integral part of our society and deserve social support and opportunities like other human beings. The reviewed papers indicate significant benefits of occupational therapy services in the aspects of vocational and instrumental activities of daily living.

Keywords: occupational therapy, daily living activities, intellectual disabilities, instrumental ADL

Procedia PDF Downloads 119
1413 PointNetLK-OBB: A Point Cloud Registration Algorithm with High Accuracy

Authors: Wenhao Lan, Ning Li, Qiang Tong

Abstract:

To improve the registration accuracy of a source point cloud and a template point cloud when the initial relative deflection angle is large, a PointNetLK algorithm combined with an oriented bounding box (PointNetLK-OBB) is proposed. In this algorithm, the OBB of a 3D point cloud is used to represent the macro feature of the source and template point clouds. Under the guidance of the iterative closest point algorithm, the OBBs of the source and template point clouds are aligned, and a mirror symmetry effect is produced between them. According to the fitting degree of the source and template point clouds, the mirror symmetry plane is detected, and the optimal rotation and translation of the source point cloud are obtained to complete the 3D point cloud registration task. To verify the effectiveness of the proposed algorithm, a comparative experiment was performed using the publicly available ModelNet40 dataset. The experimental results demonstrate that, compared with PointNetLK, PointNetLK-OBB improves the registration accuracy of the source and template point clouds when the initial relative deflection angle is large, and the sensitivity to the initial relative position between the source and template point clouds is reduced. The primary contributions of this paper are the use of PointNetLK to avoid the non-convex problem of traditional point cloud registration and the use of the regularity of the OBB to avoid the local optimization problem in the PointNetLK context.
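A hedged sketch of one building block named above, the oriented bounding box: a PCA-based OBB of a point cloud, computed here for a synthetic cloud (this is not the authors' implementation).

```python
# Hedged sketch: PCA-based OBB of a synthetic point cloud, the kind of macro feature
# PointNetLK-OBB uses to pre-align source and template clouds before fine registration.
import numpy as np

def oriented_bounding_box(points):
    """Return (center, axes, half_extents) of a PCA-based OBB for an (N, 3) array."""
    center = points.mean(axis=0)
    centered = points - center
    _, _, vt = np.linalg.svd(centered, full_matrices=False)  # principal axes of the cloud
    axes = vt                         # rows are the OBB axes
    local = centered @ axes.T         # coordinates in the OBB frame
    half_extents = (local.max(axis=0) - local.min(axis=0)) / 2.0
    return center, axes, half_extents

rng = np.random.default_rng(1)
source = rng.normal(size=(500, 3)) * np.array([3.0, 1.0, 0.3])   # hypothetical elongated cloud
c, R, h = oriented_bounding_box(source)
print("OBB half extents:", np.round(h, 2))
```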

Keywords: mirror symmetry, oriented bounding box, point cloud registration, PointNetLK-OBB

Procedia PDF Downloads 137
1412 Nectariferous Plant Genetic Resources for Apicultural Entrepreneurship in Nigeria: Prerequisite for Conservation, Sustainable Management and Policy

Authors: C. V. Nnamani, O. L. Adedeji

Abstract:

The contemporary global economic meltdown has had a devastating effect on Nigeria's economy, and the frantic search for alternative sources of national revenue aside from oil and gas has become imperative for the economic emancipation of Nigerians. Apicultural entrepreneurship could provide a source of livelihood if basic knowledge of the plant genetic resources needed by bees is made available. A palynological evaluation of the palynotaxa which honey bees forage for pollen and nectar was carried out following the standard acetolysis method. Results showed that the honey samples were highly diversified and rich in honey plants. A total of 9544.3 honey pollen grains, representing 39 honey plants belonging to 21 plant families and distributed within 38 genera, were identified, excluding 238 unidentified pollen grains. The analysis also revealed that Elaeis guineensis Jacq., Anacardium occidentale L., Diospyros mespiliformis Hochst. ex A.DC., Alchornea cordifolia Muell. Arg., Daniella oliveri (Rolfe) Hutch. & Dalz., Irvingia wombolu Okafor ex Baill., Treculia africana Decne., Nauclea latifolia Smith and Crossopteryx febrifuga Afzel. ex Benth. were the predominant honey plants. The study provides a guide to the optimal utilization of floral resources by honeybees in these regions and shows the opportunity and remarkable potential of these palynotaxa for apicultural entrepreneurship. Most of these plants are rare, threatened or endangered, which calls for urgent conservation measures by all players, together with critical awareness creation to ensure that farmers understand these palynotaxa and benefit from them as a source of economic empowerment.

Keywords: palynotaxa, acetolysis, enterprise, livelihood, Nigeria

Procedia PDF Downloads 279
1411 Optimization of Energy Harvesting Systems for RFID Applications

Authors: P. Chambe, B. Canova, A. Balabanian, M. Pele, N. Coeur

Abstract:

To avoid battery-assisted tags with limited battery lifetime, it is proposed here to replace batteries by energy harvesting systems able to draw power from the local environment. This would give total independence to RFID systems, which is very interesting for applications where removing the tag from its location is not possible. The example described here concerns luggage safety in airports, and is easily extendable to similar situations in terms of operating constraints. The idea is to fit the RFID tag with an energy harvesting system not only to identify luggage but also to supply an embedded microcontroller connected to a sensor measuring luggage weight, making it impossible to add or remove anything from the luggage during transit phases. The aim is to optimize the harvested energy for such RFID applications and to study within which limits these applications are theoretically possible. The proposed energy harvester is based on two energy sources, piezoelectricity and electromagnetic waves, so that when the luggage is moving on ground transportation to airline counters, the piezoelectric module supplies the tag and its microcontroller, while the RF module operates during luggage transit thanks to readers located along the way. The tag location on the luggage is analyzed to capture the strongest vibrations, and the best harvester choice for optimizing the energy supply is studied depending on the application and the amount of energy harvested over a period of time. The effects of system parameters (RFID UHF frequencies, the limit distance between the tag and the antenna necessary to harvest energy, the produced voltage and the voltage threshold) are discussed, and working conditions for such a system are delimited.
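As a hedged back-of-the-envelope example (values assumed, not taken from the study), a Friis-equation power budget indicates how received RF power, and hence harvestable DC power, falls with reader-to-tag distance:

```python
# Hedged sketch: free-space Friis budget for a UHF RFID harvester; all values are assumptions.
import math

P_tx_dBm = 30.0        # reader EIRP (assumed)
G_tag_dBi = 2.0        # tag antenna gain (assumed)
freq_hz = 868e6        # European UHF RFID band
eta = 0.5              # assumed RF-to-DC conversion efficiency
P_threshold_dBm = -10  # assumed minimum received power to wake the harvester/microcontroller

wavelength = 3e8 / freq_hz
for d in [1, 2, 4, 6, 8, 10]:
    path_loss_dB = 20 * math.log10(4 * math.pi * d / wavelength)     # free-space path loss
    P_rx_dBm = P_tx_dBm + G_tag_dBi - path_loss_dB
    P_dc_mW = eta * 10 ** (P_rx_dBm / 10)
    ok = "harvestable" if P_rx_dBm >= P_threshold_dBm else "below threshold"
    print(f"d = {d:2d} m  ->  received {P_rx_dBm:5.1f} dBm, DC ~ {P_dc_mW:.3f} mW  ({ok})")
```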

Keywords: RFID tag, energy harvesting, piezoelectric, EM waves

Procedia PDF Downloads 437
1410 Decision Making in Medicine and Treatment Strategies

Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi

Abstract:

Three reasons argue for the use of decision theory in medicine: 1. The growth and complexity of medical knowledge make it difficult to process treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying treatment opportunities in very large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, these variations raise doubts about the relevance of the choices made by physicians. The differences are generally attributed to differing estimates of the probabilities of success of the treatments involved and to differing assessments of the value of success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making; for this, the decision process should be explained and broken down. A decision problem consists of selecting the best option among a set of choices. The difficulty lies in what is meant by "best option", that is, which criteria guide the choice; the purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand differences in medical practice and facilitates the search for consensus. Three types of situations arise: 1. In certain situations, the consequence of each decision is certain. 2. In risky situations, every decision can have several consequences, and the probability of each consequence is known. 3. In uncertain situations, each decision can have several consequences whose probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians. Decision theory can make decisions more transparent: first, by systematically clarifying the data relevant to the problem, and secondly by stating the few basic principles that should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and to determine patient preferences, and thus assists the patient and the doctor in their choices.
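A minimal worked example of the "risky situation" case, with hypothetical probabilities and utilities (not taken from the article): each option's expected utility is the probability-weighted sum of its outcome utilities.

```python
# Hedged sketch: hypothetical treatments, probabilities and utilities for illustration only.
treatments = {
    "surgery":    [(0.80, 0.95), (0.15, 0.40), (0.05, 0.0)],   # (probability, utility) pairs
    "medication": [(0.60, 0.85), (0.35, 0.60), (0.05, 0.2)],
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

for name, outcomes in treatments.items():
    print(f"{name:10s}: expected utility = {expected_utility(outcomes):.3f}")
# Under the risk model, the option with the highest expected utility is preferred;
# patient-specific utilities can replace these assumed values.
```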

Keywords: decision making, medicine, treatment strategies, patient

Procedia PDF Downloads 568
1409 Production of Premium Quality Cinnamon Bark Powder Using Cryogenic Grinding

Authors: Monika R. Bhoi, R. F. Sutar, Bhaumik B. Patel

Abstract:

The objective of this research is to obtain premium quality cinnamon bark powder through cryogenic grinding technology. The effects of grinding temperature (0, -20, -40, -60, -80 and -100 ˚C), feed rate (8, 9 and 10 kg/h), and sieve size (0.8, 1.0 and 1.5 mm) were evaluated with respect to grinding time, volatile oil content, particle size, energy consumption, and liquid nitrogen consumption. Optimization of the cryogenic grinding process parameters to obtain premium quality cinnamon bark powder was carried out using a three-factor completely randomized design. The optimization revealed that grinding cinnamon bark at -80 ˚C using a 0.8 mm sieve and a 10 kg/h feed rate resulted in premium quality cinnamon bark powder containing 3.01% volatile oil. In addition, volatile oil retention in the cryogenically ground powder was 88.23%, whereas the control (ambient grinding) retained 33.11%. A storage study of the premium quality cryogenically ground powder was carried out under accelerated storage conditions (38 ˚C and 90% R.H.). Cryogenically ground powder was found to be advantageous over conventionally ground powder for extended storage of ground cinnamon with retention of its nutritional quality. Hence, grinding spices at an optimally low cryogenic temperature is a promising technology for the economical production of premium quality powder.

Keywords: cinnamon bark, cryogenic grinding, feed rate, volatile oil

Procedia PDF Downloads 149
1408 Theoretical Study of Structural and Electronic Properties of Matlockite CaFX (X = I and Br) Compounds

Authors: Meriem Harmel, Houari Khachai

Abstract:

The full potential linearized augmented plane wave (FP-LAPW) method within density functional theory is applied to study, for the first time, the structural and electronic properties of CaFI and to compare them with those of CaFCl and CaFBr, all compounds belonging to the tetragonal PbFCl structure group with space group P4/nmm. We used the generalized gradient approximation (GGA), based on exchange-correlation energy optimization, to calculate the total energy, and also the Engel-Vosko GGA formalism, which optimizes the corresponding potential for band structure calculations. Ground state properties such as the lattice parameters, c/a ratio, bulk modulus, pressure derivative of the bulk modulus and cohesive energy are calculated, as well as the optimized internal parameters obtained by relaxing the atomic positions along the force directions. The variations of the calculated interatomic distances and angles between different atomic bonds are discussed. CaFCl was found to have a direct band gap, whereas CaFBr and CaFI have indirect band gaps. From the computed bands, all three materials are found to be insulators, with band gaps of 6.28, 5.46, and 4.50 eV, respectively. We also calculated the valence charge density and the total density of states at the equilibrium volume for each compound. The results are in reasonable agreement with the available experimental data.
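As a hedged illustration of the ground-state workflow mentioned above (synthetic energies, not the paper's data), the bulk modulus and its pressure derivative are typically obtained by fitting total energies E(V) to the Birch-Murnaghan equation of state:

```python
# Hedged sketch: synthetic E(V) points standing in for FP-LAPW total energies (arbitrary units).
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(V, E0, V0, B0, B0p):
    eta = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9.0 * V0 * B0 / 16.0 * ((eta - 1.0) ** 3 * B0p
                                        + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta))

V = np.linspace(55.0, 75.0, 9)
E = birch_murnaghan(V, -100.0, 65.0, 0.5, 4.5) + np.random.default_rng(0).normal(0, 1e-4, V.size)

popt, _ = curve_fit(birch_murnaghan, V, E, p0=[E.min(), V.mean(), 1.0, 4.0])
E0, V0, B0, B0p = popt
print(f"V0 = {V0:.2f}, B0 = {B0:.3f} (energy/volume units), B0' = {B0p:.2f}")
```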

Keywords: DFT, matlockite, structural properties, electronic structure

Procedia PDF Downloads 302
1407 Production and Leftovers Usage Policies to Minimize Food Waste under Uncertain and Correlated Demand

Authors: Esma Birisci, Ronald McGarvey

Abstract:

One of the common problems in the food service industry is demand uncertainty. This research presents a multi-criteria optimization approach to identify the efficient frontier of points lying between the minimum-waste and minimum-shortfall solutions in an uncertain demand environment. It also addresses correlation across demands for items (e.g., hamburgers are often demanded with french fries). Reducing overproduction food waste (and its corresponding environmental impacts) and an aversion to shortfalls (leaving some customers hungry) need to be considered as two conflicting objectives in an all-you-care-to-eat food service operation. We identify optimal production adjustments relative to demand forecasts, demand thresholds for utilization of leftovers, and percentages of demand to be satisfied by leftovers, considering two alternative metrics for overproduction waste: mass and greenhouse gas emissions. Demand uncertainty and demand correlations are addressed using a kernel density estimation approach. A statistical analysis of the changes in decision variable values across each of the efficient frontiers can then be performed to identify the key variables that could be modified to reduce the amount of wasted food at a minimal increase in shortfalls. We illustrate our approach with an application to empirical data from Campus Dining Services operations at the University of Missouri.
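A minimal sketch, assuming synthetic correlated demand rather than the Campus Dining data: sweeping a production level against sampled demand traces the waste-versus-shortfall trade-off that underlies the efficient frontier.

```python
# Hedged sketch: synthetic correlated demand for two items, not the study's empirical data.
import numpy as np

rng = np.random.default_rng(0)
mean, cov = [200.0, 180.0], [[400.0, 300.0], [300.0, 380.0]]       # servings; positive correlation
demand = rng.multivariate_normal(mean, cov, size=2000).clip(min=0.0)

print("production adj. | expected waste | expected shortfall")
for adj in np.linspace(0.8, 1.3, 6):            # production as a multiple of the forecast (mean)
    production = adj * np.array(mean)
    waste = np.maximum(production - demand, 0.0).sum(axis=1).mean()
    shortfall = np.maximum(demand - production, 0.0).sum(axis=1).mean()
    print(f"{adj:14.2f} | {waste:14.1f} | {shortfall:18.1f}")
# Plotting the (waste, shortfall) pairs and keeping the non-dominated ones yields an
# efficient frontier analogous to the one described in the abstract.
```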

Keywords: environmental studies, food waste, production planning, uncertain and correlated demand

Procedia PDF Downloads 354
1406 Assessment of Pier Foundations for Onshore Wind Turbines in Non-cohesive Soil

Authors: Mauricio Terceros, Jann-Eike Saathoff, Martin Achmus

Abstract:

In non-cohesive soil, onshore wind turbines are often found on shallow foundations with a circular or octagonal shape. For the current generation of wind turbines, shallow foundations with very large breadths are required. The foundation support costs thus represent a considerable portion of the total construction costs. Therefore, an economic optimization of the type of foundation is highly desirable. A conceivable alternative foundation type would be a pier foundation, which combines the load transfer over the foundation area at the pier base with the transfer of horizontal loads over the shaft surface of the pier. The present study aims to evaluate the load-bearing behavior of a pier foundation based on comprehensive parametric studies. Thereby, three-dimensional numerical simulations of both pier and shallow foundations are developed. The evaluation of the results focuses on the rotational stiffnesses of the proposed soil-foundation systems. In the design, the initial rotational stiffness is decisive for consideration of natural frequencies, whereas the rotational secant stiffness for a maximum load is decisive for serviceability considerations. A systematic analysis of the results at different load levels shows that the application of the typical pier foundation is presumably limited to relatively small onshore wind turbines.
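For orientation only (the study itself uses three-dimensional finite-element simulations), a classical elastic half-space formula gives a first-order estimate of the initial rocking stiffness of a rigid circular shallow foundation; all parameter values below are assumptions.

```python
# Hedged sketch: classical rocking stiffness K_rot = 8*G*R^3 / (3*(1 - nu)) of a rigid circular
# footing on an elastic half-space, not the paper's numerical model. Parameters are assumed.
def rocking_stiffness(G_pa, radius_m, poisson):
    return 8.0 * G_pa * radius_m ** 3 / (3.0 * (1.0 - poisson))

G = 60e6        # shear modulus of a medium-dense sand, Pa (assumed)
nu = 0.30       # Poisson's ratio (assumed)
for R in [8.0, 10.0, 12.0]:    # foundation radii typical of onshore turbines (assumed)
    K = rocking_stiffness(G, R, nu)
    print(f"R = {R:4.1f} m  ->  K_rot ~ {K / 1e9:.1f} GNm/rad")
```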

Keywords: onshore wind foundation, pier foundation, rotational stiffness of soil-foundation system, shallow foundation

Procedia PDF Downloads 142
1405 Phosphorus Recovery Optimization in Microbial Fuel Cell

Authors: Abdullah Almatouq

Abstract:

Understanding the impact of key operational variables on concurrent energy generation and phosphorus recovery in a microbial fuel cell is required to improve the process and reduce operational costs. In this study, a full factorial design (FFD) and central composite designs (CCD) were employed to identify the effect of influent COD concentration and cathode aeration flow rate on energy generation and phosphorus (P) recovery, and to optimize MFC power density and P recovery. Results showed that the influent chemical oxygen demand (COD) concentration and the cathode aeration flow rate had a significant effect on power density, coulombic efficiency, phosphorus precipitation efficiency and phosphorus precipitation rate at the cathode. P precipitation was negatively affected by the generated current during the batch duration, and the generated energy was reduced as struvite precipitated on the cathode surface, which might obstruct the mass transfer of ions and oxygen. A response surface model was used to predict the optimum operating conditions, which resulted in a maximum power density of 184 mW/m² and a phosphorus precipitation efficiency of 84%, corresponding to COD = 1700 mg/L and an aeration flow rate of 210 mL/min. The findings highlight the importance of the operational conditions for energy generation and phosphorus recovery.
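A hedged sketch of the response-surface step, using synthetic observations rather than the MFC measurements: a two-factor quadratic model is fitted by least squares and its stationary point is located.

```python
# Hedged sketch: synthetic power-density observations, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
cod = rng.uniform(800, 2600, 20)        # influent COD, mg/L
air = rng.uniform(100, 320, 20)         # cathode aeration, mL/min
# synthetic response peaking near COD ~ 1700 mg/L and aeration ~ 210 mL/min
power = 184 - 0.00008 * (cod - 1700) ** 2 - 0.004 * (air - 210) ** 2 + rng.normal(0, 2, 20)

# quadratic model: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(cod), cod, air, cod * air, cod ** 2, air ** 2])
beta, *_ = np.linalg.lstsq(X, power, rcond=None)

# stationary point: set the gradient of the fitted quadratic to zero and solve
A = np.array([[2 * beta[4], beta[3]], [beta[3], 2 * beta[5]]])
b = -np.array([beta[1], beta[2]])
cod_opt, air_opt = np.linalg.solve(A, b)
print(f"fitted optimum: COD ~ {cod_opt:.0f} mg/L, aeration ~ {air_opt:.0f} mL/min")
```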

Keywords: energy, microbial fuel cell, phosphorus, struvite

Procedia PDF Downloads 142
1404 Analytical Solutions for Tunnel Collapse Mechanisms in Circular Cross-Section Tunnels under Seepage and Seismic Forces

Authors: Zhenyu Yang, Qiunan Chen, Xiaocheng Huang

Abstract:

Reliable prediction of tunnel collapse remains a prominent challenge in civil engineering. In this study, leveraging the nonlinear Hoek-Brown failure criterion and the upper-bound theorem of limit analysis, an analytical solution for the collapse surface of shallowly buried circular tunnels was derived, taking into account the coupled effects of surface loads and pore water pressures. Initially, surface loads and pore water pressures were introduced as external force terms, and equating the internal energy dissipation rate to the rate of external work yielded the objective function. Subsequently, the variational method was employed for optimization, and the outcomes were juxtaposed with previous research findings. Furthermore, we utilized the derived equations to systematically analyze the influence of various rock mass parameters on the collapse shape and extent. To validate our analytical solutions, a comparison with prior studies was executed; the corroboration underscored the efficacy of the proposed methodology, offering valuable insights for collapse risk assessment in practical engineering applications.
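For reference, a minimal evaluation of the nonlinear Hoek-Brown criterion named above, with assumed rock-mass constants (not the paper's values): the strength envelope is sigma_1 = sigma_3 + sigma_ci·(m_b·sigma_3/sigma_ci + s)^a.

```python
# Hedged sketch: generic Hoek-Brown strength envelope with assumed constants,
# shown only to make explicit the failure criterion the limit analysis builds on.
sigma_ci = 30.0                 # intact rock uniaxial compressive strength, MPa (assumed)
m_b, s, a = 2.0, 0.004, 0.51    # Hoek-Brown rock-mass constants (assumed)

def sigma1_at_failure(sigma3):
    return sigma3 + sigma_ci * (m_b * sigma3 / sigma_ci + s) ** a

for sigma3 in [0.0, 0.5, 1.0, 2.0, 4.0]:    # minor principal stress, MPa
    print(f"sigma_3 = {sigma3:4.1f} MPa  ->  sigma_1 at failure = {sigma1_at_failure(sigma3):5.2f} MPa")
```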

Keywords: tunnel roof stability, analytical solution, hoek–brown failure criterion, limit analysis

Procedia PDF Downloads 67
1403 Alternate Methods to Visualize 2016 U.S. Presidential Election Result

Authors: Hong Beom Hur

Abstract:

Politics in America is polarized, and the best illustration of this is the 2016 presidential election result map. States with megacities like California, New York, Illinois, Virginia, and others are marked blue to signify the color of the Democratic party; states located inland and in the south, like Texas, Florida, Tennessee, Kansas and others, are marked red to signify the color of the Republican party. Such a stark contrast between the two colors, combined with the geolocation and borderline of each state, conveys one central message: America is divided into two colors, between urban Democrats and rural Republicans. This paper seeks to question that visualization by pointing out its limitations and searching for alternative ways to visualize the 2016 election result. One such limitation is that the geolocation of each state and the state borderlines hide population density; as a result, the election result map does not convey the fact that Clinton won the popular vote, and it only accentuates the voting patterns of urban and rural states. The paper examines whether an alternative narrative can be observed by scaling the size of each state by its population and adjusting the state borderlines according to this normalization. Yet another alternative narrative may be reached by scaling each state by its number of electoral college votes and visualizing that number. Other alternatives are discussed but not implemented in visualization; such methods include dividing the land of America into about 120 million cubes, each representing a voter, or into 300 million cubes representing the whole population. By exploring these alternative methods to visualize the 2016 election map, the public may be able to question whether it is possible to be free from the divide-and-conquer narrative when interpreting the election map and to look at both parties as a story of the United States of America.
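A hedged sketch of one proposed encoding, using a toy subset of states (not the paper's implementation): each state is drawn as a circle whose area is proportional to its electoral votes rather than its land area.

```python
# Hedged sketch: a handful of states with approximate coordinates; marker area encodes
# 2016 electoral votes instead of land area.
import matplotlib.pyplot as plt

# (state, approx. longitude, approx. latitude, electoral votes 2016, winner)
states = [
    ("CA", -119.4, 36.8, 55, "D"), ("TX", -99.9, 31.0, 38, "R"),
    ("NY",  -75.0, 43.0, 29, "D"), ("FL", -81.5, 27.7, 29, "R"),
    ("IL",  -89.0, 40.0, 20, "D"), ("KS", -98.5, 38.5,  6, "R"),
]

for name, lon, lat, ev, winner in states:
    plt.scatter(lon, lat, s=ev * 40,       # marker area proportional to electoral votes
                color="tab:blue" if winner == "D" else "tab:red", alpha=0.7)
    plt.annotate(name, (lon, lat), ha="center", va="center")

plt.xlabel("longitude"); plt.ylabel("latitude")
plt.title("Electoral votes encoded by area (illustrative subset)")
plt.show()
```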

Keywords: 2016 U.S. presidential election, data visualization, population scale, geo-political

Procedia PDF Downloads 106
1402 Optimization of Sequential Thermophilic Bio-Hydrogen/Methane Production from Mono-Ethylene Glycol via Anaerobic Digestion: Impact of Inoculum to Substrate Ratio and N/P Ratio

Authors: Ahmed Elreedy, Ahmed Tawfik

Abstract:

This investigation aims to assess the effect of the inoculum to substrate ratio (ISR) and the nitrogen to phosphorous balance on simultaneous biohydrogen and methane production from the anaerobic decomposition of mono-ethylene glycol (MEG). Different ISRs were applied in the range between 2.65 and 13.23 gVSS/gCOD, whereas the tested N/P ratios were varied from 4.6 to 8.5, both under thermophilic conditions (55°C). The maximum obtained methane and hydrogen yields (MY and HY) of 151.86±10.8 and 22.27±1.1 mL/gCODinitial were recorded at ISRs of 5.29 and 3.78 gVSS/gCOD, respectively. The ammonification process, in terms of net ammonia produced, was found to be ISR and COD/N ratio dependent, reaching its peak value of 515.5±31.05 mgNH4-N/L at an ISR of 13.23 gVSS/gCOD and a COD/N ratio of 11.56. The optimum HY was enhanced by more than 1.45-fold when the N/P ratio declined from 8.5 to 4.6, whereas the MY was improved 1.6-fold when the N/P ratio increased from 4.6 to 5.5, with no significant impact at an N/P ratio of 8.5. The results obtained revealed that methane production was strongly influenced by initial ammonia, compared to initial phosphate. Likewise, ammonia generation markedly deteriorated from 535.25±41.5 to 238.33±17.6 mgNH4-N/L with increasing N/P ratio from 4.6 to 8.5. A kinetic study using the modified Gompertz equation was successfully fitted to the experimental outputs (R² > 0.9761).
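A minimal sketch of the kinetic step, with synthetic data in place of the measured gas volumes: the modified Gompertz model H(t) = P·exp(−exp(Rm·e/P·(λ − t) + 1)) is fitted by nonlinear least squares.

```python
# Hedged sketch: synthetic cumulative gas-production data, not the study's measurements.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, P, Rm, lam):
    # P: ultimate yield, Rm: maximum production rate, lam: lag time
    return P * np.exp(-np.exp(Rm * np.e / P * (lam - t) + 1.0))

t = np.linspace(0, 120, 25)                                        # time, h
data = gompertz(t, 150.0, 4.0, 10.0) + np.random.default_rng(0).normal(0, 2, t.size)

(P, Rm, lam), _ = curve_fit(gompertz, t, data, p0=[140.0, 3.0, 5.0])
residuals = data - gompertz(t, P, Rm, lam)
r2 = 1 - residuals.var() / data.var()
print(f"P = {P:.1f} mL/gCOD, Rm = {Rm:.2f} mL/gCOD/h, lambda = {lam:.1f} h, R^2 = {r2:.4f}")
```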

Keywords: mono-ethylene glycol, biohydrogen and methane, inoculum to substrate ratio, nitrogen to phosphorous balance, ammonification

Procedia PDF Downloads 365
1401 Effect of Progressive Type-I Right Censoring on Bayesian Statistical Inference of Simple Step–Stress Acceleration Life Testing Plan under Weibull Life Distribution

Authors: Saleem Z. Ramadan

Abstract:

This paper discusses the effects of using progressive Type-I right censoring on the design of a simple step-stress accelerated life test, using a Bayesian approach, for Weibull life products under the assumption of the cumulative exposure model. The optimization criterion used in this paper is to minimize the expected pre-posterior variance of the pth percentile time to failure. The model variables are the stress changing time and the stress value for the first step. A comparison between conventional and progressive Type-I right censoring is provided. The results show that progressive Type-I right censoring reduces the cost of testing at the expense of test precision when the sample size is small. Moreover, the results show that using strong priors or a large sample size reduces the sensitivity of the test precision to the censoring proportion. Hence, progressive Type-I right censoring is recommended in these cases, as it reduces the cost of the test without greatly affecting its precision. The results also show that using direct or indirect priors affects the precision of the test.
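A hedged simulation sketch (illustrative parameters, not the paper's optimal design) of how progressive Type-I right censoring thins a Weibull life test: a fraction of the surviving units is withdrawn at pre-fixed times and the remainder are censored when the test ends.

```python
# Hedged sketch: assumed Weibull parameters, censoring times and withdrawal fraction.
import numpy as np

rng = np.random.default_rng(0)
shape, scale, n = 1.8, 1000.0, 40                 # assumed Weibull shape, scale and sample size
inspection_times = [300.0, 600.0, 900.0]          # pre-fixed censoring times (test ends at the last one)
withdraw_fraction = 0.2                           # proportion of survivors removed at each inspection

alive = np.sort(scale * rng.weibull(shape, size=n))
observed, censored = [], []

for t in inspection_times:
    failed, alive = alive[alive <= t], alive[alive > t]
    observed.extend(failed.tolist())                        # exact failure times recorded
    k = int(round(withdraw_fraction * alive.size))          # progressive withdrawal of survivors
    if k > 0:
        drop = rng.choice(alive.size, size=k, replace=False)
        censored.extend([t] * k)
        alive = np.delete(alive, drop)
censored.extend([inspection_times[-1]] * alive.size)        # survivors censored when the test ends

print(f"observed failures: {len(observed)}, censored units: {len(censored)}")
```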

Keywords: reliability, accelerated life testing, cumulative exposure model, Bayesian estimation, progressive type-I censoring, Weibull distribution

Procedia PDF Downloads 493
1400 A Data-Driven Agent Based Model for the Italian Economy

Authors: Michele Catalano, Jacopo Di Domenico, Luca Riccetti, Andrea Teglio

Abstract:

We develop a data-driven agent-based model (ABM) for the Italian economy and calibrate its initial conditions and parameters. As a preliminary step, we replicate the Monte Carlo simulation for the Austrian economy. Then, we evaluate the dynamic properties of the model: the long-run equilibrium and the allocative efficiency, in terms of the disequilibrium patterns arising in the search-and-matching processes for final goods, capital, intermediate goods, and credit markets. In this perspective, we use a randomized initial condition approach. We perform a robustness analysis, perturbing the system for different parameter setups. We explore the empirical properties of the model using a rolling-window forecast exercise from 2010 to 2022 to observe the model's forecasting ability in the wake of the COVID-19 pandemic. We also analyze the properties of the model with different numbers of agents, that is, with different scales of the model compared to the real economy. The model generally displays transient dynamics that properly fit macroeconomic data in terms of forecasting ability. We stress the model with a large set of shocks, namely interest rate policy, fiscal policy, and exogenous factors such as external foreign demand for exports; in this way, we can explore the most exposed sectors of the economy. Finally, we modify the technology mix of the various sectors and, consequently, the underlying input-output sectoral interdependence, to stress the economy and observe the long-run projections. In this way, the model can generate endogenous crises driven by the implied structural change, technological unemployment, and a potential lack of aggregate demand, creating the conditions for cyclical endogenous crises reproduced in this artificial economy.

Keywords: agent-based models, behavioral macro, macroeconomic forecasting, micro data

Procedia PDF Downloads 53
1399 SAR and B₁ Considerations for Multi-Nuclear RF Body Coils

Authors: Ria Forner

Abstract:

Introduction: Due to increases in the SNR at 7T and above, it becomes more favourable to make use of X-nuclear imaging. Integrated body coils tuned to 120 MHz for 31P, 79 MHz for 23Na, and 75 MHz for 13C at 7T were simulated with a human male, female, or child body model to assess strategies of use for metabolic MR imaging in the body. Methods: B1 and SAR efficiencies in the heart, liver, spleen, and kidneys were assessed using numerical simulations over the three frequencies with phase shimming. Results: B1+ efficiency is highly variable over the different organs, particularly at the highest frequency; however, local SAR efficiency remains relatively constant over the frequencies in all subjects. Although the optimal phase settings vary, one generic phase setting can be identified for each frequency at which the penalty in B1+ is at most 10%. Discussion: The simulations provide practical strategies for power optimization, B1 management, and maintaining safety. As expected, the B1 field is similar at 75 MHz and 79 MHz, but reduced at 120 MHz. However, the B1 remains relatively constant when normalised by the square root of the peak local SAR. This is in contrast to generalized SAR considerations for 1H MRI at different field strengths, which are defined by global SAR instead. Conclusion: Although the B1 decreases with frequency, SAR efficiency remains constant throughout the investigated frequency range. It is possible to shim the body coil to obtain a maximum of 10% extra B1+ in a specific organ in a body when compared to a generic setting.
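A minimal sketch of the phase-shimming idea, using random complex B1+ contributions instead of the simulated body models: aligning the per-channel phases maximizes the combined |B1+| in a target organ, which can then be compared against a generic setting.

```python
# Hedged sketch: random complex per-channel B1+ values stand in for the simulated field maps.
import numpy as np

rng = np.random.default_rng(0)
n_channels = 8
# complex per-channel B1+ contribution averaged over a hypothetical target organ
b1_organ = rng.normal(size=n_channels) + 1j * rng.normal(size=n_channels)

generic_phases = np.zeros(n_channels)          # a generic (zero-phase) setting
optimal_phases = -np.angle(b1_organ)           # phases that bring all channels into alignment

def combined_b1(phases):
    return abs(np.sum(b1_organ * np.exp(1j * phases)))

gain = combined_b1(optimal_phases) / combined_b1(generic_phases)
print(f"organ-specific shim gives {100 * (gain - 1):.0f}% more |B1+| than the generic setting")
```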

Keywords: birdcage, multi-nuclear, B1 shimming, 7 Tesla MRI, liver, kidneys, heart, spleen

Procedia PDF Downloads 45
1398 Adsorption of Atmospheric Gases Using Atomic Clusters

Authors: Vidula Shevade, B. J. Nagare, Sajeev Chacko

Abstract:

First-principles simulation, meaning density functional theory (DFT) calculations with plane waves and pseudopotentials, has become a prized technique in condensed matter theory. Nanoparticles (NPs) are known to possess good catalytic activities, especially for molecules such as CO and O₂. Among the metal NPs, aluminium-based NPs are widely known for their catalytic properties. Aluminium is a lightweight chemical element, abundant in the earth's crust, with excellent electrical and thermal properties. Aluminium NPs, when added to solid rocket fuel, help improve the combustion speed and considerably increase combustion heat and combustion stability. Adding aluminium NPs into normal Al/Al₂O₃ powder improves the sintering processes of the ceramics, with high heat transfer performance, increased density, and enhanced thermal conductivity of the sinter. We used the VASP and Gaussian 03 packages to compute the geometries, electronic structure, and bonding properties of Al₁₂Ni as well as its interaction with O₂ and CO molecules. Several MD simulations were carried out using VASP at various temperatures, from which hundreds of structures were optimized, leading to 24 unique structures. These structures were then further optimized with the Gaussian package. The lowest energy structure of Al₁₂Ni has been reported to be a singlet; however, through our extensive search, we found a triplet state to be lower in energy. In our structure, the Ni atom is found on the surface, which gives the non-zero magnetic moment. Incidentally, O₂ and CO molecules are also triplet in nature, due to which the Al₁₂Ni cluster is likely to facilitate the oxidation process of the CO molecule. Our results show that the most favourable site for the CO molecule is the Ni atom, and that for the O₂ molecule it is the Al atom nearest to the Ni atom. The Al₁₂Ni-O₂ and Al₁₂Ni-CO structures were visualized using VMD. The triplet electronic configuration of the Al₁₂Ni nanocluster indicates it to be a potential candidate as a catalyst for the oxidation of CO molecules.

Keywords: catalyst, gaussian, nanoparticles, oxidation

Procedia PDF Downloads 82
1397 Anti-Oxidant and Anti-Cancer Activity of Helix aspersa Aqueous Extract

Authors: Ibtissem El Ouar, Cornelia Braicu, Dalila Naimi, Alexendru Irimie, Ioana Berindan-Neagoe

Abstract:

Helix aspersa, the garden snail, is a large land snail widely found in Mediterranean countries and one of the most consumed species in the west of Algeria. It is commonly used in zootherapy to purify blood and to treat cardiovascular diseases and liver problems. The aim of our study is to investigate the antitumor activity of an aqueous extract of Helix aspersa, prepared by the traditional method, on Hs578T, a triple-negative breast cancer cell line. Firstly, the free radical scavenging activity of the H. aspersa extract was assessed by measuring its capability to scavenge the radical 2,2-diphenyl-1-picrylhydrazyl (DPPH), as well as its ability to reduce ferric ion in the FRAP (ferric reducing ability) assay. The cytotoxic effect of the H. aspersa extract against Hs578T cells was evaluated by the MTT test (3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide), while the mode of cell death induced by the extract was determined by fluorescence microscopy using the acridine orange/ethidium bromide (AO/EB) probe. The level of TNFα was also measured in the cell medium by ELISA. The results suggest that the H. aspersa extract has antioxidant activity, especially at high concentrations; it can reduce the DPPH radical and ferric ion. The MTT test shows that the H. aspersa extract has a strong cytotoxic effect against breast cancer cells, with an IC50 value corresponding to a 1% dilution of the crude extract. Moreover, AO/EB staining shows that TNFα-induced necrosis is the main form of cell death induced by the extract. In conclusion, the present study may open new perspectives in the search for new natural anticancer drugs.

Keywords: breast cancer, Helix aspersa, Hs578t cell line, necrosis

Procedia PDF Downloads 403
1396 Software Development to Empowering Digital Libraries with Effortless Digital Cataloging and Access

Authors: Abdul Basit Kiani

Abstract:

The software for the digital library system is a cutting-edge solution designed to revolutionize the way libraries manage and provide access to their vast collections of digital content. This advanced software leverages the power of technology to offer a seamless and user-friendly experience for both library staff and patrons. By implementing this software, libraries can efficiently organize, store, and retrieve digital resources, including e-books, audiobooks, journals, articles, and multimedia content. Its intuitive interface allows library staff to effortlessly manage cataloging, metadata extraction, and content enrichment, ensuring accurate and comprehensive access to digital materials. For patrons, the software offers a personalized and immersive digital library experience. They can easily browse the digital catalog, search for specific items, and explore related content through intelligent recommendation algorithms. The software also facilitates seamless borrowing, lending, and preservation of digital items, enabling users to access their favorite resources anytime, anywhere, on multiple devices. With robust security features, the software ensures the protection of intellectual property rights and enforces access controls to safeguard sensitive content. Integration with external authentication systems and user management tools streamlines the library's administration processes, while advanced analytics provide valuable insights into patron behavior and content usage. Overall, this software for the digital library system empowers libraries to embrace the digital era, offering enhanced access, convenience, and discoverability of their vast collections. It paves the way for a more inclusive and engaging library experience, catering to the evolving needs of tech-savvy patrons.

Keywords: software development, empowering digital libraries, digital cataloging and access, management system

Procedia PDF Downloads 57
1395 Qualitative and Quantitative Methods in Multidisciplinary Fields Collection Development

Authors: Hui Wang

Abstract:

Traditional collection building approaches are limited in breadth and scope and are not necessarily suitable for the development of multidisciplinary fields in the institutes of the Chinese Academy of Sciences. The increase in multidisciplinary research requires a viable approach to collection development in these libraries. This study uses qualitative and quantitative analysis to assess collections. The quantitative analysis consists of three levels of evaluation: realistic demand, potential demand and trend demand analysis. For each institute, three samples were selected: from the institute itself, from one or more top international institutes in closely related research fields, and from future research hotspots. Each sample contains an appropriate number of papers published in the last five years. Several keywords and the organization names were combined to search commercial databases and the institutional repositories, and the publishing information and the citations in the bibliographies of these papers were used to build the dataset. A weighted evaluation model and citation analysis were used to calculate the demand intensity index of every journal and book. A principal investigator selector and database traffic provide qualitative evidence describing demand frequency. Demand intensity, demand frequency and academic committee recommendations were comprehensively considered to recommend collection development, and collection gaps or weaknesses were ascertained by comparing the current collection with the recommended collection. This approach was applied in more than 80 institute libraries of the Chinese Academy of Sciences over the past three years. The evaluation results provided important evidence for collection building in the second year, and the latest user survey results showed that the updated collections' capacity to support research in multidisciplinary subject areas has increased significantly.

Keywords: citation analysis, collection assessment, collection development, quantitative analysis

Procedia PDF Downloads 194
1394 Speeding Up Lenia: A Comparative Study Between Existing Implementations and CUDA C++ with OpenGL Interop

Authors: L. Diogo, A. Legrand, J. Nguyen-Cao, J. Rogeau, S. Bornhofen

Abstract:

Lenia is a system of cellular automata with continuous states, space and time, which surprises not only with the emergence of interesting life-like structures but also with its beauty. This paper reports ongoing research on a GPU implementation of Lenia using CUDA C++ and OpenGL Interoperability. We demonstrate how CUDA as a low-level GPU programming paradigm allows optimizing performance and memory usage of the Lenia algorithm. A comparative analysis through experimental runs with existing implementations shows that the CUDA implementation outperforms the others by one order of magnitude or more. Cellular automata hold significant interest due to their ability to model complex phenomena in systems with simple rules and structures. They allow exploring emergent behavior such as self-organization and adaptation, and find applications in various fields, including computer science, physics, biology, and sociology. Unlike classic cellular automata which rely on discrete cells and values, Lenia generalizes the concept of cellular automata to continuous space, time and states, thus providing additional fluidity and richness in emerging phenomena. In the current literature, there are many implementations of Lenia utilizing various programming languages and visualization libraries. However, each implementation also presents certain drawbacks, which serve as motivation for further research and development. In particular, speed is a critical factor when studying Lenia, for several reasons. Rapid simulation allows researchers to observe the emergence of patterns and behaviors in more configurations, on bigger grids and over longer periods without annoying waiting times. Thereby, they enable the exploration and discovery of new species within the Lenia ecosystem more efficiently. Moreover, faster simulations are beneficial when we include additional time-consuming algorithms such as computer vision or machine learning to evolve and optimize specific Lenia configurations. We developed a Lenia implementation for GPU using the C++ and CUDA programming languages, and CUDA/OpenGL Interoperability for immediate rendering. The goal of our experiment is to benchmark this implementation compared to the existing ones in terms of speed, memory usage, configurability and scalability. In our comparison we focus on the most important Lenia implementations, selected for their prominence, accessibility and widespread use in the scientific community. The implementations include MATLAB, JavaScript, ShaderToy GLSL, Jupyter, Rust and R. The list is not exhaustive but provides a broad view of the principal current approaches and their respective strengths and weaknesses. Our comparison primarily considers computational performance and memory efficiency, as these factors are critical for large-scale simulations, but we also investigate the ease of use and configurability. The experimental runs conducted so far demonstrate that the CUDA C++ implementation outperforms the other implementations by one order of magnitude or more. The benefits of using the GPU become apparent especially with larger grids and convolution kernels. However, our research is still ongoing. We are currently exploring the impact of several software design choices and optimization techniques, such as convolution with Fast Fourier Transforms (FFT), various GPU memory management scenarios, and the trade-off between speed and accuracy using single versus double precision floating point arithmetic. 
The results will give valuable insights into the practice of parallel programming of the Lenia algorithm, and all conclusions will be thoroughly presented in the conference paper. The final version of our CUDA C++ implementation will be published on github and made freely accessible to the Alife community for further development.
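As a hedged reference point (a NumPy sketch, not the CUDA C++ implementation benchmarked in the paper), one Lenia update step consists of an FFT-based convolution with a ring-shaped kernel followed by a smooth growth mapping:

```python
# Hedged sketch: a single-channel Lenia step in NumPy with orbium-like parameter values;
# the paper's contribution is the GPU acceleration of this computation, not this code.
import numpy as np

N, R, dt = 256, 13, 0.1
mu, sigma = 0.15, 0.015                       # growth-function parameters

# ring kernel: a Gaussian bump over the normalized radius, zero outside radius R
y, x = np.ogrid[-N // 2 : N // 2, -N // 2 : N // 2]
r = np.sqrt(x * x + y * y) / R
kernel = np.exp(-((r - 0.5) ** 2) / (2 * 0.15 ** 2)) * (r < 1)
kernel /= kernel.sum()
kernel_fft = np.fft.fft2(np.fft.ifftshift(kernel))   # pre-compute for circular convolution

def lenia_step(world):
    potential = np.real(np.fft.ifft2(np.fft.fft2(world) * kernel_fft))       # convolution
    growth = 2.0 * np.exp(-((potential - mu) ** 2) / (2 * sigma ** 2)) - 1.0  # smooth growth
    return np.clip(world + dt * growth, 0.0, 1.0)

world = np.random.default_rng(0).random((N, N)) * (np.hypot(x, y) < 40)      # random blob seed
for _ in range(100):
    world = lenia_step(world)
print("mean activation after 100 steps:", float(world.mean()))
```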

Keywords: artificial life, cellular automaton, GPU optimization, Lenia, comparative analysis

Procedia PDF Downloads 13
1393 A Cloud-Based Spectrum Database Approach for Licensed Shared Spectrum Access

Authors: Hazem Abd El Megeed, Mohamed El-Refaay, Norhan Magdi Osman

Abstract:

Spectrum scarcity is a challenging obstacle in wireless communication systems. It hinders the introduction of innovative wireless services and technologies that require larger bandwidth compared to legacy technologies. In addition, the current worldwide allocation of radio spectrum bands is already congested and cannot afford additional squeezing or optimization to accommodate new wireless technologies. This challenge results from the cumulative contributions of different factors that are discussed later in this paper. One of these factors is the radio spectrum allocation policy currently governed by national regulatory authorities. Under this policy, a specified portion of the radio spectrum is allocated to a particular wireless service provider on an exclusive-utilization basis, according to technical specifications determined by the standards bodies of each Radio Access Technology (RAT). Dynamic spectrum access is a framework for flexible utilization of radio spectrum resources: there is no exclusive allocation of radio spectrum, and even public safety agencies can share their spectrum bands according to a governing policy and service level agreements. In this paper, we explore different methods for accessing the spectrum dynamically and their associated implementation challenges.

Keywords: licensed shared access, cognitive radio, spectrum sharing, spectrum congestion, dynamic spectrum access, spectrum database, spectrum trading, reconfigurable radio systems, opportunistic spectrum allocation (OSA)

Procedia PDF Downloads 408
1392 A Condition-Based Maintenance Policy for Multi-Unit Systems Subject to Deterioration

Authors: Nooshin Salari, Viliam Makis

Abstract:

In this paper, we propose a condition-based maintenance policy for multi-unit systems, considering the existence of economic dependency among units. We consider a system composed of N identical units, where each unit deteriorates independently. The deterioration process of each unit is modeled as a three-state continuous-time homogeneous Markov chain with two working states and a failure state. The average production rate of the units varies in different working states, and the demand rate of the system is constant. Units are inspected at equidistant time epochs, and the decision regarding performing maintenance is determined by the number of units in the failure state. If the total number of units in the failure state exceeds a critical level, maintenance is initiated: units in the failed state are replaced correctively, and units in deteriorated states are maintained preventively. Our objective is to determine the optimal number of failed units at which to initiate maintenance, minimizing the long-run expected average cost per unit time. The problem is formulated and solved in the semi-Markov decision process (SMDP) framework. A numerical example is developed to demonstrate the proposed policy, and a comparison with the corrective maintenance policy is presented.

Keywords: reliability, maintenance optimization, semi-Markov decision process, production

Procedia PDF Downloads 142
1391 Analysis of Influence of Geometrical Set of Nozzles on Aerodynamic Drag Level of a Hero’s Based Steam Turbine

Authors: Mateusz Paszko, Miroslaw Wendeker, Adam Majczak

Abstract:

High temperature waste energy offers a number of management options. The most common energy recuperation systems actually used to utilize energy from high temperature sources are steam turbines working in high pressure and high temperature closed cycles. Due to the high production costs of energy recuperation systems, especially of rotary turbine discs equipped with blades, currently used solutions are of limited use with waste energy sources at temperatures below 100 °C. This study presents the results of simulating the flow of water vapor in various configurations of flow ducts in a reaction steam turbine based on Hero's steam turbine. The simulation was performed using a numerical model and the ANSYS Fluent software. The computations were conducted with water vapor as the internal agent powering the turbine, which is fully safe for the environment in case of a device failure. The conclusions resulting from the numerical computations should allow for optimization of the flow duct geometries in order to achieve the greatest possible efficiency of the turbine. The obtained results are expected to be useful for further work on the final version of a low-drag steam turbine dedicated to low-cost energy recuperation systems.

Keywords: energy recuperation, CFD analysis, waste energy, steam turbine

Procedia PDF Downloads 195
1390 Leveraging Digital Transformation Initiatives and Artificial Intelligence to Optimize Readiness and Simulate Mission Performance across the Fleet

Authors: Justin Woulfe

Abstract:

Siloed logistics and supply chain management systems throughout the Department of Defense (DOD) has led to disparate approaches to modeling and simulation (M&S), a lack of understanding of how one system impacts the whole, and issues with “optimal” solutions that are good for one organization but have dramatic negative impacts on another. Many different systems have evolved to try to understand and account for uncertainty and try to reduce the consequences of the unknown. As the DoD undertakes expansive digital transformation initiatives, there is an opportunity to fuse and leverage traditionally disparate data into a centrally hosted source of truth. With a streamlined process incorporating machine learning (ML) and artificial intelligence (AI), advanced M&S will enable informed decisions guiding program success via optimized operational readiness and improved mission success. One of the current challenges is to leverage the terabytes of data generated by monitored systems to provide actionable information for all levels of users. The implementation of a cloud-based application analyzing data transactions, learning and predicting future states from current and past states in real-time, and communicating those anticipated states is an appropriate solution for the purposes of reduced latency and improved confidence in decisions. Decisions made from an ML and AI application combined with advanced optimization algorithms will improve the mission success and performance of systems, which will improve the overall cost and effectiveness of any program. The Systecon team constructs and employs model-based simulations, cutting across traditional silos of data, aggregating maintenance, and supply data, incorporating sensor information, and applying optimization and simulation methods to an as-maintained digital twin with the ability to aggregate results across a system’s lifecycle and across logical and operational groupings of systems. This coupling of data throughout the enterprise enables tactical, operational, and strategic decision support, detachable and deployable logistics services, and configuration-based automated distribution of digital technical and product data to enhance supply and logistics operations. As a complete solution, this approach significantly reduces program risk by allowing flexible configuration of data, data relationships, business process workflows, and early test and evaluation, especially budget trade-off analyses. A true capability to tie resources (dollars) to weapon system readiness in alignment with the real-world scenarios a warfighter may experience has been an objective yet to be realized to date. By developing and solidifying an organic capability to directly relate dollars to readiness and to inform the digital twin, the decision-maker is now empowered through valuable insight and traceability. This type of educated decision-making provides an advantage over the adversaries who struggle with maintaining system readiness at an affordable cost. The M&S capability developed allows program managers to independently evaluate system design and support decisions by quantifying their impact on operational availability and operations and support cost resulting in the ability to simultaneously optimize readiness and cost. This will allow the stakeholders to make data-driven decisions when trading cost and readiness throughout the life of the program. 
Finally, sponsors are available to validate product deliverables with efficiency and much higher accuracy than in previous years.

Keywords: artificial intelligence, digital transformation, machine learning, predictive analytics

Procedia PDF Downloads 143
1389 Technical and Economic Evaluation of Harmonic Mitigation from Offshore Wind Power Plants by Transmission Owners

Authors: A. Prajapati, K. L. Koo, F. Ghassemi, M. Mulimakwenda

Abstract:

In the UK, as the volume of non-linear loads connected to the transmission grid continues to rise steeply, harmonic distortion levels on the transmission network are becoming a serious concern for network owners and system operators. This paper outlines the findings of a study conducted to verify the proposal that harmonic mitigation could be optimized and managed economically and effectively at the transmission network level by the Transmission Owner (TO) instead of by the individual polluter connected to the grid. Harmonic mitigation studies were conducted on selected regions of the transmission network in England for recently connected offshore wind power plants in order to strategize and optimize selected harmonic filter options. The results, in terms of filter volume and capacity, were then compared against the mitigation measures adopted by the individual connections. Estimation ratios were developed based on the actually installed and the optimal proposed filters, and these ratios were then used to derive harmonic filter requirements for future contracted connections. The study concluded that a saving of 37% in filter volume/capacity could be achieved if the TO centrally manages harmonic mitigation instead of individual polluters installing their own mitigation solutions.

Keywords: C-type filter, harmonics, optimization, offshore wind farms, interconnectors, HVDC, renewable energy, transmission owner

Procedia PDF Downloads 148