Search results for: optimal volume
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5595

3705 The Influence of Forest Management Histories on Dead and Habitat Trees in the Old Growth Forest in Northern Iran

Authors: Kiomars Sefidi

Abstract:

Dead and habitat trees, such as fallen logs, snags, stumps, cracked trees and trees with loose bark, are regarded as important ecological components of forests on which many forest-dwelling species depend, yet their relation to management history in the Caspian forest has gone unreported. The aim of this research was to compare the amounts of dead and habitat trees in forests with historically different intensities of management: forests with a long-term history of management (PS) and a short-term history of management (NS), which were compared with a semi-virgin forest (GS). A total of 405 individual dead and habitat trees were recorded and measured at 109 sampling locations. ANOVA revealed that dead-tree volume across form and decay classes differed significantly among sites, and that dead-wood volume in the semi-virgin forest was significantly higher than in the managed sites. Comparing the amounts of dead and habitat trees in the three sites showed that dead-tree volume was related to management history and differed significantly among the three study sites. The numbers of habitat trees, including trees with cavities, cracks and loose bark, and fork-split trees, also varied significantly among sites, reaching their highest in the virgin site and their lowest in the site with the long-term history of management. It was concluded that forest management reduces the amount of dead and habitat trees. Management history affects the forest's ability to generate dead trees, especially large ones; managing this forest according to ecologically sustainable principles therefore requires a commitment to maintaining a stand structure that allows continued generation of dead trees across the full range of sizes.
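
A minimal sketch of the kind of one-way ANOVA reported above, comparing dead-wood volume across the three sites; the values below are illustrative placeholders, not the study data:

```python
import numpy as np
from scipy import stats

# Hypothetical dead-wood volumes (m^3/ha) per sampling location in each site (illustrative only)
ps = np.array([4.1, 3.8, 5.0, 4.6, 3.9])       # long-term managed
ns = np.array([6.2, 5.7, 6.9, 6.4, 5.9])       # short-term managed
gs = np.array([11.5, 12.8, 10.9, 13.2, 12.0])  # semi-virgin

# One-way ANOVA across the three sites
f_stat, p_value = stats.f_oneway(ps, ns, gs)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 -> volumes differ among sites
```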

Keywords: forest biodiversity, cracks trees, fork split trees, sustainable management, Fagus orientalis, Iran

Procedia PDF Downloads 552
3704 Recovery of Helicobacter Pylori from Stagnant and Moving Water Biofilms

Authors: Maryam Zafar, Sajida Rasheed, Imran Hashmi

Abstract:

Water, as an environmental reservoir, is reported to act as a habitat and transmission route for microaerophilic bacteria such as Helicobacter pylori. Biofilms are the predominant dwellings for bacteria growing in water and a protective reservoir for numerous pathogens, shielding them against harsh conditions such as shear stress, low carbon concentration, and less-than-optimal temperature. In this study, the influence of these and many other parameters on H. pylori was studied in stagnant and moving water biofilms in both surface and underground aquatic reservoirs. H. pylori was recovered from pipe cross-sections of different materials, such as polyvinyl chloride, polypropylene, and galvanized iron, taken from an urban water distribution network. Biofilm swabbed from the inner cross-sections was examined by molecular biology methods coupled with gene sequencing, and an H. pylori 16S rRNA peptide nucleic acid probe showed positive results for H. pylori presence. The study showed that pipe material affects biofilm growth, which in turn provides an additional survival mechanism for pathogens like H. pylori and raises public health concerns.

Keywords: biofilm, gene sequencing, helicobacter pylori, pipe materials

Procedia PDF Downloads 359
3703 Biosynthesis of L-Xylose from Xylitol Using a Dual Enzyme Cascade in Escherichia coli

Authors: Mesfin Angaw Tesfay

Abstract:

L-xylose is an important intermediate in the pharmaceutical industry, playing a key role in the production of various antiviral and anticancer drugs. Despite its significance, L-xylose is a rare and costly sugar with limited availability in nature. In recent years, enzymatic production methods have garnered considerable attention due to their benefits over conventional chemical synthesis. In this research, a dual enzyme cascade system was developed to synthesize L-xylose from an inexpensive substrate, xylitol. The study involved cloning and co-expressing two key genes: the L-fucose isomerase (L-fucI) gene from Escherichia coli K-12 and the xylitol-4-dehydrogenase (xdh) gene from Pantoea ananatis ATCC 43072 in Escherichia coli. The resulting recombinant cells, engineered with the PET28a-xdh/L-fucI vector, were able to effectively convert xylitol to L-xylose. The system showed optimal performance at 40°C and a pH of 10.0. Moreover, Zn²⁺ (7.5 mM) enhanced the catalytic activity by 1.34 times. This approach yielded 52.2 g/L of L-xylose from an initial 80 g/L xylitol concentration, with a 65% conversion efficiency and a productivity rate of 1.86. The study highlights a practical method for producing L-xylose from xylitol through a co-expression system carrying the L-fucI and xdh genes.
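
A quick check of the reported conversion figures; the mass ratio closely tracks the molar yield because xylitol (152 g/mol) and L-xylose (150 g/mol) have nearly identical molar masses, and the productivity unit of g/L/h is an assumption, as it is not stated explicitly above:

```python
# Reported values from the abstract
xylitol_initial_g_per_l = 80.0
l_xylose_final_g_per_l = 52.2
productivity = 1.86  # assumed to be in g/L/h

# Mass-based conversion efficiency (approximates the molar yield here)
conversion = l_xylose_final_g_per_l / xylitol_initial_g_per_l
print(f"Conversion efficiency ~ {conversion:.1%}")   # ~65%

# Implied run time if productivity is indeed g/L/h (illustrative assumption)
implied_time_h = l_xylose_final_g_per_l / productivity
print(f"Implied run time ~ {implied_time_h:.0f} h")
```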

Keywords: l-fucose isomerase, xylitol-4-dehydrogenase, l-xylose, xylitol, co-expression

Procedia PDF Downloads 23
3702 Using Computational Fluid Dynamics to Model and Design a Preventative Application for Strong Wind

Authors: Ming-Hwi Yao, Su-Szu Yang

Abstract:

Typhoons are one of the major types of disasters that affect Taiwan each year and cause severe damage to agriculture. Indeed, the damage exacted during a typical typhoon season can be up to $1 billion and accounts for nearly 75% of yearly agricultural losses. However, there is no consensus on how to reduce the damage caused by the strong winds and heavy precipitation engendered by typhoons. One suggestion is the use of windbreak nets, which are a low-cost and easy-to-use disaster mitigation strategy for crop production. In the present study, we conducted an evaluation to determine the optimal conditions of a windbreak net by using a computational fluid dynamics (CFD) model. This model may be used as a reference for crop protection. The CFD simulation was validated against windbreak nets of different mesh sizes and heights in the experimental area; thus, CFD is an efficient tool for evaluating the effectiveness of windbreak nets. Specifically, the effective wind protection length and height were found to be 6 and 1.3 times the length and height of the windbreak net, respectively. During a real typhoon, maximum wind gusts of 18 m s⁻¹ can be reduced to 4 m s⁻¹ by using a windbreak net that has a 70% blocking rate. In short, windbreak nets are significantly effective in protecting typhoon-affected areas.

Keywords: computational fluid dynamics, disaster, typhoon, windbreak net

Procedia PDF Downloads 191
3701 Enhanced Extra Trees Classifier for Epileptic Seizure Prediction

Authors: Maurice Ntahobari, Levin Kuhlmann, Mario Boley, Zhinoos Razavi Hesabi

Abstract:

For machine learning based epileptic seizure prediction, it is important for the model to be implemented in small implantable or wearable devices that can be used to monitor epilepsy patients; however, current state-of-the-art methods are complex and computationally intensive. We use Shapley Additive Explanation (SHAP) to find relevant intracranial electroencephalogram (iEEG) features and improve the computational efficiency of a state-of-the-art seizure prediction method based on the extra trees classifier while maintaining prediction performance. Results for a small contest dataset and a much larger dataset with continuous recordings of up to 3 years per patient from 15 patients yield better than chance prediction performance (p < 0.004). Moreover, while the performance of the SHAP-based model is comparable to that of the benchmark, the overall training and prediction time of the model has been reduced by a factor of 1.83. It can also be noted that the feature called zero crossing value is the best EEG feature for seizure prediction. These results suggest state-of-the-art seizure prediction performance can be achieved using efficient methods based on optimal feature selection.
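
A minimal sketch of SHAP-guided feature selection for an extra-trees seizure classifier; the synthetic data and feature count are placeholders, and the actual iEEG features (including the zero-crossing feature mentioned above) are not reproduced here:

```python
import numpy as np
import shap
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))       # placeholder for iEEG features
y = rng.integers(0, 2, size=500)     # placeholder preictal/interictal labels

# Fit the full model, then rank features by mean |SHAP| value
model = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X, y)
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Depending on the SHAP version this is a list of per-class arrays or a single array;
# collapse every axis except the feature axis to get a mean |SHAP| importance per feature.
sv = np.abs(np.asarray(shap_values))
feature_axis = [ax for ax, size in enumerate(sv.shape) if size == X.shape[1]][0]
importance = sv.mean(axis=tuple(ax for ax in range(sv.ndim) if ax != feature_axis))

# Retrain on the top-k features to cut computation while keeping performance
top_k = np.argsort(importance)[::-1][:10]
reduced = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X[:, top_k], y)
```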

Keywords: machine learning, seizure prediction, extra tree classifier, SHAP, epilepsy

Procedia PDF Downloads 111
3700 An Experimental Study of Downstream Structures on the Flow-Induced Vibrations Energy Harvester Performances

Authors: Pakorn Uttayopas, Chawalit Kittichaikarn

Abstract:

This paper presents an experimental investigation of the characteristics of an energy harvesting device exploiting flow-induced vibration in a wind tunnel. A stationary bluff body is connected to a downstream tip body via an aluminium cantilever beam. Various lengths of the aluminium cantilever beam and different shapes of the downstream tip body are considered. The results show that the characteristics of the energy harvester’s vibration depend on both the length of the aluminium cantilever beam and the shape of the downstream tip body. The highest ratio between vibration amplitude and bluff body diameter was found to be 1.39 for an energy harvester with a symmetrical triangular tip body and L/D1 = 5 at 9.8 m/s of flow speed (Re = 20077). Using this configuration, the electrical energy was extracted with a polyvinylidene fluoride (PVDF) piezoelectric beam with different load resistances, for which an optimal value could be found at each Reynolds number. The highest power output was found to be 3.19 µW at 9.8 m/s of flow speed (Re = 20077) and 27 MΩ of load resistance.

Keywords: downstream structures, energy harvesting, flow-induced vibration, piezoelectric material, wind tunnel

Procedia PDF Downloads 232
3699 Optimization in Friction Stir Processing Method with Emphasis on Optimized Process Parameters Laboratory Research

Authors: Atabak Rahimzadeh Ilkhch

Abstract:

Friction stir processing (FSP) is a promising thermo-mechanical processing technique that aims to change the microstructural and mechanical properties of materials in order to obtain high performance while reducing production time and cost. Many studies have focused on the microstructure of friction stir welded aluminum alloys. The main focus of this research is the grain size obtained in the weld zone. The second part focuses on the temperature distribution over the entire weld zone and its effects on the microstructure. There is also a need for further effort to obtain the optimal values of effective parameters, such as rotational speed, with respect to microstructure, and to use optimal tool design methods. The final results of this study present the variation of the structural and mechanical properties of the material produced by friction stir processing, and the effect of FSP and tensile testing on surface quality. In particular, this research addresses the FSP of AA-7020 aluminum and the variation of the ratio of rotational to translational speeds.

Keywords: friction stir processing, AA-7020, thermo-mechanical, microstructure, temperature

Procedia PDF Downloads 279
3698 A Novel Solution Methodology for Transit Route Network Design Problem

Authors: Ghada Moussa, Mamoud Owais

Abstract:

The Transit Route Network Design Problem (TrNDP) is the most important component of transit planning, and the overall cost of the public transportation system depends highly on it. The main purpose of this study is to develop a novel solution methodology for the TrNDP that goes beyond previous, traditionally sophisticated approaches. The novelty of the solution methodology adopted in this paper lies in the deterministic operators used to construct bus routes. The deterministic treatment of the TrNDP relies on linear and integer mathematical formulations that can be solved exactly with standard solvers. The solution methodology was tested on Mandl’s benchmark network problem. The test results showed that the methodology developed in this research is able to improve the given network solution in terms of the number of constructed routes, direct transit service coverage, transfer directness, and solution reliability. Although the set of routes resulting from the methodology could stand alone as a final, efficient solution for the TrNDP, it could also be used as an initial solution for meta-heuristic procedures to approach the global optimum. Based on the presented methodology, a more robust network optimization tool could be produced for public transportation planning purposes.

Keywords: integer programming, transit route design, transportation, urban planning

Procedia PDF Downloads 272
3697 Atmospheric Fluid Bed Gasification of Different Biomass Fuels

Authors: Martin Lisý, Marek Baláš, Michal Špiláček, Zdeněk Skála

Abstract:

This paper briefly describes the biomass types available in the Czech Republic and their growing quantities. A considerable part of the paper deals with the energy parameters of the most frequently utilized biomass types and the results of their gasification testing. Sixteen of the most exploited "Czech" woody plants and grasses were chosen. Proximate, elemental, and biochemical analyses, basic calorimetric values, ash composition, and ash characteristic temperatures were determined. Each biofuel was then tested by fluidized bed gasification. The essential part of this paper presents the results of the gasification experiments for the chosen biomass types. The operating conditions are described in detail, with emphasis on the particularities of the individual fuels, and the gas composition and impurity content are summarized. An essential difference was found between woody plants and grasses, both in terms of operating conditions and gas quality. The woody plants were evaluated as more suitable fuels for fluidized bed gasifiers. These results should significantly help in deciding which energy plants are suitable for cultivation and in selecting the optimal biomass-treatment technology.

Keywords: biomass growing, biomass types, gasification, biomass fuels

Procedia PDF Downloads 570
3696 Design of Geochemical Maps of Industrial City Using Gradient Boosting and Geographic Information System

Authors: Ruslan Safarov, Zhanat Shomanova, Yuri Nossenko, Zhandos Mussayev, Ayana Baltabek

Abstract:

Geochemical maps of the distribution of the polluting elements V, Cr, Mn, Co, Ni, Cu, Zn, Mo, Cd, and Pb over the territory of Pavlodar (Kazakhstan), an industrial hub, were designed. Soil samples were taken from 100 locations, and elemental analysis was performed using XRF. The obtained data were used to train a computational model based on a gradient boosting algorithm; the optimal model parameters and the loss function were selected. The computational model was then used to predict polluting-element concentrations at 1000 evenly distributed points, and geochemical maps were created from the predicted data. Additionally, the total pollution index Zc was calculated for each of the 1000 points, and its spatial distribution was visualized using GIS (QGIS). It was calculated that the largest share of the territory of Pavlodar (89.7%) belongs to the moderately hazardous category. The visualization of the obtained data allowed us to conclude that the main source of contamination is the industrial zones where the strategic metallurgical and refining plants are located.
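
A minimal sketch of the modelling step described above, using CatBoost for a few elements and a conventional form of the total pollution index; the coordinates, concentrations, and background values are placeholders, and the formula Zc = ΣKc − (n − 1) is the standard Saet index assumed here, not quoted from the paper:

```python
import numpy as np
from catboost import CatBoostRegressor

rng = np.random.default_rng(1)
elements = ["Pb", "Zn", "Cu"]                     # subset of the mapped elements, for illustration
background = {"Pb": 16.0, "Zn": 50.0, "Cu": 27.0}  # assumed background values (ppm), not from the paper

# Placeholder survey: 100 sampled locations with coordinates and XRF concentrations
coords = rng.uniform(0.0, 10.0, size=(100, 2))    # x, y in km
conc = {e: background[e] * (1 + rng.gamma(2.0, 0.4, 100)) for e in elements}

# Fit one gradient-boosting model per element and predict on a ~1000-point grid
gx, gy = np.meshgrid(np.linspace(0, 10, 32), np.linspace(0, 10, 32))
grid = np.column_stack([gx.ravel(), gy.ravel()])
pred = {}
for e in elements:
    m = CatBoostRegressor(iterations=500, depth=6, loss_function="RMSE", verbose=False)
    m.fit(coords, conc[e])
    pred[e] = m.predict(grid)

# Total pollution index Zc = sum of concentration coefficients Kc (clamped at 1) minus (n - 1)
kc = np.stack([np.maximum(pred[e] / background[e], 1.0) for e in elements])
zc = kc.sum(axis=0) - (len(elements) - 1)
# zc reshaped to gx.shape can then be exported (e.g., as CSV) for visualization in QGIS
```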

Keywords: Pavlodar, geochemical map, gradient boosting, CatBoost, QGIS, spatial distribution, heavy metals

Procedia PDF Downloads 80
3695 Energy Efficiency of Secondary Refrigeration with Phase Change Materials and Impact on Greenhouse Gases Emissions

Authors: Michel Pons, Anthony Delahaye, Laurence Fournaison

Abstract:

Secondary refrigeration consists of splitting large-size direct-cooling units into volume-limited primary cooling units complemented by secondary loops for transporting and distributing cold. Such a design reduces refrigerant leaks, which represent a source of greenhouse gases emitted into the atmosphere. However, inserting the secondary circuit between the primary unit and the users' heat exchangers (UHX) increases the energy consumption of the whole process, which induces an indirect emission of greenhouse gases. It is thus important to check whether that efficiency loss is sufficiently limited for the change to be globally beneficial to the environment. Among the likely secondary fluids, phase change slurries offer several advantages: they transport latent heat, they stabilize the heat exchange temperature, and the former evaporators can still be used as UHX. The temperature level can also be adapted to the desired cooling application. Herein, the slurry {ice in mono-propylene-glycol solution} (melting temperature Tₘ of 6°C) is considered for food preservation, and the slurry {mixed hydrate of CO₂ + tetra-n-butyl-phosphonium-bromide in aqueous solution of this salt + CO₂} (melting temperature Tₘ of 13°C) is considered for air conditioning. For the sake of thermodynamic consistency, the analysis encompasses the whole process, primary cooling unit plus secondary slurry loop, and the various properties of the slurries, including their non-Newtonian viscosity. The design of the whole process is optimized according to the properties of the chosen slurry and under explicit constraints. As a first constraint, all the units must deliver the same cooling power to the user. The other constraints concern the heat exchange areas, which are prescribed, and the flow conditions, which prevent deposition of the solid particles transported in the slurry and their agglomeration. Minimization of the total energy consumption leads to the optimal design. In addition, the results are analyzed in terms of exergy losses, which allows highlighting the couplings between the primary unit and the secondary loop. One important difference between the ice slurry and the mixed-hydrate one is the presence of gaseous carbon dioxide in the latter case. When the mixed-hydrate crystals melt in the UHX, CO₂ vapor is generated at a rate that depends on the phase change kinetics. The flow in the UHX and its heat and mass transfer properties are significantly modified. This effect has never been investigated before. Lastly, inserting the secondary loop between the primary unit and the users increases the temperature difference between the refrigerated space and the evaporator. This results in a loss of global energy efficiency and therefore in an increased energy consumption. The analysis shows that this loss of efficiency is not critical in the first case (Tₘ = 6°C), while the second case leads to more ambiguous results, partially because of the higher melting temperature. The consequences in terms of greenhouse gas emissions are also analyzed.

Keywords: exergy, hydrates, optimization, phase change material, thermodynamics

Procedia PDF Downloads 129
3694 Optimization of Monascus Orange Pigments Production Using pH-Controlled Fed-Batch Fermentation

Authors: Young Min Kim, Deokyeong Choe, Chul Soo Shin

Abstract:

Monascus pigments, commonly used as natural colorants in Asia, have many biological activities, such as cholesterol control, anti-obesity, anti-cancer, and anti-oxidant effects, that have recently been elucidated. In particular, amino acid derivatives of Monascus pigments are receiving much attention because they have higher biological activities than the original Monascus pigments. Previously, there have been two ways to produce amino acid derivatives: one-step production and two-step production. However, one-step production has low purity, and two-step production, consisting of precursor (orange pigment) fermentation followed by derivative synthesis, suffers from low productivity and growth rate during its precursor fermentation step. In this study, it was verified that pH is a key factor that affects the stability of orange pigments and the growth rate of Monascus. With an optimal pH profile obtained by pH-stat fermentation, we designed a precursor (orange pigment) fermentation process based on pH-controlled fed-batch fermentation. The final concentration of orange pigments in this process increased to 5.5 g/L, which is about 30% higher than the concentration produced by the previously used precursor fermentation step.

Keywords: cultivation process, fed-batch fermentation, monascus pigments, pH stability

Procedia PDF Downloads 297
3693 Development of Concurrent Engineering through the Application of Software Simulations of Metal Production Processing and Analysis of the Effects of Application

Authors: D. M. Eric, D. Milosevic, F. D. Eric

Abstract:

Concurrent engineering technologies are a modern concept in manufacturing engineering. One of the key goals in designing modern technological processes is further reduction of production costs, both in the prototype and the preparatory part, as well as during the serial production. Thanks to many segments of concurrent engineering, these goals can be accomplished much more easily. In this paper, we give an overview of the advantages of using modern software simulations in relation to the classical aspects of designing technological processes of metal deformation. Significant savings are achieved thanks to the electronic simulation and software detection of all possible irregularities in the functional-working regime of the technological process. In order for the expected results to be optimal, it is necessary that the input parameters are very objective and that they reliably represent the values of these parameters in real conditions. Since it is a metal deformation treatment here, the particularly important parameters are the coefficient of internal friction between the working material and the tools, as well as the parameters related to the flow curve of the processing material. The paper will give a presentation for the experimental determination of some of these parameters.

Keywords: production technologies, metal processing, software simulations, effects of application

Procedia PDF Downloads 234
3692 Optimizing Logistics for Courier Organizations with Considerations of Congestions and Pickups: A Courier Delivery System in Amman as Case Study

Authors: Nader A. Al Theeb, Zaid Abu Manneh, Ibrahim Al-Qadi

Abstract:

The traveling salesman problem (TSP) is a combinatorial integer optimization problem that asks: "What is the optimal route for a vehicle to traverse in order to deliver requests to a given set of customers?" It is widely used by package carriers' distribution centers. The main goal of applying the TSP in courier organizations is to minimize the time it takes the courier on each trip to deliver or pick up the shipments during a day. In this article, an optimization model is constructed to create a new TSP variant that optimizes routing in a courier organization while accounting for congestion in Amman, the capital of Jordan. Real data were collected by different methods and analyzed. Then, Concert Technology (CPLEX) was used to solve the proposed model for some randomly generated data instances and for the real collected data. In the end, the results showed a great improvement over the current trip times, and an economic study was conducted afterwards to quantify the impact of using such models.
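
A minimal sketch of the kind of TSP formulation solved here, written with PuLP and the Miller-Tucker-Zemlin subtour-elimination constraints; the paper uses Concert Technology/CPLEX, so PuLP and the random congestion-weighted travel times below are illustrative stand-ins:

```python
import numpy as np
import pulp

rng = np.random.default_rng(2)
n = 8                                            # depot = node 0 plus 7 delivery/pickup stops
base = rng.uniform(5, 30, size=(n, n))           # base travel times (minutes), illustrative
congestion = rng.uniform(1.0, 1.8, size=(n, n))  # illustrative congestion multipliers
t = base * congestion
np.fill_diagonal(t, 0)

prob = pulp.LpProblem("courier_tsp", pulp.LpMinimize)
arcs = [(i, j) for i in range(n) for j in range(n) if i != j]
x = pulp.LpVariable.dicts("x", arcs, cat="Binary")
u = pulp.LpVariable.dicts("u", range(n), lowBound=0, upBound=n - 1, cat="Continuous")

prob += pulp.lpSum(t[i][j] * x[i, j] for (i, j) in arcs)           # total congested travel time
for i in range(n):
    prob += pulp.lpSum(x[i, j] for j in range(n) if j != i) == 1   # leave each node once
    prob += pulp.lpSum(x[j, i] for j in range(n) if j != i) == 1   # enter each node once
for i in range(1, n):                                              # MTZ subtour elimination
    for j in range(1, n):
        if i != j:
            prob += u[i] - u[j] + n * x[i, j] <= n - 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
tour = [(i, j) for (i, j) in arcs if pulp.value(x[i, j]) > 0.5]
```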

Keywords: travel salesman problem, congestions, pick-up, integer programming, package carriers, service engineering

Procedia PDF Downloads 427
3691 Imputing Missing Data in Electronic Health Records: A Comparison of Linear and Non-Linear Imputation Models

Authors: Alireza Vafaei Sadr, Vida Abedi, Jiang Li, Ramin Zand

Abstract:

Missing data is a common challenge in medical research and can lead to biased or incomplete results. When the data bias leaks into models, it further exacerbates health disparities; biased algorithms can lead to misclassification and reduced resource allocation and monitoring as part of prevention strategies for certain minorities and vulnerable segments of patient populations, which in turn further reduces the data footprint from the same population – thus, a vicious cycle. This study compares the performance of six imputation techniques, grouped into linear and non-linear models, on two different real-world electronic health records (EHR) datasets representing 17,864 patient records. The mean absolute percentage error (MAPE) and root mean squared error (RMSE) are used as performance metrics, and the results show that the linear models outperformed the non-linear models in terms of both metrics. These results suggest that linear models might sometimes be an optimal choice for the imputation of laboratory variables in terms of imputation efficiency and the uncertainty of the predicted values.
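
A minimal sketch of the kind of comparison reported above, using scikit-learn's iterative imputer with a linear estimator (Bayesian ridge) versus a non-linear one (random forest); the data below are synthetic placeholders, not the EHR laboratory values:

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (enables IterativeImputer)
from sklearn.impute import IterativeImputer
from sklearn.linear_model import BayesianRidge
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
X_true = rng.normal(size=(1000, 8))                   # placeholder "laboratory variables"
mask = rng.random(X_true.shape) < 0.2                 # 20% of values missing at random
X_miss = np.where(mask, np.nan, X_true)

def score(estimator):
    imputer = IterativeImputer(estimator=estimator, max_iter=10, random_state=0)
    X_imp = imputer.fit_transform(X_miss)
    err = X_imp[mask] - X_true[mask]
    rmse = np.sqrt(np.mean(err ** 2))
    mape = np.mean(np.abs(err / X_true[mask])) * 100  # MAPE is unstable near zero values
    return rmse, mape

print("linear   ", score(BayesianRidge()))
print("nonlinear", score(RandomForestRegressor(n_estimators=100, random_state=0)))
```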

Keywords: EHR, machine learning, imputation, laboratory variables, algorithmic bias

Procedia PDF Downloads 83
3690 New Approach for Minimizing Wavelength Fragmentation in Wavelength-Routed WDM Networks

Authors: Sami Baraketi, Jean Marie Garcia, Olivier Brun

Abstract:

Wavelength Division Multiplexing (WDM) is the dominant transport technology used in numerous high-capacity backbone networks based on optical infrastructures. Given the importance of the costs (CapEx and OpEx) associated with these networks, resource management is becoming increasingly important, especially how the optical circuits, called “lightpaths”, are routed throughout the network. This requires efficient algorithms that provide routing strategies with the lowest cost. We focus on the lightpath routing and wavelength assignment problem, known as the RWA problem, while optimizing wavelength fragmentation over the network. Wavelength fragmentation poses a serious challenge for network operators since it leads to the misuse of the wavelength spectrum and thus to the rejection of new lightpath requests. In this paper, we first establish a new Integer Linear Program (ILP) for the problem based on a node-link formulation. This formulation follows a multilayer approach in which the original network is decomposed into several network layers, each corresponding to a wavelength. Furthermore, we propose an efficient heuristic for the problem based on a greedy algorithm followed by a post-treatment procedure. The obtained results show that the optimal solution is often reached. We also compare our results with those of other RWA heuristic methods.
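
A minimal sketch of a first-fit greedy RWA heuristic of the kind used as a baseline here, treating each wavelength as a separate copy of the topology; networkx, the toy topology, and the demand list are illustrative, and the paper's ILP and post-treatment step are not reproduced:

```python
import networkx as nx

def greedy_rwa(edges, demands, n_wavelengths):
    """Route each demand on the shortest path of the first wavelength layer with free capacity."""
    # One residual graph per wavelength; an edge is removed from a layer once that wavelength is used on it
    layers = [nx.Graph(edges) for _ in range(n_wavelengths)]
    assignment = {}
    for k, (src, dst) in enumerate(demands):
        for w, g in enumerate(layers):                      # first-fit over wavelengths
            if src in g and dst in g and nx.has_path(g, src, dst):
                path = nx.shortest_path(g, src, dst)
                g.remove_edges_from(zip(path, path[1:]))    # wavelength w is now occupied on these links
                assignment[k] = (w, path)
                break
        else:
            assignment[k] = None                            # blocked request
    return assignment

# Toy 5-node ring with a chord, and a few lightpath requests (illustrative only)
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]
demands = [(0, 2), (1, 4), (0, 3), (2, 4)]
print(greedy_rwa(edges, demands, n_wavelengths=3))
```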

Keywords: WDM, lightpath, RWA, wavelength fragmentation, optimization, linear programming, heuristic

Procedia PDF Downloads 526
3689 Lessons-Learned in a Post-Alliance Framework

Authors: Olubukola Olumuyiwa Tokede, Dominic D. Ahiaga-Dagbui, John Morrison

Abstract:

The project environment in construction has been widely criticised for its inability to learn from experience effectively. As each project is bespoke, learning is ephemeral, as it is often confined within its bounds and seldom assimilated with others that are being delivered in the project environment. To engender learning across construction projects, collaborative contractual arrangements, such as alliancing and partnering, have been embraced to aid the transferability of lessons across projects. These cooperative arrangements, however, tend to be costly, and hence construction organisations could revert to less expensive traditional procurement approaches after successful collaborative project delivery. This research, therefore, seeks to assess the lessons-learned in a post-alliance contractual framework. Using a case-study approach, we examine the experiences of a public sector authority who engaged a project facilitator to foster learning during the delivery of a significant piece of critical infrastructure. It was found that the facilitator enabled optimal learning outcomes in post-alliance contractual frameworks by attenuating the otherwise adversarial relationship between clients and contractors. Further research will seek to assess the effectiveness of different knowledge-brokering agencies in construction projects.

Keywords: facilitation, knowledge-brokering, learning, projects

Procedia PDF Downloads 136
3688 Diagnostic Performance of Mean Platelet Volume in the Diagnosis of Acute Myocardial Infarction: A Meta-Analysis

Authors: Kathrina Aseanne Acapulco-Gomez, Shayne Julieane Morales, Tzar Francis Verame

Abstract:

Mean platelet volume (MPV) is the most accurate measure of the size of platelets and is routinely measured by most automated hematological analyzers. Several studies have shown associations between MPV and cardiovascular risks and outcomes. Although its measurement may provide useful data, MPV remains to be a diagnostic tool that is yet to be included in routine clinical decision making. The aim of this systematic review and meta-analysis is to determine summary estimates of the diagnostic accuracy of mean platelet volume for the diagnosis of myocardial infarction among adult patients with angina and/or its equivalents in terms of sensitivity, specificity, diagnostic odds ratio, and likelihood ratios, and to determine the difference of the mean MPV values between those with MI and those in the non-MI controls. The primary search was done through search in electronic databases PubMed, Cochrane Review CENTRAL, HERDIN (Health Research and Development Information Network), Google Scholar, Philippine Journal of Pathology, and Philippine College of Physicians Philippine Journal of Internal Medicine. The reference list of original reports was also searched. Cross-sectional, cohort, and case-control articles studying the diagnostic performance of mean platelet volume in the diagnosis of acute myocardial infarction in adult patients were included in the study. Studies were included if: (1) CBC was taken upon presentation to the ER or upon admission (within 24 hours of symptom onset); (2) myocardial infarction was diagnosed with serum markers, ECG, or according to accepted guidelines by the Cardiology societies (American Heart Association (AHA), American College of Cardiology (ACC), European Society of Cardiology (ESC); and, (3) if outcomes were measured as significant difference AND/OR sensitivity and specificity. The authors independently screened for inclusion of all the identified potential studies as a result of the search. Eligible studies were appraised using well-defined criteria. Any disagreement between the reviewers was resolved through discussion and consensus. The overall mean MPV value of those with MI (9.702 fl; 95% CI 9.07 – 10.33) was higher than in those of the non-MI control group (8.85 fl; 95% CI 8.23 – 9.46). Interpretation of the calculated t-value of 2.0827 showed that there was a significant difference in the mean MPV values of those with MI and those of the non-MI controls. The summary sensitivity (Se) and specificity (Sp) for MPV were 0.66 (95% CI; 0.59 - 0.73) and 0.60 (95% CI; 0.43 – 0.75), respectively. The pooled diagnostic odds ratio (DOR) was 2.92 (95% CI; 1.90 – 4.50). The positive likelihood ratio of MPV in the diagnosis of myocardial infarction was 1.65 (95% CI; 1.20 – 22.27), and the negative likelihood ratio was 0.56 (95% CI; 0.50 – 0.64). The intended role for MPV in the diagnostic pathway of myocardial infarction would perhaps be best as a triage tool. With a DOR of 2.92, MPV values can discriminate between those who have MI and those without. For a patient with angina presenting with elevated MPV values, it is 1.65 times more likely that he has MI. Thus, it is implied that the decision to treat a patient with angina or its equivalents as a case of MI could be supported by an elevated MPV value.
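
A quick check of how the summary likelihood ratios and diagnostic odds ratio follow from the pooled sensitivity and specificity reported above, using the standard definitions; only the point estimates are reproduced, not the confidence intervals:

```python
se, sp = 0.66, 0.60          # pooled sensitivity and specificity for MPV

lr_pos = se / (1 - sp)       # positive likelihood ratio
lr_neg = (1 - se) / sp       # negative likelihood ratio
dor = lr_pos / lr_neg        # diagnostic odds ratio

print(f"LR+ = {lr_pos:.2f}")  # ~1.65
print(f"LR- = {lr_neg:.2f}")  # ~0.57
print(f"DOR = {dor:.2f}")     # ~2.91, close to the pooled 2.92
```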

Keywords: mean platelet volume, MPV, myocardial infarction, angina, chest pain

Procedia PDF Downloads 85
3687 Improvement of Bone Scintography Image Using Image Texture Analysis

Authors: Yousif Mohamed Y. Abdallah, Eltayeb Wagallah

Abstract:

Image enhancement allows the observer to see details in images that may not be immediately observable in the original image. Image enhancement is the transformation or mapping of one image to another. The enhancement of certain features in images can be accompanied by undesirable effects. To achieve maximum image quality after denoising, a new low-order, locally adaptive Gaussian scale mixture model and a median filter were presented to handle the nonlinearities arising from scattering, together with a new nonlinear approach for contrast enhancement of bones in bone scan images using both gamma correction and negative transform methods. The usual assumption of gamma and Poisson statistics leads to an overestimation of the noise variance in regions of low intensity but an underestimation in regions of high intensity, and therefore to non-optimal results. The contrast enhancement results were obtained and evaluated using MATLAB on nuclear medicine images of the bones. The optimal number of bins, in particular the number of gray levels, is chosen automatically using entropy and the average distance between the histogram of the original gray-level distribution and the contrast enhancement function’s curve.
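
A minimal sketch of the two point-wise transforms mentioned above, applied to a normalized grayscale image; numpy is used here instead of MATLAB, and the gamma value and synthetic image are illustrative assumptions:

```python
import numpy as np

def enhance(img, gamma=0.6):
    """Apply gamma correction and a negative transform to an 8-bit grayscale image."""
    x = img.astype(np.float64) / 255.0       # normalize to [0, 1]
    gamma_corrected = np.power(x, gamma)     # gamma < 1 brightens low-intensity (bone) regions
    negative = 1.0 - x                       # negative transform inverts intensities
    return (gamma_corrected * 255).astype(np.uint8), (negative * 255).astype(np.uint8)

# Example on a synthetic 8-bit image (placeholder for a bone scan)
img = (np.random.default_rng(4).random((64, 64)) * 255).astype(np.uint8)
bright, neg = enhance(img, gamma=0.6)
```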

Keywords: bone scan, nuclear medicine, Matlab, image processing technique

Procedia PDF Downloads 505
3686 DCASH: Dynamic Cache Synchronization Algorithm for Heterogeneous Reverse Y Synchronizing Mobile Database Systems

Authors: Gunasekaran Raja, Kottilingam Kottursamy, Rajakumar Arul, Ramkumar Jayaraman, Krithika Sairam, Lakshmi Ravi

Abstract:

The synchronization server maintains a dynamically changing cache containing the data items that were requested and collected by the mobile node from the server. The order and presence of tuples in the cache change dynamically according to the frequency of updates performed on the data by the server and client. To synchronize, the data that have been modified by the client and the server at a given instant are collected, batched together by the type of modification (insert/update/delete), and sorted according to their update frequencies. This ensures that DCASH (Dynamic Cache Synchronization Algorithm for Heterogeneous Reverse-Y Synchronizing Mobile Database Systems) gives priority to frequently accessed data with high usage. An optimal memory management algorithm is proposed to manage data items according to their frequency; theorems were written to show that current mobile data activity is reverse-Y in nature, and experiments were run on 2G and 3G networks with various mobile devices to demonstrate the reduced response time and energy consumption.

Keywords: mobile databases, synchronization, cache, response time

Procedia PDF Downloads 404
3685 Holistic Urban Development: Incorporating Both Global and Local Optimization

Authors: Christoph Opperer

Abstract:

The rapid urbanization of modern societies and the need for sustainable urban development demand innovative solutions that meet both individual and collective needs while addressing environmental concerns. To address these challenges, this paper presents a study that explores the potential of spatial and energetic/ecological optimization to enhance the performance of urban settlements, focusing on both architectural and urban scales. The study focuses on the application of biological principles and self-organization processes in urban planning and design, aiming to achieve a balance between ecological performance, architectural quality, and individual living conditions. The research adopts a case study approach, focusing on a 10-hectare brownfield site in the south of Vienna. The site is surrounded by a small-scale built environment as an appropriate starting point for the research and design process. However, the selected urban form is not a prerequisite for the proposed design methodology, as the findings can be applied to various urban forms and densities. The methodology used in this research involves dividing the overall building mass and program into individual small housing units. A computational model has been developed to optimize the distribution of these units, considering factors such as solar exposure/radiation, views, privacy, proximity to sources of disturbance (such as noise), and minimal internal circulation areas. The model also ensures that existing vegetation and buildings on the site are preserved and incorporated into the optimization and design process. The model allows for simultaneous optimization at two scales, architectural and urban design, which have traditionally been addressed sequentially. This holistic design approach leads to individual and collective benefits, resulting in urban environments that foster a balance between ecology and architectural quality. The results of the optimization process demonstrate a seemingly random distribution of housing units that, in fact, is a densified hybrid between traditional garden settlements and allotment settlements. This urban typology is selected due to its compatibility with the surrounding urban context, although the presented methodology can be extended to other forms of urban development and density levels. The benefits of this approach are threefold. First, it allows for the determination of ideal housing distribution that optimizes solar radiation for each building density level, essentially extending the concept of sustainable building to the urban scale. Second, the method enhances living quality by considering the orientation and positioning of individual functions within each housing unit, achieving optimal views and privacy. Third, the algorithm's flexibility and robustness facilitate the efficient implementation of urban development with various stakeholders, architects, and construction companies without compromising its performance. The core of the research is the application of global and local optimization strategies to create efficient design solutions. By considering both, the performance of individual units and the collective performance of the urban aggregation, we ensure an optimal balance between private and communal benefits. By promoting a holistic understanding of urban ecology and integrating advanced optimization strategies, our methodology offers a sustainable and efficient solution to the challenges of modern urbanization.

Keywords: sustainable development, self-organization, ecological performance, solar radiation and exposure, daylight, visibility, accessibility, spatial distribution, local and global optimization

Procedia PDF Downloads 63
3684 Assessment of Solar Hydrogen Production in Energetic Hybrid PV-PEMFC System

Authors: H. Rezzouk, M. Hatti, H. Rahmani, S. Atoui

Abstract:

This paper discusses the design and analysis of a hybrid PV-fuel cell energy system intended to power a DC load. The system is composed of a photovoltaic array, a fuel cell, an electrolyzer, and a hydrogen tank. HOMER software is used in this study to calculate the optimum capacities of the power system components so that their combination allows efficient use of the solar resource to cover the hourly load needs. The optimal system sizing establishes the right balance between the daily electrical energy produced by the power system and the daily electrical energy consumed by the DC load, using a 28 kW PV array, a 7.5 kW fuel cell, a 40 kW electrolyzer, and a 270 kg hydrogen tank. The powers involved at the DC bus of the hybrid PV-fuel cell system have been computed and analyzed for each hour over one year: the output powers of the PV array and the fuel cell, the input power of the electrolyzer system, and the DC primary load. Equally, the annual variation of the stored hydrogen produced by the electrolyzer has been assessed. The PV array contributes 82% of the power system's production, whereas the fuel cell produces 18%; 38% of the total energy consumption belongs to the DC primary load, while the rest goes to the electrolyzer.

Keywords: electrolyzer, hydrogen, hydrogen fueled cell, photovoltaic

Procedia PDF Downloads 490
3683 A 3D Model of the Sustainable Management of the Natural Environment in National Parks

Authors: Paolo Russu

Abstract:

This paper investigates the economic and ecological dynamics that emerge in Protected Areas (PAs) as a result of interactions between visitors to the area and the animals that live there. We suppose that the PAs contain two species whose interactions are determined by the Lotka-Volterra equations system. Visitors' decisions to visit PAs are influenced by the entrance cost required to enter the park as well as the chance of witnessing the species that live there. Visitors have contradictory effects on the species and thus on the sustainability of the protected areas: on the one hand, an increase in the number of tourists damages the natural habitat of the areas and thus the species living there; on the other hand, it increases the total amount of entrance fees that the managing body of the PAs can use to perform defensive expenditures that protect the species from extinction. For a given set of parameter values, the existence of saddle-node bifurcation, Hopf bifurcation, homoclinic orbits, and a Bogdanov–Takens bifurcation of codimension two has been investigated. The system displays periodic doubling and chaotic solutions, as demonstrated by numerical examples. Pontryagin's Maximum Principle was utilized to develop an optimal admission charge policy that maximized both social gain and ecosystem conservation.
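
A minimal sketch of the kind of two-species predator-prey core assumed above, integrated with SciPy; the coefficients and the extra mortality term standing in for visitor pressure are illustrative placeholders, not the paper's model:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Classic Lotka-Volterra core with an illustrative visitor-pressure term on both species
a, b, c, d = 1.0, 0.4, 0.3, 1.2     # growth/interaction coefficients (placeholders)
v = 0.05                            # visitor-induced mortality rate (placeholder)

def rhs(t, z):
    x, y = z                        # prey and predator densities
    dx = a * x - b * x * y - v * x  # prey growth minus predation minus visitor pressure
    dy = c * x * y - d * y - v * y  # predator growth from predation minus mortality
    return [dx, dy]

sol = solve_ivp(rhs, (0, 100), [2.0, 1.0], dense_output=True, max_step=0.1)
x_final, y_final = sol.y[:, -1]     # densities at the end of the horizon
```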

Keywords: environmental preferences, singularities point, dynamical system, chaos

Procedia PDF Downloads 95
3682 CFD Study of Free Surface Flows Resulting from a Dam-Breaking

Authors: Sonia Ben Hamza, Sabra Habli, Nejla Mahjoub Saïd, Hervé Bournot, Georges Le Palec

Abstract:

Free-surface flows caused by dam breaks in channels or rivers are of considerable interest to engineering practice; however, few studies have been reported. In this paper, unsteady free-surface flows resulting from a dam break in a rectangular channel are studied numerically. Numerical computations were carried out using ANSYS Fluent, which is based on the finite volume approach. The air/water interface was modeled with the volume of fluid (VOF) method. Verification on a typical dam-break problem was performed by comparing the present results with others, and very good agreement was obtained. The present approach is then used to predict the characteristics of the free-surface flow due to the dam break in the channel. The characteristics of the complex unsteady free-surface flow in these examples are clearly explained. The numerical results show that the flow became more disturbed after impacting the vertical wall, after which a recirculation zone as well as turbulence phenomena were created; at this instant, an air cavity was entrained into the flow. The results agree well with the experimental data found in the literature.

Keywords: CFD, dam-break, free surface, turbulent flows, VOF

Procedia PDF Downloads 307
3681 Non-Invasive Imaging of Tissue Using Near Infrared Radiations

Authors: Ashwani Kumar Aggarwal

Abstract:

NIR light is non-ionizing and can pass easily through living tissues such as the breast without any harmful effects. Therefore, the use of NIR light for imaging biological tissue and quantifying its optical properties is a good alternative to invasive methods. Optical tomography involves two steps: the forward problem and the reconstruction problem. The forward problem consists of finding the measurements of the light transmitted through the tissue from source to detector, given the spatial distribution of the absorption and scattering properties. The second step is the reconstruction problem. In X-ray tomography, there are standard reconstruction methods, such as filtered back projection and algebraic reconstruction techniques, but these methods cannot be applied as such in optical tomography due to the highly scattering nature of biological tissue. A hybrid reconstruction algorithm has been implemented in this work, which takes into account the highly scattered paths taken by photons while back-projecting the forward data obtained from a Monte Carlo simulation. The reconstructed image suffers from blurring due to the point spread function. This blurred reconstructed image has been enhanced using a digital filter that is optimal in the mean-square sense.
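
A minimal sketch of the final deblurring step described above, using a frequency-domain Wiener filter (optimal in the mean-square sense); the Gaussian point spread function, noise level, and placeholder image are illustrative assumptions:

```python
import numpy as np

def wiener_deblur(blurred, psf, noise_to_signal=1e-2):
    """Wiener deconvolution of a 2D image given its point spread function (PSF)."""
    psf_padded = np.zeros_like(blurred, dtype=np.float64)
    psf_padded[:psf.shape[0], :psf.shape[1]] = psf
    H = np.fft.fft2(psf_padded)
    G = np.fft.fft2(blurred)
    # Wiener filter H* / (|H|^2 + K): minimizes mean-square error for a flat noise spectrum
    W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
    return np.real(np.fft.ifft2(W * G))

# Illustrative Gaussian PSF and a synthetic blurred reconstruction
x = np.arange(-3, 4)
g = np.exp(-(x ** 2) / 2.0)
psf = np.outer(g, g)
psf /= psf.sum()
blurred = np.random.default_rng(5).random((64, 64))   # placeholder reconstructed image
sharp = wiener_deblur(blurred, psf)
```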

Keywords: least-squares optimization, filtering, tomography, laser interaction, light scattering

Procedia PDF Downloads 314
3680 Optimal Design of Brush Roll for Semiconductor Wafer Using CFD Analysis

Authors: Byeong-Sam Kim, Kyoungwoo Park

Abstract:

This research quantitatively analyzes the structure of flat panel displays (FPDs), such as LCDs, through CFD analysis and modeling changes, in order to minimize the defect rate and the loss of production caused by damage to large-scale plates in the wafer heating chamber of the semiconductor manufacturing process. In atmospheric-pressure or chemical vapor deposition equipment, glass panels and wafers are transported and transferred by robot hands, which must handle increasingly long and wide wafers. A contact handling system poses several problems, including an increased potential for fracture or warping, so a non-contact handling system is required. Panel and wafer warping also makes conventional contact-based analysis difficult. We propose a new non-contact transportation system combining air suction and blowout. Numerical analysis and experiments are therefore performed and compared with the results achieved with non-contact solutions. The proposed non-contact wafer/panel handler shows its strength in maintaining the high cleanliness levels required for semiconductor production processes.

Keywords: flat panel display, non contact transportation, heat treatment process, CFD analysis

Procedia PDF Downloads 415
3679 Use the Null Space to Create Starting Point for Stochastic Programming

Authors: Ghussoun Al-Jeiroudi

Abstract:

Stochastic programming is a powerful technique used to solve real-life problems whose data are subject to significant uncertainty; such uncertainty is well studied and modeled by stochastic programming. Problems grow bigger every day, and the need for tools that can deal with large-scale problems increases. The interior point method is a perfect tool for solving such problems and is widely employed to solve the programs that arise from stochastic programming. It is an iterative technique, so it requires a starting point, and a well-designed starting point plays an important role in improving the convergence speed. In this paper, we propose a starting point for the interior point method for multistage stochastic programming. Usually, the optimal solution of stage k+1 is used as the starting point for stage k. This point has the advantage of being close to the solution of the current program; however, it has the disadvantage of not lying in the feasible region of the current program. We therefore suggest taking this point and modifying it by adding to it a vector in the null space of the matrix of the unchanged constraints, because the solution will change only in the null space of this matrix.
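
A minimal sketch of the proposed modification: take the previous stage's solution and shift it by a vector from the null space of the unchanged constraint matrix, so the shifted point still satisfies those constraints; SciPy's null_space is used, and the matrix, solution, and target point are illustrative placeholders:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0, 1.0]])        # unchanged constraints A x = b (illustrative)
x_prev = np.array([1.0, 2.0, 1.0, 0.5])     # optimal solution of the previous stage (illustrative)
x_target = np.array([1.5, 1.0, 2.0, 1.0])   # point we would like to move toward (illustrative)

# Basis of the null space of A: any step z = N @ w keeps A (x_prev + z) = A x_prev
N = null_space(A)

# Choose the null-space step that brings x_prev closest (in least squares) to x_target
w, *_ = np.linalg.lstsq(N, x_target - x_prev, rcond=None)
x_start = x_prev + N @ w

assert np.allclose(A @ x_start, A @ x_prev)  # the unchanged constraints are still satisfied
```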

Keywords: interior point methods, stochastic programming, null space, starting points

Procedia PDF Downloads 416
3676 Training Program for Kindergarten Teachers on Learning through Project Approach

Authors: Dian Hartiningsih, Miranda Diponegoro, Evita Eddie Singgih

Abstract:

In facing the 21st century, children need to be prepared to reach their optimum developmental level, which encompasses all aspects of growth, and to achieve learning goals that include not only knowledge and skills but also disposition and feeling. Teachers, at the forefront of education, need to be equipped with an understanding of, and skill in, a learning method that can prepare children to face this 21st-century challenge. The project approach is an approach that utilizes active learning, which is beneficial for children. The subjects of this research were kindergarten teachers at Dwi Matra Kindergarten and Kirana Preschool. This is a quantitative study using a before-and-after design. The results suggest that through a preliminary training program on learning with the project approach, the kindergarten teachers' ability to explain the project approach, including its concept, benefits, and stages, increased significantly, and their ability to design learning with the project approach also improved significantly. The learning designs that the teachers produced showed remarkable results for the first stage of the project approach; however, the second- and third-stage designs were not as strong. The challenges faced in the research are elaborated further in the discussion.

Keywords: project approach, teacher training, learning method, kindergarten

Procedia PDF Downloads 329
3677 Hybrid Gravity Gradient Inversion-Ant Colony Optimization Algorithm for Motion Planning of Mobile Robots

Authors: Meng Wu

Abstract:

Motion planning is a common task that robots are required to fulfill. A strategy combining ant colony optimization (ACO) and a gravity gradient inversion algorithm is proposed for the motion planning of mobile robots. In this paper, in order to realize an optimal motion planning strategy, the cost function in ACO is designed based on the gravity gradient inversion algorithm. Obstacles around the mobile robot cause gravity gradient anomalies, and a gradiometer installed on the mobile robot detects these anomalies. After the anomalies are obtained, the gravity gradient inversion algorithm is employed to calculate the relative distance and orientation between the mobile robot and the obstacles. The relative distance and orientation deduced from the gravity gradient inversion algorithm are then used in the cost function of the ACO algorithm to realize motion planning. The proposed strategy is validated by simulation and experimental results.

Keywords: motion planning, gravity gradient inversion algorithm, ant colony optimization

Procedia PDF Downloads 137
3676 Assessment for the Backfill Using the Run of the Mine Tailings and Portland Cement

Authors: Javad Someehneshin, Weizhou Quan, Abdelsalam Abugharara, Stephen Butt

Abstract:

Narrow vein mining (NVM) involves exploiting very thin but valuable ore bodies that are uneconomical to extract by conventional mining methods. NVM applies the technique of Sustainable Mining by Drilling (SMD). The SMD method is used to mine stranded, steeply dipping ore veins that are too small or isolated to mine economically using conventional methods, since dilution is minimized. This novel mining technique uses drilling rigs to surgically extract the ore through directional drilling. This paper focuses on utilizing run-of-mine tailings and Portland cement as a backfill material to support the hanging wall and provide safe mine operation. Cemented paste backfill (CPB) is designed by mixing waste tailings, water, and cement in precise percentages for optimal outcomes. It is a non-homogeneous material that contains 70-85% solids. Usually, a hydraulic binder is added to the mixture to increase the strength of the CPB; the binder fraction mostly accounts for 2–10% of the total weight. In the mining industry, CPB has been gradually improved and expanded because it provides safety and support for mines. Furthermore, CPB helps manage waste tailings economically and plays a significant role in environmental protection.

Keywords: backfilling, cement backfill, tailings, Portland cement

Procedia PDF Downloads 138