Search results for: adaptive choice-based conjoint analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28537

28087 Adaptive Responses of Carum copticum to in vitro Salt Stress

Authors: R. Razavizadeh, F. Adabavazeh, M. Rezaee Chermahini

Abstract:

Salinity is one of the most widespread agricultural problems in arid and semi-arid areas and limits plant growth and crop productivity. In this study, the effects of salt stress on the protein, reducing sugar and proline contents and on the antioxidant enzyme activities of Carum copticum L. were studied under in vitro conditions. Seeds of C. copticum were cultured in Murashige and Skoog (MS) medium containing 0, 25, 50, 100 and 150 mM NaCl, and calli were cultured in MS medium containing 1 μM 2,4-dichlorophenoxyacetic acid, 4 μM benzyl amino purine and different levels of NaCl (0, 25, 50, 100 and 150 mM). After NaCl treatment for 28 days, the proline and reducing sugar contents of shoots, roots and calli increased significantly with the severity of the salt stress. The highest amounts of proline and carbohydrate were observed at 150 and 100 mM NaCl, respectively. Reducing sugar accumulation was higher in shoots than in roots, whereas proline contents did not differ significantly between roots and shoots under salt stress. The results showed a significant reduction of protein contents in seedlings and calli; proteins extracted from the shoots, roots and calli of C. copticum treated with 150 mM NaCl showed the lowest contents. Positive relationships were observed between antioxidant enzyme activities and the increase in stress level. Catalase, ascorbate peroxidase and superoxide dismutase activities increased significantly under the salt concentrations in comparison to the control. These results suggest that the accumulation of proline and sugars and the activation of antioxidant enzymes play adaptive roles in the acclimation of C. copticum seedlings and calli to saline conditions.

Keywords: antioxidant enzymes, Carum copticum, organic solutes, salt stress

Procedia PDF Downloads 279
28086 NK Cells Expansion Model from PBMC Led to a Decrease of CD4+ and an Increase of CD8+ and CD25+CD127- T-Reg Lymphocytes in Patients with Ovarian Neoplasia

Authors: Rodrigo Fernandes da Silva, Daniela Maira Cardozo, Paulo Cesar Martins Alves, Sophie Françoise Derchain, Fernando Guimarães

Abstract:

T-reg lymphocytes are important for the control of peripheral tolerance. They control the adaptive immune system and prevent autoimmunity through their suppressive action on CD4+ and CD8+ lymphocytes. This suppressive action also extends to B lymphocytes, dendritic cells and monocytes/macrophages, and recent studies have shown that T-regs are also able to inhibit NK cells; they therefore exert control over the immune response from the innate to the adaptive arm. Most tumors express self-ligands, so it is believed that T-reg cells induce tolerance of the immune system, hindering the development of successful immunotherapies. T-reg cells have been linked to the suppression mechanisms of the immune response against tumors, including ovarian cancer. The goal of this study was to characterize the sub-populations of the expanded CD3+ lymphocytes reported in previous studies, using the long-term culture model designed by Carlens et al. (2001) to generate effector cell suspensions enriched with cytotoxic CD3-CD56+ NK cells from the PBMC of ovarian neoplasia patients. Methods and Results: Blood was collected from 12 patients with ovarian neoplasia after signed consent: 7 benign (Bgn) and 5 malignant (Mlg). Mononuclear cells were separated by Ficoll-Paque gradient. Long-term culture was conducted over a 21-day culturing process with SCGM CellGro medium supplemented with anti-CD3 (10 ng/ml, first 5 days), IL-2 (1000 UI/ml) and FBS (10%). After 21 days of expansion, there was an increase in the population of CD3+ lymphocytes in both the benign and malignant groups. Within the CD3+ population, there was a significant decrease in the population of CD4+ lymphocytes in the benign (median Bgn D-0=73.68%, D-21=21.05%) (p<0.05) and malignant (median Mlg D-0=64.00%, D-21=11.97%) (p<0.01) groups. Conversely, after 21 days of expansion, there was an increase in the population of CD8+ lymphocytes within the CD3+ population in the benign (median Bgn D-0=16.80%, D-21=38.56%) and malignant (median Mlg D-0=27.12%, D-21=72.58%) groups; however, this increase was only significant in the malignant group (p<0.01). Within the CD3+CD4+ population, there was a significant increase (p<0.05) in the population of T-reg lymphocytes in the benign (median Bgn D-0=9.84%, D-21=39.47%) and malignant (median Mlg D-0=3.56%, D-21=16.18%) groups. Statistical analysis between groups was performed with the Kruskal-Wallis test and within groups with the Mann-Whitney test. Conclusion: The CD4+ and CD8+ sub-populations of CD3+ lymphocytes shift with the culturing process. This may reflect the immune system's drive to mount a cytotoxic response. At the same time, T-reg lymphocytes increased within the CD4+ population, suggesting a modulation of the immune response towards cells of the immune system. The expansion of the T-reg population can hinder an immune response against cancer. Therefore, an immunotherapy using this expansion procedure should aim to halt the expansion of T-regs or their immunosuppression capability.

Keywords: regulatory T cells, CD8+ T cells, CD4+ T cells, NK cell expansion

Procedia PDF Downloads 450
28085 An Energy-Efficient Model of Integrating Telehealth IoT Devices with Fog and Cloud Computing-Based Platform

Authors: Yunyong Guo, Sudhakar Ganti, Bryan Guo

Abstract:

The rapid growth of telehealth Internet of Things (IoT) devices has raised concerns about energy consumption and efficient data processing. This paper introduces an energy-efficient model that integrates telehealth IoT devices with a fog and cloud computing-based platform, offering a sustainable and robust solution to overcome these challenges. Our model employs fog computing as a localized data processing layer while leveraging cloud computing for resource-intensive tasks, significantly reducing energy consumption. We incorporate adaptive energy-saving strategies. Simulation analysis validates our approach's effectiveness in enhancing energy efficiency for telehealth IoT systems integrated with localized fog nodes and both private and public cloud infrastructures. Future research will focus on further optimization of the energy-saving model, exploring additional functional enhancements, and assessing its broader applicability in other healthcare and industry sectors.

Keywords: energy-efficient, fog computing, IoT, telehealth

Procedia PDF Downloads 86
28084 Recent Advancement in Fetal Electrocardiogram Extraction

Authors: Savita, Anurag Sharma, Harsukhpreet Singh

Abstract:

Fetal electrocardiography (fECG) is a widely used technique to assess fetal well-being, to identify changes that might be associated with problems during pregnancy, and to evaluate the health and condition of the fetus. Various techniques have been employed to extract the fECG from the abdominal signal. This paper describes a straightforward approach to fECG estimation known as the Adaptive Comb Filter (ACF). The ACF adjusts itself to temporal variations in the fundamental frequency, which makes it suitable for estimating the quasi-periodic components of the ECG signal.
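The comb-filtering idea can be illustrated with a minimal Python sketch (an illustration only, not the authors' implementation): a feedforward comb notch is placed at an estimated fundamental frequency and its harmonics, and the notch spacing is re-estimated block by block as that frequency drifts. The sampling rate, block scheme and the f0_track estimate are all assumptions here.

```python
import numpy as np

def comb_cancel(x, fs, f0):
    """Feedforward comb filter y[n] = x[n] - x[n - N], with N = round(fs / f0).

    Places notches at f0 and its harmonics, attenuating a quasi-periodic
    component (e.g. the maternal ECG) so that the residual component stands out.
    """
    N = int(round(fs / f0))
    y = np.asarray(x, dtype=float).copy()
    y[N:] = y[N:] - y[:-N]
    return y

def adaptive_comb(x, fs, f0_track):
    """Re-estimate the notch spacing block by block from a running estimate of
    the fundamental frequency (f0_track, one value in Hz per block)."""
    x = np.asarray(x, dtype=float)
    out = x.copy()                      # samples beyond the last full block pass through
    block = len(x) // len(f0_track)
    for i, f0 in enumerate(f0_track):
        seg = x[i * block:(i + 1) * block]
        out[i * block:(i + 1) * block] = comb_cancel(seg, fs, f0)
    return out
```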

Keywords: aECG, ACF, fECG, mECG

Procedia PDF Downloads 406
28083 On the Solution of Fractional-Order Dynamical Systems Endowed with Block Hybrid Methods

Authors: Kizito Ugochukwu Nwajeri

Abstract:

This paper presents a distinct approach to solving fractional dynamical systems using hybrid block methods (HBMs). Fractional calculus extends the concept of derivatives and integrals to non-integer orders and finds increasing application in fields such as physics, engineering, and finance. However, traditional numerical techniques often struggle to accurately capture the complex behaviors exhibited by these systems. To address this challenge, we develop HBMs that integrate single-step and multi-step methods, enabling the simultaneous computation of multiple solution points while maintaining high accuracy. Our approach employs polynomial interpolation and collocation techniques to derive a system of equations that effectively models the dynamics of fractional systems. We also directly incorporate boundary and initial conditions into the formulation, enhancing the stability and convergence properties of the numerical solution. An adaptive step-size mechanism is introduced to optimize performance based on the local behavior of the solution. Extensive numerical simulations are conducted to evaluate the proposed methods, demonstrating significant improvements in accuracy and efficiency compared to traditional numerical approaches. The results indicate that our hybrid block methods are robust and versatile, making them suitable for a wide range of applications involving fractional dynamical systems. This work contributes to the existing literature by providing an effective numerical framework for analyzing complex behaviors in fractional systems, thereby opening new avenues for research and practical implementation across various disciplines.
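For readers unfamiliar with the notation, a fractional-order initial-value problem of the kind addressed here can be written with the Caputo derivative (a common choice of operator; the abstract does not state which definition the authors use):

```latex
% Caputo fractional derivative of order 0 < \alpha < 1 and the corresponding IVP
{}^{C}D_{t}^{\alpha} y(t) = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{y'(\tau)}{(t-\tau)^{\alpha}}\, d\tau,
\qquad
{}^{C}D_{t}^{\alpha} y(t) = f\bigl(t, y(t)\bigr), \quad y(0) = y_0 .
```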

Keywords: fractional calculus, numerical simulation, stability and convergence, adaptive step-size mechanism, collocation methods

Procedia PDF Downloads 41
28082 Enhancing Power System Resilience: An Adaptive Under-Frequency Load Shedding Scheme Incorporating PV Generation and Fast Charging Stations

Authors: Sami M. Alshareef

Abstract:

In the rapidly evolving energy landscape, the integration of renewable energy sources and the electrification of transportation are essential steps toward achieving sustainability goals. However, these advancements introduce new challenges, particularly in maintaining frequency stability due to variable photovoltaic (PV) generation and the growing demand for fast charging stations. The variability of PV generation due to weather conditions can disrupt the balance between generation and load, resulting in frequency deviations. To ensure the stability of power systems, it is imperative to develop effective under-frequency load-shedding schemes. This research proposal presents an adaptive under-frequency load shedding scheme based on the power swing equation, designed explicitly for the IEEE 9-Bus Test System, which includes PV generation and fast charging stations. The research aims to address these challenges by developing an advanced scheme that dynamically disconnects fast charging stations based on power imbalances. The scheme prioritizes the disconnection of stations near affected areas to expedite system frequency stabilization. To achieve these goals, the research project will leverage the power swing equation, a widely recognized model for analyzing system dynamics during under-frequency events. By utilizing this equation, the proposed scheme will adaptively adjust the load-shedding process in real time to maintain frequency stability and prevent power blackouts. The research findings will support the transition towards sustainable energy systems by ensuring a reliable and uninterrupted electricity supply while enhancing the resilience and stability of power systems during under-frequency events.
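The mechanism can be sketched in a few lines of Python using a single-machine swing-equation model: frequency falls in proportion to the power imbalance, and fast charging stations are disconnected one by one once a trigger frequency is crossed. All numerical values, the single-machine simplification and the "largest station first" ordering are illustrative assumptions, not the proposed scheme's actual prioritisation by location.

```python
import numpy as np

H, F_NOM = 4.0, 60.0          # inertia constant (s) and nominal frequency (Hz) -- assumed values
F_TRIGGER = 59.3              # frequency threshold that arms shedding (Hz) -- assumed value

def freq_derivative(imbalance_pu):
    """Swing equation (single-machine equivalent, per unit):
       2H/f_n * df/dt = P_m - P_e  =>  df/dt = -imbalance * f_n / (2H)."""
    return -imbalance_pu * F_NOM / (2.0 * H)

def simulate(imbalance_pu, fcs_loads_pu, dt=0.05, t_end=10.0):
    """Integrate the frequency; when f falls below F_TRIGGER, disconnect fast
    charging stations one at a time (largest first, a stand-in for the paper's
    'nearest to the affected area' rule) until the imbalance is cleared."""
    f = F_NOM
    stations = sorted(fcs_loads_pu, reverse=True)
    history = []
    for _ in np.arange(0.0, t_end, dt):
        if f < F_TRIGGER and stations and imbalance_pu > 0:
            imbalance_pu -= stations.pop(0)          # shed one station
        f += freq_derivative(imbalance_pu) * dt
        history.append(f)
    return history

# Example: 5% generation deficit, three stations drawing 2%, 2% and 1.5% of system load
print(f"final frequency: {simulate(0.05, [0.02, 0.02, 0.015])[-1]:.2f} Hz")
```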

Keywords: load shedding, fast charging stations, PV generation, power system resilience

Procedia PDF Downloads 79
28081 The Continuous Facility Location Problem and Transportation Mode Selection in the Supply Chain under Sustainability

Authors: Abdulaziz Alageel, Martino Luis, Shuya Zhong

Abstract:

The main focus of this research is the decision-making challenges faced in a supply chain network when locating facilities while considering carbon emissions. The study aims (i) to locate facilities (i.e., distribution centres) in a continuous space, considering capacity limitations and opening costs, and (ii) to reduce the cost of carbon emissions by selecting the mode of transportation. The problem is formulated as a mixed-integer linear program. The study hybridises a greedy randomised adaptive search procedure (GRASP) and variable neighborhood search (VNS) to deal with the problem. Well-known datasets from the literature (Brimberg et al. 2001) are used and adapted in order to assess the performance of the proposed method. The proposed hybrid method produces encouraging results based on the computational analysis. The study also highlights some research avenues for future work.
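As a rough illustration of the GRASP idea on a continuous location problem, the sketch below solves a single-facility, uncapacitated Weber problem: each start is built greedily-but-randomly from the heaviest demand points and then improved with Weiszfeld iterations. This is a deliberate simplification; the paper's capacitated multi-facility model with transportation-mode selection and the VNS component are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def weiszfeld(points, weights, x0, iters=100, eps=1e-9):
    """Local improvement for the single-facility Weber problem:
    iteratively re-weight customer coordinates by inverse distance."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        d = np.linalg.norm(points - x, axis=1) + eps
        w = weights / d
        x = (points * w[:, None]).sum(axis=0) / w.sum()
    return x

def total_cost(points, weights, x):
    return float((weights * np.linalg.norm(points - x, axis=1)).sum())

def grasp_weber(points, weights, starts=20, rcl_size=5):
    """GRASP-flavoured search: each start is drawn from a restricted candidate
    list of the heaviest customers (greedy part), perturbed randomly, then
    improved with Weiszfeld iterations; the best solution is kept."""
    best_x, best_c = None, np.inf
    rcl = np.argsort(-weights)[:rcl_size]
    for _ in range(starts):
        x0 = points[rng.choice(rcl)] + rng.normal(scale=0.1, size=2)
        x = weiszfeld(points, weights, x0)
        c = total_cost(points, weights, x)
        if c < best_c:
            best_x, best_c = x, c
    return best_x, best_c

pts = rng.uniform(0, 10, size=(30, 2))      # customer coordinates (synthetic)
wts = rng.uniform(1, 5, size=30)            # customer demands (synthetic)
print(grasp_weber(pts, wts))
```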

Keywords: supply chain, facility location, Weber problem, sustainability

Procedia PDF Downloads 98
28080 Understanding and Explaining Urban Resilience and Vulnerability: A Framework for Analyzing the Complex Adaptive Nature of Cities

Authors: Richard Wolfel, Amy Richmond

Abstract:

Urban resilience and vulnerability are critical concepts in the modern city due to the increased sociocultural, political, economic, demographic, and environmental stressors that influence current urban dynamics. Urban scholars face several challenges in explaining urban resilience and vulnerability. First, cities are dominated by people, whose behavior is difficult to model from both an explanatory and a predictive perspective. Second, urban regions are highly recursive in nature: they not only influence human action, but their structures are constantly changing as a result of human actions. Consequently, explanatory frameworks must continuously evolve as humans influence and are influenced by the urban environment in which they operate. Finally, modern cities have populations, sociocultural characteristics, economic flows, and environmental impacts orders of magnitude beyond those of the cities of the past. As a result, frameworks that seek to explain the various functions of a city that influence urban resilience and vulnerability must address the complex adaptive nature of cities and the interaction of many distinct factors that influence resilience and vulnerability in the city. This project develops a taxonomy and framework for organizing and explaining urban vulnerability. The framework is built on a well-established political development model that includes six critical classes of urban dynamics: political presence, political legitimacy, political participation, identity, production, and allocation. In addition, the framework explores how environmental security and technology influence and are influenced by the six elements of political development. The framework aims to identify key tipping points in society that act as influential agents of urban vulnerability in a region. This will help analysts and scholars predict and explain the influence of both physical and human geographical stressors in dense urban areas.

Keywords: urban resilience, vulnerability, sociocultural stressors, political stressors

Procedia PDF Downloads 115
28079 A Temporary Shelter Proposal for Displaced People

Authors: İrem Yetkin, Feray Maden, Seda Tosun, Yenal Akgün, Özgür Kilit, Koray Korkmaz, Gökhan Kiper, Mustafa Gündüzalp

Abstract:

Forced migration, whether caused by conflicts or other factors, frequently places individuals in vulnerable situations, necessitating immediate access to shelter. To promptly address the immediate needs of affected individuals, temporary shelters are often established. These shelters are characterized by their adaptable and functional nature, encompassing lightweight and sustainable structural systems, rapid assembly capabilities, modularity, and transportability. Shelter design is contingent upon demand, resulting in distinct phases for different structural forms. A multi-phased shelter approach covers emergency response, temporary shelter, and permanent reconstruction. Emergency shelters play a critical role in providing immediate life-saving aid, while temporary and transitional shelters, also called “t-shelters,” offer longer-term living environments during the recovery and rebuilding phases. Among these, temporary shelters are more extensively covered in the literature due to their diverse inhabiting functions. The roles of emergency shelters and temporary shelters are inherently separate, addressing distinct aspects of the sheltering process. Given their prolonged usage, temporary shelters are built for greater durability than emergency shelters. Nonetheless, inadequacies in temporary shelters can lead to challenges in ensuring habitability. Issues such as non-expandable structures unsuitable for accommodating large families, the use of short-term shelters that worsen conditions, non-waterproof materials providing insufficient protection against bad weather, and complex installation systems contribute to these problems. Given the aforementioned problems, there is a need to develop adaptive shelters that feature lightweight components for ease of transport, can be assembled rapidly, and use durable materials able to withstand adverse weather conditions. In this study, the state of the art on temporary shelters is first presented. Then, an adaptive temporary shelter composed of foldable plates is proposed, which can be easily assembled and transported. The proposed shelter is discussed in terms of its movement capacity, transportability, and flexibility. This study makes a valuable contribution to the literature since it not only offers a systematic analysis of temporary shelters utilizing kinetic systems but also presents a practical solution that meets the necessary design requirements.

Keywords: deployable structures, foldable plates, forced migration, temporary shelters

Procedia PDF Downloads 70
28078 Automatic Detection and Classification of Diabetic Retinopathy Using Retinal Fundus Images

Authors: A. Biran, P. Sobhe Bidari, A. Almazroe, V. Lakshminarayanan, K. Raahemifar

Abstract:

Diabetic Retinopathy (DR) is a severe retinal disease caused by diabetes mellitus. It leads to blindness when it progresses to the proliferative level. Early indications of DR are the appearance of microaneurysms, hemorrhages and hard exudates. In this paper, an automatic algorithm for the detection of DR is proposed. The algorithm is based on a combination of several image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering and thresholding. A Support Vector Machine (SVM) classifier is then used to classify retinal images as normal or abnormal, the latter covering non-proliferative and proliferative DR. The proposed method has been tested on images selected from the Structured Analysis of the Retina (STARE) database using MATLAB code. The method is able to detect DR reliably; its sensitivity, specificity and accuracy are 90%, 87.5%, and 91.4%, respectively.
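A skeleton of this kind of pipeline, assuming OpenCV and scikit-learn and a deliberately crude two-number feature set, is sketched below; the thresholds, the features and the omission of the CHT-based optic disc and vessel handling are simplifications, not the paper's method.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def preprocess(path):
    """Contrast-enhance the green channel, which shows retinal lesions best."""
    img = cv2.imread(path)
    green = img[:, :, 1]
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(green)

def lesion_features(enhanced):
    """Crude features: fraction of very bright pixels (exudate-like) and very
    dark pixels (haemorrhage/microaneurysm-like) after enhancement."""
    bright = cv2.threshold(enhanced, 220, 255, cv2.THRESH_BINARY)[1]
    dark = cv2.threshold(enhanced, 40, 255, cv2.THRESH_BINARY_INV)[1]
    n = enhanced.size
    return [bright.sum() / (255.0 * n), dark.sum() / (255.0 * n)]

def train_classifier(image_paths, labels):
    """labels: 0 = normal, 1 = non-proliferative DR, 2 = proliferative DR."""
    X = [lesion_features(preprocess(p)) for p in image_paths]
    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(X, labels)
    return clf
```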

Keywords: diabetic retinopathy, fundus images, STARE, Gabor filter, support vector machine

Procedia PDF Downloads 293
28077 Adaptive Certificate-Based Mutual Authentication Protocol for Mobile Grid Infrastructure

Authors: H. Parveen Begam, M. A. Maluk Mohamed

Abstract:

Mobile grid computing is an environment that allows sharing and coordinated use of diverse resources in dynamic, heterogeneous and distributed environments using different types of portable electronic devices. In a grid environment, security issues such as authentication, authorization, message protection and delegation are handled by the Grid Security Infrastructure (GSI). Providing better security between mobile devices and the grid infrastructure is a major issue because of the open nature of wireless networks and the heterogeneous, distributed environment. In a mobile grid environment, the individual computing devices may be resource-limited in isolation; as an aggregated sum, however, they have the potential to play a vital role within the mobile grid environment. An adaptive methodology or solution is needed to address issues such as authentication of a base station, security of the information flowing between a mobile user and a base station, prevention of attacks within a base station, hand-over of authentication information, the communication cost of establishing a session key between a mobile user and a base station, and the computational complexity of achieving authenticity and security. Sharing the resources of the devices can be achieved only through trusted relationships between the mobile hosts (MHs). Before accessing a grid service, the mobile devices should be proven authentic. This paper proposes a dynamic certificate-based mutual authentication protocol between two mobile hosts in a mobile grid environment. The certificate generation process is performed by a Certificate Authority (CA) for all authenticated MHs. Security (through the validity period of the certificate) and dynamicity (through the transmission time) are achieved via secure service certificates. The authentication protocol is built on communication services to provide cryptographically secured mechanisms for verifying the identity of users and resources.

Keywords: mobile grid computing, certificate authority (CA), SSL/TLS protocol, secured service certificates

Procedia PDF Downloads 305
28076 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling

Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed

Abstract:

The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm on its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under different ANN structural configurations: (1) a single-hidden-layer and (2) a double-hidden-layer feedforward back-propagation network. The results revealed that, in general, the GDM algorithm, with its adaptive learning capability, used a relatively shorter time in both the training and validation phases compared with the LM and Br algorithms, although learning may not be consummated; this held in all instances, including the prediction of extreme flow conditions for 1-day and 5-day lead times. In specific statistical terms, average model performance efficiency measured by the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96%, for the training and validation phases respectively. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, the adoption of ANNs for real-time forecasting should employ training algorithms that do not carry the computational overhead of LM, which requires computation of the Hessian matrix, protracted time, and is sensitive to initial conditions; to this end, Br and forms of gradient descent with momentum should be adopted, considering overall time expenditure and forecast quality as well as mitigation of network overfitting. On the whole, it is recommended that evaluation should also consider the implications of (i) data quality and quantity and (ii) transfer functions on the overall network forecast performance.
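For reference, the efficiency and error statistics quoted above can be computed as in the sketch below (interpreting CE as the Nash-Sutcliffe coefficient of efficiency and MSRE as the mean squared relative error, which are the usual readings of those abbreviations):

```python
import numpy as np

def coefficient_of_efficiency(obs, sim):
    """Nash-Sutcliffe coefficient of efficiency (CE): 1 is a perfect fit,
    0 means the model is no better than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def error_stats(obs, sim):
    """Relative-error statistics; assumes strictly positive observed flows."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    e = obs - sim
    return {
        "MAE": np.mean(np.abs(e)),
        "MAPE": 100.0 * np.mean(np.abs(e / obs)),
        "MSRE": np.mean((e / obs) ** 2),
        "CE": coefficient_of_efficiency(obs, sim),
    }
```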

Keywords: streamflow, neural network, optimisation, algorithm

Procedia PDF Downloads 151
28075 Thermoregulatory Responses of Holstein Cows Exposed to Intense Heat Stress

Authors: Rodrigo De A. Ferrazza, Henry D. M. Garcia, Viviana H. V. Aristizabal, Camilla De S. Nogueira, Cecilia J. Verissimo, Jose Roberto Sartori, Roberto Sartori, Joao Carlos P. Ferreira

Abstract:

Environmental factors adversely influence sustainability in livestock production systems. Dairy herds are the most affected by heat stress among livestock industries. This clearly calls for the development of new heat-mitigation strategies, which should be based on the physiological and metabolic adaptations of the animal. In this study, we incorporated the effect of climate variables and heat exposure time on the thermoregulatory responses in order to clarify the adaptive mechanisms for bovine heat dissipation under intense thermal stress induced experimentally in a climate chamber. Non-lactating Holstein cows were contemporaneously and randomly assigned to thermoneutral (TN; n=12) or heat stress (HS; n=12) treatments during 16 days. Vaginal temperature (VT) was measured every 15 min with a microprocessor-controlled data logger (HOBO®, Onset Computer Corporation, Bourne, MA, USA) attached to a modified vaginal controlled internal drug release insert (Sincrogest®, Ourofino, Brazil). Rectal temperature (RT), respiratory rate (RR) and heart rate (HR) were measured twice a day (0700 and 1500 h), and dry matter intake (DMI) was estimated daily. The ambient temperature and air relative humidity were 25.9±0.2°C and 73.0±0.8%, respectively, for TN, and 36.3±0.3°C and 60.9±0.9%, respectively, for HS. The respiratory rate of HS cows increased immediately after exposure to heat and was higher (76.02±1.70 bpm; P<0.001) than that of TN cows (39.70±0.71 bpm), followed by a rise in RT (39.87±0.07°C for HS versus 38.56±0.03°C for TN; P<0.001) and VT (39.82±0.10°C for HS versus 38.26±0.03°C for TN; P<0.001). A diurnal pattern was detected, with higher (P<0.01) afternoon temperatures than morning temperatures, and this effect was aggravated in HS cows. There was a decrease (P<0.05) in HR for HS cows (62.13±0.99 bpm) compared to TN cows (66.23±0.79 bpm), but the magnitude of the difference was not the same over time. From the third day, DMI decreased in HS cows in an attempt to maintain homeothermy, while TN cows increased DMI (8.27±0.33 kg d-1 for HS versus 14.03±0.29 kg d-1 for TN; P<0.001). Regression analysis showed that RT and RR best reflected the response of cows to changes in the Temperature Humidity Index, and that the climate variables of the previous day influenced the physiological parameters and DMI more strongly than those of the current day, with ambient temperature being the most important factor. Comparison between acute (0 to 3 days) and chronic (13 to 16 days) exposure to heat stress showed a decrease in the slope of the regression equations for RR and DMI, suggesting an adaptive adjustment, although no change was found for RT. In conclusion, intense heat stress exerted a strong influence on the thermoregulatory mechanisms, but the acclimation process was only partial.

Keywords: acclimation, bovine, climate chamber, hyperthermia, thermoregulation

Procedia PDF Downloads 217
28074 Elasto-Plastic Analysis of Structures Using Adaptive Gaussian Springs Based Applied Element Method

Authors: Mai Abdul Latif, Yuntian Feng

Abstract:

The Applied Element Method (AEM) is a method developed to aid in the analysis of the collapse of structures. Currently available methods cannot deal with structural collapse accurately; however, AEM can simulate the behavior of a structure from an initial state of no loading until its collapse. The elements in AEM are connected by sets of normal and shear springs along the edges of the elements that represent the stresses and strains of the element in that region. The elements are rigid, and the material properties are introduced through the spring stiffness. Nonlinear dynamic analysis of the progressive collapse of structures has been widely modelled using the finite element method; however, difficulties arise in the presence of excessively deformed elements with cracking or crushing, along with a high computational cost and difficulties in choosing appropriate material models for the analysis. The Applied Element Method is developed and coded here to significantly improve the accuracy and also reduce the computational cost of the method. The scheme works for both linear elastic and nonlinear cases, including elasto-plastic materials. This paper focuses on elastic and elasto-plastic material behaviour, where the number of springs required for an accurate analysis is tested. A steel cantilever beam is used as the structural element for the analysis. The first modification of the method is based on Gaussian quadrature to distribute the springs. Usually, the springs are equally distributed along the face of the element, but it was found that with Gaussian springs only 2 springs were required for perfectly elastic cases, while with equally spaced springs at least 5 springs were required. The method runs on a Newton-Raphson iteration scheme, and quadratic convergence was obtained. The second modification is based on adapting the number of springs required depending on the elasticity of the material. After the first Newton-Raphson iteration, the Von Mises stress condition is used to calculate the stresses in the springs, and the springs are classified as elastic or plastic. Then transition springs, located exactly between the elastic and plastic regions, are interpolated between regions to strictly identify the elastic and plastic regions in the cross-section. Since a rectangular cross-section was analyzed, there were two plastic regions (top and bottom) and one elastic region (middle). The results of the present study show that elasto-plastic cases require only 2 springs for the elastic region and 2 springs for each plastic region. This improves the computational cost, reducing the minimum number of springs in elasto-plastic cases to only 6 springs. All the work is done using MATLAB, and the results are compared to models of structural elements using the finite element method in ANSYS.
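The Gaussian-spring placement can be illustrated with a short numpy sketch: spring positions and tributary weights along an element edge are taken from the Gauss-Legendre rule instead of an equal spacing. The stiffness expression in the closing comment follows the usual AEM convention and is an assumption about the paper's exact formulation.

```python
import numpy as np

def spring_positions(edge_length, n_springs, gaussian=True):
    """Positions and tributary widths of connecting springs along one element edge.

    With gaussian=True the springs sit at Gauss-Legendre quadrature points
    (the modification described above); otherwise they are equally spaced.
    Coordinates are measured from the edge midpoint.
    """
    if gaussian:
        xi, w = np.polynomial.legendre.leggauss(n_springs)   # nodes/weights on [-1, 1]
        return 0.5 * edge_length * xi, 0.5 * edge_length * w
    x = np.linspace(-0.5 * edge_length, 0.5 * edge_length, n_springs)
    w = np.full(n_springs, edge_length / n_springs)
    return x, w

# The normal-spring stiffness would then be roughly k_i = E * t * w_i / a, with t the
# element thickness and a the distance between the connected element centroids
# (the usual AEM expression, with the Gauss weight w_i playing the role of the
# tributary width) -- an assumed convention, not the paper's stated formula.
print(spring_positions(0.2, 2))
```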

Keywords: applied element method, elasto-plastic, Gaussian springs, nonlinear

Procedia PDF Downloads 224
28073 Flat-Top Apodization of Laser Beams by Means of Acousto-Optics

Authors: Sergey I. Chizhikov, Vladimir Y. Molchanov, Konstantin B. Yushkov

Abstract:

We demonstrate a method for adaptive spatial shaping of laser beams by means of acousto-optic Bragg diffraction. Transformation of the angular spectrum during Bragg diffraction is used to convert a Gaussian intensity distribution into a flat-top one. The theoretical model is supported by experiment.

Keywords: acousto-optics, flat top, beam shaping, Bragg diffraction

Procedia PDF Downloads 624
28072 USTTB (UCRC) Financial Management, Strengths and Weaknesses

Authors: Samba Lamine Cisse, Cheick Oumar Tangara, Seynabou Sissoko, Mahamadou Diakite, Seydou Doumbia

Abstract:

Background: The financial management of a scientific research center is a crucial element in achieving ambitious scientific goals. It can be a driving force for research success, but it also has shortcomings that are important to understand. This study focuses on the crucial aspects of financial management in the context of scientific research centers, more specifically the UCRC at USTTB in Mali, in terms of strengths and weaknesses. Methodology: This study concerns the case of the UCRC, one of USTTB's research centers. It is a qualitative study based on years of experience in project management at USTTB and on analyses and interpretations of everyday activities. Results: The study offers practical recommendations for improving the financial stability of research institutions, thereby contributing to their mission of promoting scientific research and innovation. Scientific research centers play a crucial role in the development of knowledge, and their effective operation largely depends on the appropriate management of their financial resources. The study begins with an in-depth analysis of UCRC's typical financial structure, highlighting its types and sources of funding, followed by an analysis of the strengths and weaknesses of its current financial management system. Conclusion: The financial management of a scientific research center is essential to ensure the continuity of research activities, the development of innovative projects and the achievement of scientific objectives. Adaptive financial management focused on efficiency, diversification of funding and risk control is essential to meeting these challenges and fostering excellence in scientific research.

Keywords: financial, management, strengths, weaknesses, recommendations

Procedia PDF Downloads 13
28071 Automatic Method for Exudates and Hemorrhages Detection from Fundus Retinal Images

Authors: A. Biran, P. Sobhe Bidari, K. Raahemifar

Abstract:

Diabetic Retinopathy (DR) is an eye disease that leads to blindness. The earliest signs of DR are the appearance of red and yellow lesions on the retina called hemorrhages and exudates. Early diagnosis of DR prevents blindness; hence, many automated algorithms have been proposed to extract hemorrhages and exudates. In this paper, an automated algorithm is presented to extract hemorrhages and exudates separately from retinal fundus images using different image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering and thresholding. Since the optic disc is the same color as the exudates, it is first localized and detected. The presented method has been tested on fundus images from the Structured Analysis of the Retina (STARE) and Digital Retinal Images for Vessel Extraction (DRIVE) databases using MATLAB code. The results show that this method is capable of detecting hard exudates and highly probable soft exudates. It is also capable of detecting hemorrhages and distinguishing them from blood vessels.

Keywords: diabetic retinopathy, fundus, CHT, exudates, hemorrhages

Procedia PDF Downloads 271
28070 Site Selection in Adaptive Reuse Architecture for Social Housing in Johannesburg, South Africa

Authors: Setapo Moloi, Jun-Ichiro Giorgos Tsutsumi

Abstract:

South Africa’s need for the provision of housing within its major city centres, specifically in Gauteng Province (GP), is a major concern. Initiatives for converting misused/unused buildings into suitable housing for residents who work in the city, as well as for prospective citizens, are currently underway. One aspect currently needed is the repossession of these buildings and their repurposing into communities of quality, low-cost, mixed-density housing, with this process placing minimal strain on existing infrastructure, energy demand, emissions and so on. Unfortunately, there are instances in Johannesburg, the country’s economic capital, with 2017 estimates claiming that 700 buildings lie unused or misused due to issues that will be discussed in this paper; these then become hubs for illegal activity and an unacceptable form of shelter. It can be argued that the provision of inner-city social housing is lacking, not because of the unavailability of funding or of usable land and buildings, but because these assets are not being used appropriately or to their full potential. Currently, the GP government has mandated the repurposing of all buildings that meet its criteria (structural stability, feasibility, adaptability, etc.), with the intention of inviting interested parties to propose conversions of the buildings into densified social housing. Going forward, the proposed focus is the creation of social housing communities within existing buildings, which may be retrofitted with sustainable technologies and green design strategies and principles, with the aim of the finished buildings achieving ‘Net-Zero/Positive’ status. A Net-Zero building, according to the Green Building Council of South Africa (GBCSA), is a building which produces the resources it needs to function and reduces wastage, emissions and demand for these resources during its lifespan. The categories which the GBCSA includes are carbon, water, waste and ecology; this may include material selection, construction methods, etc.

Keywords: adaptive reuse, conversion, net-zero, social housing, sustainable communities

Procedia PDF Downloads 136
28069 Adaptive Response of Plants to Environmental Stress: Natural Oil Seepage; The Living Laboratory in Tramutola, Basilicata Region

Authors: Maria Francesca Scannone, Martina Bochicchio

Abstract:

One of the major environmental problems today is hydrocarbon contamination. Promising sustainable technologies for the treatment of these contaminated sites involve the use of biological organisms. In the Agri Valley (Basilicata Region) there is a living laboratory (natural oil seeps) where selective pressure has enriched the environmental matrices with microorganisms, fungi and plant species able to use hydrocarbons as a source of metabolic energy and to degrade or tolerate them. Observers visiting this area are fascinated by its unspoiled nature, and the condition of the ecosystem does not appear to have been damaged. The remarkable resiliency observed at the Tramutola site is of key importance for bringing in green remediation technologies, but no research has been done to identify high-performing native species. The aim of this research was to study how natural processes affect the fate of released oil and how individual species or communities of plants and animals are capable of dealing with the burden of otherwise toxic chemicals. A vegetation survey was carried out; more than 60 species have been identified and divided into tree, shrub and herb layers. Plant data sheets have been completed only for the species that showed the most appropriate properties for phytoremediation. In general, members of the orders Salicales, Cyperales, Poales, Fagales, Cornales and Equisetales were the most commonly identified. They are pioneer plants with high adaptive capacity and vegetative propagation. The literature review has highlighted the existence of a rhizosphere effect and a green liver model in the selected plants. The study provides significant information on the environmental stress adaptation processes of many indigenous plants that are living and growing on a natural leak of crude oil and gas that migrates up through the subsurface.

Keywords: green liver, hydrocarbon degradation, oil seeps, phytoremediation

Procedia PDF Downloads 173
28068 A Transform Domain Function Controlled VSSLMS Algorithm for Sparse System Identification

Authors: Cemil Turan, Mohammad Shukri Salman

Abstract:

The convergence rate of the least-mean-square (LMS) algorithm deteriorates if the input signal to the filter is correlated. In a system identification problem, this convergence rate can be improved if the signal is white and/or if the system is sparse. We recently proposed a sparse transform-domain LMS-type algorithm that uses a variable step size for sparse system identification. The proposed algorithm provided high performance even when the input signal is highly correlated. In this work, we investigate the performance of the proposed TD-LMS algorithm for a large number of filter taps, which is also a critical issue for the standard LMS algorithm. Additionally, the optimum value of the most important parameter is calculated for all experiments. Moreover, a convergence analysis of the proposed algorithm is provided. The performance of the proposed algorithm has been compared to that of different algorithms in sparse system identification settings with different sparsity levels and different numbers of filter taps. Simulations have shown that the proposed algorithm offers superior performance compared to the other algorithms.
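A generic sketch of a transform-domain variable step-size LMS filter is given below for orientation; it uses a DCT for decorrelation and the classical Kwong-Johnston error-driven step-size rule, which is not necessarily the authors' function-controlled update.

```python
import numpy as np
from scipy.fft import dct

def td_vsslms(x, d, num_taps=16, mu0=0.05, alpha=0.97, gamma=1e-3,
              mu_min=1e-4, mu_max=0.1, eps=1e-8):
    """Transform-domain LMS with an error-driven variable step size.

    The tap-input vector is decorrelated with a DCT and power-normalised
    before the update; the step size follows mu(n+1) = alpha*mu(n) + gamma*e(n)^2
    (Kwong & Johnston), clipped to [mu_min, mu_max].
    """
    x, d = np.asarray(x, float), np.asarray(d, float)
    w = np.zeros(num_taps)
    power = np.full(num_taps, eps)
    mu = mu0
    y_hat, err = np.zeros(len(x)), np.zeros(len(x))
    for n in range(num_taps, len(x)):
        u = x[n - num_taps:n][::-1]              # most recent sample first
        ut = dct(u, norm="ortho")                # decorrelating transform
        power = 0.99 * power + 0.01 * ut ** 2    # running power estimate per bin
        y_hat[n] = w @ ut
        err[n] = d[n] - y_hat[n]
        w += mu * err[n] * ut / (power + eps)    # power-normalised update
        mu = float(np.clip(alpha * mu + gamma * err[n] ** 2, mu_min, mu_max))
    return w, err
```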

Keywords: adaptive filtering, sparse system identification, TD-LMS algorithm, VSSLMS algorithm

Procedia PDF Downloads 358
28067 Prophylactic Replacement of Voice Prosthesis: A Study to Predict Prosthesis Lifetime

Authors: Anne Heirman, Vincent van der Noort, Rob van Son, Marije Petersen, Lisette van der Molen, Gyorgy Halmos, Richard Dirven, Michiel van den Brekel

Abstract:

Objective: Voice prosthesis leakage significantly impacts the quality of life of patients after laryngectomy, causing insecurity as well as frequent unplanned hospital visits and costs. In this study, the concept of prophylactic voice prosthesis replacement was explored to prevent leakages. Study Design: A retrospective cohort study. Setting: Tertiary hospital. Methods: Device lifetimes and voice prosthesis replacements of a retrospective cohort, including all patients who underwent laryngectomy between 2000 and 2012 in the Netherlands Cancer Institute, were used to calculate the number of voice prostheses needed per patient per year when preventing 70% of the leakages by prophylactic replacement. Various strategies for the timing of prophylactic replacement were considered: adaptive strategies based on the individual patient’s history of replacement, and fixed strategies based on the results of patients with similar voice prosthesis or treatment characteristics. Results: Patients used a median of 3.4 voice prostheses per year (range 0.1-48.1). We found a high inter- and intrapatient variability in device lifetime. With prophylactic replacement, this would become a median of 9.4 voice prostheses per year, which means replacement every 38 days and implies more than six additional voice prostheses per patient per year. The individual adaptive model showed that preventing 70% of the leakages was impossible for most patients; only a median of 25% could be prevented. Monte Carlo simulations showed that prophylactic replacement is not feasible due to the high coefficient of variation (standard deviation/mean) in device lifetime. Conclusion: Based on our simulations, prophylactic replacement of voice prostheses is not feasible due to high inter- and intrapatient variation in device lifetime.
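The Monte-Carlo reasoning can be illustrated with a small sketch that draws device lifetimes from a log-normal distribution with a chosen coefficient of variation and asks what fraction of leakages a fixed replacement interval would pre-empt; the distribution and parameter values are assumptions for illustration, whereas the paper works from the observed replacement history.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_prophylactic(median_life=110.0, cv=0.9, replace_at=38, n_devices=10_000):
    """Draw device lifetimes (days) from a log-normal with coefficient of
    variation cv = sd/mean, then count the leakages that a fixed prophylactic
    replacement interval would have pre-empted (lifetime > replace_at)."""
    sigma = np.sqrt(np.log(1.0 + cv ** 2))      # log-normal shape from the CV
    mu = np.log(median_life)                    # median of a log-normal is exp(mu)
    lifetimes = rng.lognormal(mean=mu, sigma=sigma, size=n_devices)
    return np.mean(lifetimes > replace_at)

print(f"Fraction of leakages pre-empted: {simulate_prophylactic():.2f}")
```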

Keywords: voice prosthesis, voice rehabilitation, total laryngectomy, prosthetic leakage, device lifetime

Procedia PDF Downloads 127
28066 Development and Investigation of Efficient Substrate Feeding and Dissolved Oxygen Control Algorithms for Scale-Up of Recombinant E. coli Cultivation Process

Authors: Vytautas Galvanauskas, Rimvydas Simutis, Donatas Levisauskas, Vykantas Grincas, Renaldas Urniezius

Abstract:

The paper deals with the model-based development and implementation of efficient control strategies for recombinant protein synthesis in fed-batch E. coli cultivation processes. Based on experimental data, a kinetic dynamic model of the cultivation process was developed. This model was used to determine substrate feeding strategies during the cultivation. The proposed feeding strategy consists of two phases: a biomass growth phase and a recombinant protein production phase. In the first phase, a substrate-limited process is recommended in which the specific growth rate of biomass is about 90-95% of its maximum value. This reduces the glucose concentration in the medium, improves process repeatability, and reduces the formation of secondary metabolites and other unwanted by-products. The substrate limitation can be tightened to satisfy the restriction on the maximum oxygen transfer rate in the bioreactor and to guarantee the necessary dissolved carbon dioxide concentration in the culture medium. In the recombinant protein production phase, the level of substrate limitation and the specific growth rate are selected within the range that enables the optimal target protein synthesis rate. To account for the complex process dynamics, to efficiently exploit the oxygen transfer capability of the bioreactor, and to maintain the required dissolved oxygen concentration, adaptive control algorithms for dissolved oxygen control have been proposed. The developed model-based control strategies are useful in the scale-up of cultivation processes and accelerate the implementation of innovative biotechnological processes for industrial applications.

Keywords: adaptive algorithms, model-based control, recombinant E. coli, scale-up of bioprocesses

Procedia PDF Downloads 255
28065 Linkages Between Climate Change, Agricultural Productivity, Food Security and Economic Growth

Authors: Jihène Khalifa

Abstract:

This study analyzed the relationships between Tunisia’s economic growth, food security, agricultural productivity, and climate change using the ARDL model for the period from 1990 to 2022. The ARDL model reveals a positive correlation between economic growth and lagged agricultural productivity. Additionally, the vector autoregressive (VAR) model highlights the beneficial impact of lagged agricultural productivity on economic growth and the negative effect of rainfall on economic growth. Granger causality analysis identifies unidirectional relationships from economic growth to agricultural productivity, crop production, food security, and temperature variations, as well as from temperature variations to crop production. Furthermore, a bidirectional causality is established between crop production and food security. The study underscores the impact of climate change on crop production and suggests the need for adaptive strategies to mitigate these climate effects.
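As an illustration of the Granger-causality step, the test can be run with statsmodels on any pair of the study's series; the column names, random placeholder data and lag order below are assumptions, not the paper's 1990-2022 dataset.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Placeholder series standing in for two of the study's variables (33 annual observations).
df = pd.DataFrame({
    "agri_productivity": np.random.randn(33),
    "gdp_growth": np.random.randn(33),
})

# Tests whether the SECOND column Granger-causes the FIRST one,
# i.e. here whether economic growth Granger-causes agricultural productivity.
results = grangercausalitytests(df[["agri_productivity", "gdp_growth"]], maxlag=2)
```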

Keywords: economic growth, agriculture, food security, climate change, ARDL, VAR

Procedia PDF Downloads 30
28064 Applying Big Data Analysis to Efficiently Exploit the Vast Unconventional Tight Oil Reserves

Authors: Shengnan Chen, Shuhua Wang

Abstract:

Successful production of hydrocarbons from unconventional tight oil reserves has changed the energy landscape in North America. The oil contained within these reservoirs typically will not flow to the wellbore at economic rates without assistance from advanced horizontal wells and multi-stage hydraulic fracturing. Efficient and economic development of these reserves is a priority for society, government, and industry, especially under the current low oil prices. Meanwhile, society needs technological and process innovations to enhance oil recovery while concurrently reducing environmental impacts. Recently, big data analysis and artificial intelligence have become very popular, developing data-driven insights for better designs and decisions in various engineering disciplines. However, the application of data mining in petroleum engineering is still in its infancy. This research aims to apply intelligent data analysis and data-driven models to exploit unconventional oil reserves both efficiently and economically. More specifically, a comprehensive database including reservoir geological data, reservoir geophysical data, well completion data and production data for thousands of wells is first established to discover valuable insights and knowledge related to tight oil reserve development. Several data analysis methods are introduced to analyze such a huge dataset. For example, K-means clustering is used to partition all observations into clusters; principal component analysis is applied to emphasize the variation and bring out strong patterns in the dataset, making the big data easy to explore and visualize; and exploratory factor analysis (EFA) is used to identify the complex interrelationships between well completion data and well production data. Different data mining techniques, such as artificial neural networks, fuzzy logic, and machine learning techniques, are then summarized, and appropriate ones are selected to analyze the database based on prediction accuracy, model robustness, and reproducibility. Advanced knowledge and patterns are finally recognized and integrated into a modified self-adaptive differential evolution optimization workflow to enhance oil recovery and maximize the net present value (NPV) of the unconventional oil resources. This research will advance knowledge in the development of unconventional oil reserves and bridge the gap between big data and performance optimization in these formations. The newly developed data-driven optimization workflow is a powerful approach to guide field operations, leading to better designs, higher oil recovery and greater economic return for future wells in unconventional oil reserves.
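A compact sketch of the clustering and dimensionality-reduction step is shown below using scikit-learn; the file name and column schema are hypothetical placeholders for the study's well database.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

wells = pd.read_csv("wells.csv")            # one row per well (hypothetical file)
features = ["porosity", "net_pay", "frac_stages", "proppant_per_stage",
            "lateral_length", "cum_oil_12m"]  # placeholder column names

X = StandardScaler().fit_transform(wells[features])

pca = PCA(n_components=2)                   # bring out the dominant variation for visualisation
scores = pca.fit_transform(X)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
wells["cluster"] = kmeans.fit_predict(X)    # partition wells into analogue groups

print(pca.explained_variance_ratio_)
print(wells.groupby("cluster")["cum_oil_12m"].mean())
```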

Keywords: big data, artificial intelligence, enhance oil recovery, unconventional oil reserves

Procedia PDF Downloads 283
28063 Determining the Most Efficient Test Available in Software Testing

Authors: Qasim Zafar, Matthew Anderson, Esteban Garcia, Steven Drager

Abstract:

Software failures can present an enormous detriment to people's lives and cost millions of dollars to repair when they are unexpectedly encountered in the wild. Although a significant portion of the software development lifecycle and its resources is dedicated to testing, software failures remain a relatively frequent occurrence. Nevertheless, the evaluation of testing effectiveness remains at the forefront of ensuring high-quality software, and software metrics play a critical role in providing valuable insights into quantifiable objectives for assessing the level of assurance and confidence in the system. As the selection of appropriate metrics can be an arduous process, the goal of this paper is to shed light on the significance of software metrics by examining a range of testing techniques and metrics and identifying key areas for improvement. Through this investigation, readers will gain a deeper understanding of how metrics can help drive informed decision-making on delivering high-quality software and facilitate continuous improvement in testing practices.

Keywords: software testing, software metrics, testing effectiveness, black box testing, random testing, adaptive random testing, combinatorial testing, fuzz testing, equivalence partition, boundary value analysis, white box testing

Procedia PDF Downloads 85
28062 Quantifying Meaning in Biological Systems

Authors: Richard L. Summers

Abstract:

The advanced computational analysis of biological systems is becoming increasingly dependent upon an understanding of the information-theoretic structure of the materials, energy and interactive processes that comprise those systems. The stability and survival of these living systems are fundamentally contingent upon their ability to acquire and process the meaning of information concerning the physical state of their biological continuum (biocontinuum). The drive for adaptive system reconciliation of a divergence from steady state within this biocontinuum can be described by an information-metric-based formulation of the process for actionable knowledge acquisition that incorporates the axiomatic inference of Kullback-Leibler information minimization driven by survival replicator dynamics. If the mathematical expression of this process is the Lagrangian integrand for any change within the biocontinuum, then it can also be considered as an action functional for the living system. In the direct method of Lyapunov, such a summarizing mathematical formulation of global system behavior, based on the driving forces of energy currents and constraints within the system, can serve as a platform for the analysis of stability. As the system evolves in time in response to biocontinuum perturbations, the summarizing function conveys information about its overall stability. This stability information portends survival and therefore has absolute existential meaning for the living system. The first derivative of the Lyapunov energy information function will have a negative trajectory toward the system's steady state if the driving force is dissipating. By contrast, system instability leading to system dissolution will have a positive trajectory. The direction and magnitude of the trajectory vector then serve as a quantifiable signature of the meaning associated with the living system's stability information, homeostasis and survival potential.
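In symbols, the two ingredients named above can be written as follows (a minimal formal sketch; the author's specific construction of the summarizing function from the replicator dynamics is not reproduced here):

```latex
% Kullback-Leibler divergence between the system's current state description p
% and the reconciled (steady-state) description q, and the Lyapunov-style
% stability condition on a summarizing function L built from it:
D_{\mathrm{KL}}(p \,\|\, q) = \int p(x)\, \ln \frac{p(x)}{q(x)}\, dx ,
\qquad
L(t) \ge 0, \quad \frac{dL}{dt} < 0 \;\Rightarrow\; \text{trajectory toward steady state (stability)} .
```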

Keywords: meaning, information, Lyapunov, living systems

Procedia PDF Downloads 130
28061 Pod and Wavelets Application for Aerodynamic Design Optimization

Authors: Bonchan Koo, Junhee Han, Dohyung Lee

Abstract:

The research evaluates the accuracy and efficiency of a design optimization procedure which combines a wavelets-based solution algorithm and a proper orthogonal decomposition (POD) database management technique. The aerodynamic design procedure calls for high-fidelity computational fluid dynamics (CFD) simulations and the consideration of a large number of flow conditions and design constraints. Even with significant advancement in computing power, the current level of integrated design process requires substantial computing time and resources. POD reduces the degrees of freedom of the full system by conducting singular value decomposition on various field simulations. For additional efficiency improvement of the procedure, an adaptive wavelet technique is also employed during the POD training period. The proposed design procedure was applied to the optimization of wing aerodynamic performance. Throughout the research, it was confirmed that the POD/wavelets design procedure could significantly reduce the total design turnaround time and is also able to capture all detailed complex flow features, as in the full-order analysis.
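The POD step can be summarized in a short numpy sketch: a reduced basis is extracted from a snapshot matrix by singular value decomposition and truncated at a chosen energy fraction. The mean-subtraction and the 99% energy criterion are common conventions assumed here, not details taken from the paper.

```python
import numpy as np

def pod_basis(snapshots, energy=0.99):
    """Proper orthogonal decomposition of a snapshot matrix.

    snapshots: (n_dof, n_snapshots) array, one flow-field solution per column.
    Returns the leading POD modes capturing the requested fraction of the
    snapshot energy (sum of squared singular values), plus the mean field.
    """
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    r = int(np.searchsorted(cum, energy)) + 1      # number of retained modes
    return U[:, :r], mean

def project(field, basis, mean):
    """Coordinates of a (possibly new) field in the reduced POD space."""
    return basis.T @ (field - mean.ravel())
```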

Keywords: POD (Proper Orthogonal Decomposition), wavelets, CFD, design optimization, ROM (Reduced Order Model)

Procedia PDF Downloads 463
28060 Exploring Tree Growth Variables Influencing Carbon Sequestration in the Face of Climate Change

Authors: Funmilayo Sarah Eguakun, Peter Oluremi Adesoye

Abstract:

One of the major problems faced by human society is that the global temperature is believed to be rising due to human activity that releases carbon (IV) oxide (CO2) into the atmosphere. Carbon (IV) oxide is the most important greenhouse gas influencing global warming and possible climate change. With climate change becoming alarming, reducing CO2 in our atmosphere has become a primary goal of international efforts. Forest lands are a major sink and could absorb large quantities of carbon if the trees are judiciously managed. The study aims at estimating the carbon sequestration capacity of Pinus caribaea (pine) and Tectona grandis (teak) under the prevailing environmental conditions and at exploring the tree growth variables that influence carbon sequestration capacity in Omo Forest Reserve, Ogun State, Nigeria. Improving forest management by manipulating the growth characteristics that influence carbon sequestration could be an adaptive strategy of forestry to climate change. Random sampling was used to select Temporary Sample Plots (TSPs) in the study area, from which a complete enumeration of growth variables was carried out within the plots. The data collected were subjected to descriptive and correlational analyses. The results showed that the average carbon stored by pine and teak is 994.4±188.3 kg and 1350.7±180.6 kg, respectively. The difference in carbon stored between the species is significant enough to make the choice of species relevant to climate change adaptation strategy. Tree growth variables influence the capacity of a tree to sequester carbon: height, diameter, volume, wood density and age are positively correlated with carbon sequestration. These tree growth variables could be manipulated by the forest manager as an adaptive strategy for climate change, while plantations of high-wood-density species could be relevant as a management strategy to increase carbon storage.

Keywords: adaptation, carbon sequestration, climate change, growth variables, wood density

Procedia PDF Downloads 378
28059 Optimizing Telehealth Internet of Things Integration: A Sustainable Approach through Fog and Cloud Computing Platforms for Energy Efficiency

Authors: Yunyong Guo, Sudhakar Ganti, Bryan Guo

Abstract:

The swift proliferation of telehealth Internet of Things (IoT) devices has sparked concerns regarding energy consumption and the need for streamlined data processing. This paper presents an energy-efficient model that integrates telehealth IoT devices into a platform based on fog and cloud computing. This integrated system provides a sustainable and robust solution to address the challenges. Our model strategically utilizes fog computing as a localized data processing layer and leverages cloud computing for resource-intensive tasks, resulting in a significant reduction in overall energy consumption. The incorporation of adaptive energy-saving strategies further enhances the efficiency of our approach. Simulation analysis validates the effectiveness of our model in improving energy efficiency for telehealth IoT systems, particularly when integrated with localized fog nodes and both private and public cloud infrastructures. Subsequent research endeavors will concentrate on refining the energy-saving model, exploring additional functional enhancements, and assessing its broader applicability across various healthcare and industry sectors.

Keywords: energy-efficient, fog computing, IoT, telehealth

Procedia PDF Downloads 75
28058 In-Flight Aircraft Performance Model Enhancement Using Adaptive Lookup Tables

Authors: Georges Ghazi, Magali Gelhaye, Ruxandra Botez

Abstract:

Over the years, the Flight Management System (FMS) has experienced continuous improvement of its many features, to the point of becoming the pilot’s primary interface for flight planning operations on the airplane. With the assistance of the FMS, the concept of distance and time has been completely revolutionized, providing the crew members with the optimized route (or flight plan) from the departure airport to the arrival airport. To accomplish this function, the FMS needs an accurate Aircraft Performance Model (APM) of the aircraft. In general, the APMs that equip most modern FMSs are established before the entry into service of an individual aircraft and result from the combination of a set of ordinary differential equations and a set of performance databases. Unfortunately, an aircraft in service is constantly exposed to dynamic loads that degrade its flight characteristics. These degradations have two main origins: airframe deterioration (control surface rigging, seals missing or damaged, etc.) and engine performance degradation (fuel consumption increase for a given thrust). Thus, after several years of service, the performance databases and the APM associated with a specific aircraft are no longer representative enough of the actual aircraft performance. It is important to monitor the trend of the performance deterioration and correct the uncertainties of the aircraft model in order to improve the accuracy of the flight management system predictions. The basis of this research lies in a new ability to continuously update an Aircraft Performance Model (APM) during flight using an adaptive lookup table technique. This methodology was developed and applied to the well-known Cessna Citation X business aircraft. For the purpose of this study, a level D Research Aircraft Flight Simulator (RAFS) was used as the test aircraft. According to the Federal Aviation Administration, level D is the highest certification level for flight dynamics modeling. Using data available in the Flight Crew Operating Manual (FCOM), a first APM describing the variation of the engine fan speed and aircraft fuel flow with respect to flight conditions was derived. This model was then improved using the proposed methodology. To do so, several cruise flights were performed using the RAFS. An algorithm was developed to frequently sample the aircraft sensor measurements during flight and compare the model predictions with the actual measurements. Based on these comparisons, a correction was applied to the current APM in order to minimize the error between the predicted and measured data. In this way, as the aircraft flies, the APM is continuously enhanced, making the FMS more and more precise and the prediction of trajectories more realistic and reliable. The results obtained are very encouraging. Indeed, using tables initialized with the FCOM data, only a few iterations were needed to reduce the average relative error of the fuel flow prediction from 12% to 0.3%. Similarly, the maximum error deviation of the FCOM prediction of the engine fan speed was reduced from 5.0% to 0.2% after only ten flights.
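A minimal sketch of the adaptive-lookup-table idea is given below: a fuel-flow table is nudged toward in-flight measurements with a simple learning-rate correction at the nearest grid point. The grid, units, nearest-neighbour indexing and smoothing gain are illustrative assumptions rather than the study's implementation.

```python
import numpy as np

class AdaptiveLookupTable:
    """2-D fuel-flow table indexed by altitude and Mach, corrected in flight
    by exponential smoothing toward the measured values."""

    def __init__(self, altitudes, machs, initial_table, alpha=0.2):
        self.alt, self.mach = np.asarray(altitudes), np.asarray(machs)
        self.table = np.array(initial_table, dtype=float)   # e.g. FCOM fuel flow [kg/h]
        self.alpha = alpha                                   # correction gain per sample

    def _nearest(self, altitude, mach):
        return (np.abs(self.alt - altitude).argmin(),
                np.abs(self.mach - mach).argmin())

    def predict(self, altitude, mach):
        i, j = self._nearest(altitude, mach)
        return self.table[i, j]

    def update(self, altitude, mach, measured):
        """Blend the measured fuel flow into the nearest cell during flight."""
        i, j = self._nearest(altitude, mach)
        self.table[i, j] += self.alpha * (measured - self.table[i, j])
```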

Keywords: aircraft performance, cruise, trajectory optimization, adaptive lookup tables, Cessna Citation X

Procedia PDF Downloads 263