Search results for: single-phase models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6743

3353 Comparisons of Individual and Group Replacement Policies for a Series Connection System with Two Machines

Authors: Wen Liang Chang, Mei Wei Wang, Ruey Huei Yeh

Abstract:

This paper compares individual and group replacement policies for a series connection system composed of two machines. The manufacturer's production system is assumed to be a series connection of two machines; when a machine fails during the operating time, the manufacturer performs minimal repair. The manufacturer also plans a preventive replacement of the machines at a pre-specified time to keep the system operating normally. Under these maintenance policies, the maintenance cost rate models of individual and group replacement for the two-machine series connection system are derived, and the optimal preventive replacement time that minimizes the expected total maintenance cost rate is obtained. Finally, some numerical examples are given to illustrate the influence of the individual and group replacement policies on the maintenance cost rate.
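The abstract does not state the cost model in closed form. As a hedged illustration only, the sketch below uses the standard minimal-repair formulation, in which the expected cost rate under preventive replacement at time T with a power-law (Weibull) failure intensity is C(T) = (c_p + c_m·(T/η)^β)/T, and searches numerically for the T that minimizes it; the Weibull parameters and cost values are assumptions, not figures from the paper.

```python
import numpy as np

# Illustrative (assumed) parameters -- not taken from the paper.
beta, eta = 2.5, 1000.0   # Weibull shape/scale of the failure intensity
c_p, c_m = 500.0, 120.0   # preventive replacement cost, minimal-repair cost

def expected_cost_rate(T):
    """C(T) = (c_p + c_m * E[# failures in (0, T]]) / T for a power-law intensity."""
    expected_failures = (T / eta) ** beta   # integral of lambda(t) = (beta/eta)(t/eta)^(beta-1)
    return (c_p + c_m * expected_failures) / T

T_grid = np.linspace(50, 5000, 2000)
costs = np.array([expected_cost_rate(T) for T in T_grid])
T_opt = T_grid[int(np.argmin(costs))]
print(f"optimal preventive replacement time ~ {T_opt:.0f}, cost rate {costs.min():.4f}")
```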

Keywords: individual replacement, group replacement, replacement time, two machines, series connection system

Procedia PDF Downloads 488
3352 Microfluidic Method for Measuring Blood Viscosity

Authors: Eunseop Yeom

Abstract:

Many cardiovascular diseases, such as thrombosis and atherosclerosis, can change biochemical molecules in plasma and red blood cells. These alterations lead to an excessive increase in blood viscosity, contributing to peripheral vascular diseases. In this study, a simple microfluidic-based method is used to measure blood viscosity. The microfluidic device is composed of two parallel side channels and a bridge channel. To estimate blood viscosity, the blood sample and a reference fluid are separately delivered into the inlets of the two parallel side channels using pumps. An interfacial line between the blood sample and the reference fluid is formed by blocking the outlet of one side channel. Since the width of this interfacial line is determined by the pressure ratio between the blood and reference flows, blood viscosity can be estimated by measuring it. This microfluidic-based method can be used for evaluating variations in the viscosity of animal models with cardiovascular diseases under flow conditions.

Keywords: blood viscosity, microfluidic chip, pressure, shear rate

Procedia PDF Downloads 372
3351 ATM Location Problem and Cash Management in ATM's

Authors: M. Erol Genevois, D. Celik, H. Z. Ulukan

Abstract:

Automated teller machines (ATMs) are among the most important service facilities in the banking industry. Investment in ATMs and their impact on the banking industry are growing steadily in every part of the world. Banks take many factors into consideration, such as safety, convenience, visibility and cost, in order to determine the optimum locations of ATMs. Today, ATMs are available not only in bank branches but also at retail locations. Another important factor is cash management in ATMs. A cash demand model for every ATM is needed in order to have an efficient cash management system. This forecasting model is based on historical cash demand data, which is highly related to the ATM's location. Therefore, the location and cash management problems should be considered together. Although the literature on facility location models is quite large, it is surprising that only a few studies handle the ATM location and cash management problems together. In order to fill this gap, this paper provides a general review of studies, efforts and developments concerning the ATM location and cash management problems.

Keywords: ATM location problem, cash management problem, ATM cash replenishment problem, literature review in ATMs

Procedia PDF Downloads 480
3350 Hyper Tuned RBF SVM: Approach for the Prediction of the Breast Cancer

Authors: Surita Maini, Sanjay Dhanka

Abstract:

Machine learning (ML) involves developing algorithms and statistical models that enable computers to learn and make predictions or decisions from data without being explicitly programmed. Because of its broad capabilities, ML is gaining popularity in medical areas such as medical imaging, electronic health records, genomic data analysis, wearable devices, disease outbreak prediction and disease diagnosis. In the last few decades, many researchers have tried to diagnose breast cancer (BC) using ML, because early detection of any disease can save millions of lives. Working in this direction, the authors propose a hybrid ML technique, a hyper-tuned RBF SVM, to predict BC at an early stage. The proposed method is implemented on the UCI ML breast cancer dataset with 569 instances and 32 attributes. The authors recorded the performance metrics of the proposed model, i.e., accuracy 98.24%, sensitivity 98.67%, specificity 97.43%, F1 score 98.67%, precision 98.67% and run time 0.044769 seconds. The proposed method is validated by k-fold cross-validation.
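As a hedged sketch of the kind of pipeline described, the snippet below tunes the C and gamma hyperparameters of an RBF-kernel SVM on scikit-learn's built-in copy of the UCI breast cancer dataset and reports cross-validated accuracy; the search grid and preprocessing choices are assumptions, not the authors' exact settings.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)   # 569 instances, 30 numeric features

# Pipeline: scale features, then RBF SVM; grid over C and gamma (assumed grid).
pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
param_grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": ["scale", 0.01, 0.001]}

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
search = GridSearchCV(pipe, param_grid, cv=cv, scoring="accuracy", n_jobs=-1)
search.fit(X, y)
print("best params:", search.best_params_)
print("10-fold CV accuracy: %.4f" % search.best_score_)
```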

Keywords: breast cancer, support vector classifier, machine learning, hyperparameter tuning

Procedia PDF Downloads 67
3349 Validation of Codes Dragon4 and Donjon4 by Calculating Keff of a Slowpoke-2 Reactor

Authors: Otman Jai, Otman Elhajjaji, Jaouad Tajmouati

Abstract:

Several neutronic calculation codes must be used to solve the transport equation at different levels of discretization, each of which requires a specific model. Such a chain of models, known as a calculation scheme, yields the neutron flux in a reactor from its geometry, its isotopic compositions and a cross-section library. Being small in size, the SLOWPOKE-2 reactor is difficult to model because of the importance of neutron leakage. In this paper, the simulation model is presented (geometry, cross-section library, assumptions, etc.), and the results obtained with the DRAGON4/DONJON4 codes are compared with calculations performed with the Monte Carlo code MCNP, using a detailed geometrical model of the reactor, and with experimental data. Criticality calculations have been performed to verify and validate the model. Since the created model properly describes the reactor core, it can be used for calculations of reactor core parameters and for the optimization of research reactor applications.
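The abstract does not reproduce any DRAGON4/DONJON4 input. Purely as an illustrative aside (an assumption, unrelated to those codes), the snippet below evaluates the textbook one-group effective multiplication factor k_eff = νΣ_f / (Σ_a + D·B²) for a bare finite cylinder, which is the kind of quantity the criticality calculations compare.

```python
import math

# Illustrative one-group constants and core dimensions (assumed values only).
nu_sigma_f = 0.157        # nu * Sigma_f [1/cm]
sigma_a    = 0.120        # macroscopic absorption cross-section [1/cm]
D          = 0.90         # diffusion coefficient [cm]
R, H       = 20.0, 40.0   # bare-cylinder radius and height [cm]

# Geometric buckling of a bare finite cylinder.
B2 = (math.pi / H) ** 2 + (2.405 / R) ** 2

k_eff = nu_sigma_f / (sigma_a + D * B2)
print(f"one-group k_eff estimate: {k_eff:.4f}")
```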

Keywords: transport equation, Dragon4, Donjon4, neutron flux, effective multiplication factor

Procedia PDF Downloads 470
3348 Methods for Distinction of Cattle Using Supervised Learning

Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl

Abstract:

Machine learning represents a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification and prediction, based on models derived from existing data. The data can contain identification patterns which are used to classify individuals into groups. The result of the analysis is a pattern that can be used to identify a new data set without needing the input data used to create that pattern. An important requirement in this process is careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the point of view of genetic diversity. In the case of missing pedigree information, other methods can be used to trace an animal's origin. The genetic diversity written in genetic data holds relatively useful information for identifying animals that originated from individual countries. We conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and for identifying an individual.
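As a hedged sketch of this kind of origin assignment (the genotype data, population structure and classifier below are illustrative assumptions, not the Pinzgau data or the authors' exact method), a supervised classifier can be cross-validated on a SNP genotype matrix labelled by country of origin.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic genotype matrix: 200 animals x 500 SNPs coded 0/1/2, with two
# hypothetical populations whose allele frequencies are slightly shifted.
n_animals, n_snps = 200, 500
freqs_a = rng.uniform(0.2, 0.8, n_snps)
freqs_b = np.clip(freqs_a + rng.normal(0, 0.15, n_snps), 0.05, 0.95)
geno_a = rng.binomial(2, freqs_a, size=(n_animals // 2, n_snps))
geno_b = rng.binomial(2, freqs_b, size=(n_animals // 2, n_snps))
X = np.vstack([geno_a, geno_b])
y = np.array([0] * (n_animals // 2) + [1] * (n_animals // 2))  # country-of-origin labels

clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated origin-assignment accuracy:", scores.mean().round(3))
```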

Keywords: genetic data, Pinzgau cattle, supervised learning, machine learning

Procedia PDF Downloads 550
3347 VTOL-Fw Mode-Transitioning UAV Design and Analysis

Authors: Ferit Çakici, M. Kemal Leblebicioğlu

Abstract:

In this study, an unmanned aerial vehicle (UAV) with level flight, vertical take-off and landing (VTOL) and mode-transitioning capability is designed and analyzed. The platform design combines both multirotor and fixed-wing (FW) conventional airplane structures and control surfaces, and is therefore named VTOL-FW. The aircraft is modeled using aerodynamic principles, and linear models are constructed utilizing small perturbation theory about trim conditions. The proposed method of control includes the implementation of multirotor and airplane mode controllers and the design of an algorithm to transition between modes, achieving smooth switching maneuvers between VTOL and FW flight. Thus, the VTOL-FW UAV's flight characteristics are expected to be improved by enlarging the operational flight envelope through mode transitioning, enabling agile maneuvers and increasing survivability. Experiments conducted in simulation and real-world environments show that the VTOL-FW UAV has both multirotor and airplane characteristics, with the extra benefit of an enlarged flight envelope.
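The abstract describes a transition algorithm between the multirotor and FW controllers without giving its form. One common approach (assumed here for illustration, not necessarily the authors') is to blend the two controllers' commands with a weight scheduled on airspeed, as sketched below; the start/end speeds and command vectors are hypothetical.

```python
import numpy as np

def blend_weight(airspeed, v_start=8.0, v_end=14.0):
    """Transition weight sigma in [0, 1]: 0 = pure multirotor, 1 = pure fixed-wing.
    Scheduled linearly on airspeed between assumed start/end speeds [m/s]."""
    return float(np.clip((airspeed - v_start) / (v_end - v_start), 0.0, 1.0))

def transition_command(u_multirotor, u_fixedwing, airspeed):
    """Convex blend of the two mode controllers' actuator commands."""
    sigma = blend_weight(airspeed)
    return (1.0 - sigma) * np.asarray(u_multirotor) + sigma * np.asarray(u_fixedwing)

# Example: hypothetical throttle/elevator commands from the two mode controllers.
u_mr = [0.65, 0.00]   # multirotor controller output
u_fw = [0.45, -0.05]  # fixed-wing controller output
for v in (5.0, 10.0, 15.0):
    print(f"airspeed {v:4.1f} m/s -> blended command {transition_command(u_mr, u_fw, v)}")
```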

Keywords: aircraft design, linear analysis, mode transitioning control, UAV

Procedia PDF Downloads 395
3346 South-Mediterranean Oaks Forests Management in Changing Climate Case of the National Park of Tlemcen-Algeria

Authors: K. Bencherif, M. Bellifa

Abstract:

The expected climatic changes in North Africa are an increase in both the intensity and frequency of summer droughts and a reduction in water availability during the growing season. The existing coppices and forest formations in the national park of Tlemcen are dominated by holm oak, zeen oak and cork oak. These open, fragmented structures do not seem strong enough to offer durable protection against climate change. Given the observed climatic tendency, the objective is to analyze the climatic context and its evolution, taking into account the likely behaviour of the oak species during the next 20-30 years on the one hand, and the landscape context in relation to the most suitable silvicultural models to choose, and especially in relation to human activities, on the other. The study methodology is based on climatic synthesis and on floristic and spatial analysis. Meteorological data for the decade 1989-2009 are used to characterize the current climate. Another approach, based on a dendrochronological analysis of a 120-year-old Aleppo pine stem growing in the park, is used to analyze the evolution of the climate over one century. Results on climate evolution over the next 50 years, obtained through predictive climate models, are used to predict the climatic tendency in the park. Spatially, stratified sampling is carried out in each forest unit of the park to reduce the degree of heterogeneity and to delineate the different stands easily using GPS. Results from a previous study are used to analyze the anthropogenic factor. According to the forecasts for the period 2025-2100, the number of warm days with a temperature over 25°C would increase from 30 to 70. The monthly mean temperatures of the maxima (M) and the minima (m) would rise from 30.5°C to 33°C and from 2.3°C to 4.8°C, respectively. With an average drop of 25%, precipitation would be reduced to 411.37 mm. These new data highlight the importance of fire risk and of the water stress that would affect the vegetation and the regeneration process. Spatial analysis highlights the forest and agricultural dimensions of the park compared to urban habitat and bare soils. Maps show both the state of fragmentation and the regression of the forest surface (50% of the total surface). At the level of the park, fires have already affected all types of cover, creating low structures of various densities. On the silvicultural side, zeen oak forms pure stands in some places, and this expansion must be considered a natural tendency in which zeen oak becomes the structuring species. Climate-related changes are minor compared with the real impact that South-Mediterranean forests are undergoing because of the human pressures they support. Nevertheless, the hardwood oak stands of the national park of Tlemcen will have to face unexpected climate changes such as a changing rainfall regime associated with a lengthening of the period of water stress, heavy rainfall and/or sudden cold snaps. Faced with these new conditions, management based on a mixed uneven-aged high-forest method promoting the more dynamic species could be an appropriate measure.

Keywords: global warming, mediterranean forest, oak shrub-lands, Tlemcen

Procedia PDF Downloads 389
3345 Application and Verification of Regression Model to Landslide Susceptibility Mapping

Authors: Masood Beheshtirad

Abstract:

Identification of regions with potential for landslide occurrence is one of the basic measures in natural resources management. Different landslide hazard mapping models have been proposed, depending on environmental conditions and goals. In this research, a landslide hazard map was produced using a multiple regression model, and the applicability of this model was investigated in the Baghdasht watershed. The dependent variable is the landslide inventory map, and the independent variables consist of information layers on geology, slope, aspect, distance from river, distance from road, faults and land use. For this purpose, existing landslides were identified and an inventory map was made. The landslide hazard map was then derived from the multiple regression. The agreement between the potential hazard classes and values of this model and the landslide inventory map was assessed in the SPSS environment. The results showed a significant correlation between the potential hazard classes and values and the area of the landslides. The multiple regression model is therefore suitable for application in the Baghdasht watershed.
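As a hedged sketch of this kind of susceptibility mapping (the layers, coefficients and landslide inventory below are placeholders, not the study's data), a multiple regression of landslide presence on rasterized conditioning factors yields a continuous hazard score that can be reclassified into hazard classes.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_cells = 10_000  # flattened raster cells of the watershed

# Hypothetical conditioning-factor layers (stand-ins for slope, aspect,
# distance-to-river and distance-to-road after rasterisation).
X = np.column_stack([
    rng.uniform(0, 45, n_cells),      # slope [deg]
    rng.uniform(0, 360, n_cells),     # aspect [deg]
    rng.uniform(0, 2000, n_cells),    # distance from river [m]
    rng.uniform(0, 2000, n_cells),    # distance from road [m]
])
# Hypothetical landslide inventory (1 = landslide cell, 0 = stable cell).
y = (rng.random(n_cells) < 0.02 + 0.2 * X[:, 0] / 45).astype(float)

model = LinearRegression().fit(X, y)
susceptibility = model.predict(X)   # continuous hazard score per cell
classes = np.digitize(susceptibility, np.quantile(susceptibility, [0.5, 0.75, 0.9, 0.97]))
print("cells per hazard class (low -> very high):", np.bincount(classes))
```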

Keywords: landslide, mapping, multiple model, regression

Procedia PDF Downloads 324
3344 Influence of the Low Frequency Ultrasound on the Cadmium (II) Biosorption by an Ecofriendly Biocomposite (Extraction Solid Waste of Ammi visnaga / Calcium Alginate): Kinetic Modeling

Authors: L. Nouri Taiba, Y. Bouhamidi, F. Kaouah, Z. Bendjama, M. Trari

Abstract:

In the present study, an ecofriendly biocomposite, namely calcium alginate immobilized Ammi visnaga (Khella) extraction waste (SWAV/CA), was prepared by the electrostatic extrusion method and used for cadmium biosorption from the aqueous phase, with and without the assistance of ultrasound, under batch conditions. The influence of low-frequency ultrasound (37 and 80 kHz) on the cadmium biosorption kinetics was studied. The obtained results show that ultrasonic irradiation significantly enhances and improves the efficiency of cadmium removal. The pseudo-first-order, pseudo-second-order, intraparticle diffusion and Elovich models were evaluated using non-linear curve fitting analysis. Modeling of the kinetic results shows that the biosorption process is best described by the pseudo-second-order and Elovich models, both in the absence and in the presence of ultrasound.
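A hedged sketch of the non-linear fitting step follows: the pseudo-second-order and Elovich models are fitted to an uptake curve with scipy's curve_fit. The time/uptake values are placeholders, not the measured SWAV/CA data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Pseudo-second-order: q(t) = qe^2 * k2 * t / (1 + qe * k2 * t)
def pso(t, qe, k2):
    return qe**2 * k2 * t / (1.0 + qe * k2 * t)

# Elovich: q(t) = (1/b) * ln(1 + a*b*t)
def elovich(t, a, b):
    return np.log(1.0 + a * b * t) / b

# Hypothetical uptake data q_t [mg/g] vs time [min] (placeholders only).
t = np.array([5, 10, 20, 30, 45, 60, 90, 120], dtype=float)
q = np.array([8.1, 12.5, 17.0, 19.2, 21.0, 21.9, 22.8, 23.2])

p_pso, _ = curve_fit(pso, t, q, p0=[q.max(), 0.01])
p_elo, _ = curve_fit(elovich, t, q, p0=[5.0, 0.2])

def r2(y, yhat):
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

print("pseudo-second-order: qe=%.2f k2=%.4f R2=%.4f" % (*p_pso, r2(q, pso(t, *p_pso))))
print("Elovich:             a=%.2f  b=%.3f  R2=%.4f" % (*p_elo, r2(q, elovich(t, *p_elo))))
```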

Keywords: biocomposite, biosorption, cadmium, non-linear analysis, ultrasound

Procedia PDF Downloads 277
3343 Assessing Efficiency Trends in the Indian Sugar Industry

Authors: S. P. Singh

Abstract:

This paper measures the technical and scale efficiencies of 40 Indian sugar companies for the period from 2004-05 to 2013-14. The efficiencies are estimated through input-oriented DEA models using one output variable—value of output (VOP)—and five input variables—capital cost (CA), employee cost (EMP), raw material (RW), energy and fuel (E&F) and other manufacturing expenses (OME). The sugar companies are classified into integrated and non-integrated categories to determine which achieves a higher level of efficiency. Sources of inefficiency in the industry are identified by decomposing overall technical efficiency (TE) into pure technical efficiency (PTE) and scale efficiency (SE). The paper also estimates input-reduction targets for relatively inefficient companies and suggests measures to improve their efficiency level. The findings reveal that TE does not show any clear trend; rather, it fluctuates across years, largely due to the erratic and cyclical pattern of sugar production. Further, technical inefficiency in the industry seems to be driven more by managerial inefficiency than by scale inefficiency, which implies that TE can be improved through better conversion of inputs into output.
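A hedged sketch of the input-oriented DEA envelopment model behind such a decomposition is given below, using toy data rather than the 40 companies' figures: overall TE comes from the constant-returns (CCR) model, PTE from the variable-returns (BCC) model obtained by adding the convexity constraint, and SE = TE / PTE.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: rows = DMUs (companies), columns = inputs / outputs (placeholders).
X = np.array([[4.0, 3.0], [6.0, 2.0], [5.0, 5.0], [8.0, 4.0]])   # inputs
Y = np.array([[2.0], [3.0], [4.0], [3.0]])                        # outputs
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

def efficiency(o, vrs=False):
    """Input-oriented envelopment LP for DMU o: minimize theta over z = [theta, lambda_1..n]."""
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    rows = [np.r_[-X[o, i], X[:, i]] for i in range(m)]
    # Outputs: -sum_j lambda_j * y_rj <= -y_ro
    rows += [np.r_[0.0, -Y[:, r]] for r in range(s)]
    b_ub = np.zeros(m + s)
    b_ub[m:] = -Y[o, :]
    A_eq, b_eq = (np.r_[0.0, np.ones(n)].reshape(1, -1), [1.0]) if vrs else (None, None)
    res = linprog(c, A_ub=np.array(rows), b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

for o in range(n):
    te = efficiency(o, vrs=False)    # overall technical efficiency (CRS)
    pte = efficiency(o, vrs=True)    # pure technical efficiency (VRS)
    print(f"DMU {o}: TE={te:.3f}  PTE={pte:.3f}  SE={te / pte:.3f}")
```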

Keywords: DEA, slacks, sugar industry, technical efficiency

Procedia PDF Downloads 318
3342 Interdisciplinary Method Development - A Way to Realize the Full Potential of Textile Resources

Authors: Nynne Nørup, Julie Helles Eriksen, Rikke M. Moalem, Else Skjold

Abstract:

Despite a growing focus on the high environmental impact of textiles, textile waste has only recently been considered part of the waste field. Consequently, there is a general lack of knowledge and data within this field. In particular, the lack of a common perception of textiles generates several problems, e.g., in recognizing the full material potential the fraction contains, which is crucial if textiles are to enter the circular economy. This study aims to qualify a method for making the resources in textile waste visible in a way that makes it possible to move them as high up the waste hierarchy as possible. Textiles are complex and cover many different types of products, fibres, combinations of fibres and production methods. In garments alone there is great variety, even when narrowing the scope to undergarments. However, textile waste is often reduced to one fraction, assessed solely by quantity and compared to the quantities of other waste fractions. Disregarding this complexity and reducing textiles to a single fraction that covers everything made of textiles increases the risk of neglecting the value of the materials, both with regard to their properties and economically. Instead of trying to fit textile waste into the current, primarily linear waste system, in which volume is a key part of the business models, this study focused on integrating textile waste as a resource in the design and production phase. The study combined methods for determining the replacement rates used in life cycle assessment and mass flow analysis with the designer's toolbox, thereby activating the properties of textile waste in a way that can unleash its potential optimally. It was hypothesized that by drawing on Denmark's tradition of design and its high level of craftsmanship, it is possible to find solutions that can be used today and to create circular resource models that reduce the use of virgin fibres. Through waste samples, case studies and the testing of various design approaches, this study explored how to operationalize the method so that the product, after end-use, is kept as a material and only then processed at fibre level to obtain the best environmental utilization. The study showed that the designers' ability to decode the properties of the materials and their understanding of craftsmanship were decisive for how well the materials could be utilized today. The later in the life cycle the textiles appeared as waste, the more demanding the description of the materials had to be in order to be sufficient, especially if the best possible use of the resources, and thus a higher replacement rate, is to be achieved. In addition, it also required adaptation of current production, because the materials often varied more. The study found good indications that part of the solution is to use geodata, i.e., information on where in the life cycle the materials were discarded. An important conclusion is that a fully developed method can help support better utilization of textile resources. However, it still requires a better understanding of materials by designers, as well as structural changes in business and society.

Keywords: circular economy, development of sustainable processes, environmental impacts, environmental management of textiles, environmental sustainability through textile recycling, interdisciplinary method development, resource optimization, recycled textile materials and the evaluation of recycling, sustainability and recycling opportunities in the textile and apparel sector

Procedia PDF Downloads 95
3341 Barriers to Business Model Innovation in the Agri-Food Industry

Authors: Pia Ulvenblad, Henrik Barth, Jennie Cederholm Björklund, Maya Hoveskog, Per-Ola Ulvenblad

Abstract:

The importance of business model innovation (BMI) is widely recognized. This also holds for firms in the agri-food industry, which is closely connected to global challenges: worldwide food production will have to increase by 70% by 2050, and the United Nations' sustainable development goals prioritize research and innovation on food security and sustainable agriculture. Firms in the agri-food industry have opportunities to increase their competitive advantage through BMI. However, the process of BMI is complex, and the implementation of new business models is associated with a high degree of risk and failure. Thus, managers from all industries, as well as scholars, need to better understand how to address this complexity. Therefore, the research presented in this paper (i) explores different categories of barriers in the research literature on business models in the agri-food industry, and (ii) illustrates these categories of barriers with empirical cases. This study addresses the rather limited understanding of barriers to BMI in the agri-food industry through a systematic literature review (SLR) of 570 peer-reviewed journal articles that contained a combination of 'BM' or 'BMI' with agriculture-related and food-related terms (e.g., 'agri-food sector') published in the period 1990-2014. The study classifies the barriers into several categories and illustrates the identified barriers with ten empirical cases. Findings from the literature review show that barriers are mainly identified as outcomes. It can be assumed that a perceived barrier to growth is often initially exaggerated or underestimated before being challenged by appropriate measures or courses of action. What may be considered by the public mind to be a barrier could in reality be very different from an actual barrier that needs to be challenged. One way of addressing barriers to growth is to define barriers according to their origin (internal/external) and nature (tangible/intangible). The framework encompasses barriers related to the firm (internal, addressing in-house conditions) or to the industrial or national level (external, addressing environmental conditions). Tangible barriers can include asset shortages in the area of equipment or facilities, while human resource deficiencies or a negative attitude towards growth are examples of intangible barriers. Our findings are consistent with previous research on barriers to BMI, which has identified human-factor barriers (individuals' attitudes, histories, etc.), contextual barriers related to company and industry settings, and more abstract barriers (government regulations, value chain position and weather). However, human-factor barriers – and opportunities – related to family-owned businesses with idealistic values and attitudes that own the real estate where the business is situated are more frequent in the agri-food industry than in other industries. This paper contributes by generating a classification of the barriers to BMI as well as by illustrating them with empirical cases. We argue that internal barriers such as human-factor barriers, values and attitudes are crucial to overcome in order to develop BMI. However, they can be as hard to overcome as, for example, institutional barriers such as government regulations. The implications for research and practice are to focus on cognitive barriers and to develop the BMI capability of the owners and managers of agri-food firms.

Keywords: agri-food, barriers, business model, innovation

Procedia PDF Downloads 233
3340 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model

Authors: Chaudhuri Manoj Kumar Swain, Susmita Das

Abstract:

This paper presents a detailed procedure for predicting a path loss (PL) model and its application to estimating the coverage probability in a WiMAX network. A hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis and regression analysis are the phases of operation incorporated in this approach, and the importance of each of these phases is discussed. The procedure for collecting data such as the received signal strength indicator (RSSI) through the experimental set-up is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with regression techniques. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent as well as the coverage probability of the network are evaluated. This research work may assist significantly in the deployment and optimisation of any cellular network.
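A hedged sketch of the regression and coverage steps is shown below: a log-distance path loss model is fitted to RSSI measurements to extract the PL exponent and shadowing spread, and the coverage probability at a given distance follows from the Gaussian tail. The distances, RSSI values, transmit power and threshold are placeholders, not the measured 2.65 GHz data.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical drive-test data: distances [m] and measured RSSI [dBm] (placeholders).
d = np.array([50, 100, 200, 400, 600, 800, 1000, 1500], dtype=float)
rssi = np.array([-58, -65, -74, -83, -88, -92, -95, -101], dtype=float)

tx_power = 43.0               # assumed EIRP [dBm]
pl = tx_power - rssi          # measured path loss [dB]

# Log-distance model: PL(d) = PL(d0) + 10*n*log10(d/d0) + shadowing
d0 = 50.0
slope, intercept = np.polyfit(10 * np.log10(d / d0), pl, 1)
n_exp = slope                                                        # path-loss exponent
sigma = np.std(pl - (intercept + slope * 10 * np.log10(d / d0)))     # shadowing std [dB]

# Coverage probability at distance d_eval for a receiver threshold [dBm],
# assuming log-normal shadowing around the fitted mean.
d_eval, threshold = 1200.0, -100.0
mean_rssi = tx_power - (intercept + slope * 10 * np.log10(d_eval / d0))
p_cov = 1 - norm.cdf((threshold - mean_rssi) / sigma)
print(f"path-loss exponent n = {n_exp:.2f}, shadowing sigma = {sigma:.2f} dB")
print(f"coverage probability at {d_eval:.0f} m: {p_cov:.3f}")
```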

Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis

Procedia PDF Downloads 177
3339 Effect of Amlodipine on Dichlorvos-Induced Seizure in Mice

Authors: Omid Ghollipoor Bashiri, Farzam Hatefi

Abstract:

Dichlorvos, a synthetic organophosphate poison, is used as an insecticide. Such toxins are used as insecticides in agriculture and in medicine for the destruction and/or eradication of ectoparasites of animals. Studies have shown that dichlorvos induces seizures in different animals. Amlodipine, a dihydropyridine calcium channel blocker, is widely used for the treatment of cardiovascular diseases. Studies have shown that calcium channel blockers have anticonvulsant effects in different animal models. The aim of this study was to determine the effect of amlodipine on dichlorvos-induced seizures in mice. In this experiment, the animals received different doses of amlodipine (2.5, 5, 10, 20 and 40 mg/kg b.wt.) intraperitoneally 30 min before an intraperitoneal injection of dichlorvos (50 mg/kg b.wt.). After the dichlorvos injection, clonic and tonic seizures and, finally, the outcome were recorded. The results showed that amlodipine dose-dependently reduced the severity of dichlorvos-induced seizures, with anticonvulsant effects from the lowest effective dose of 5 mg/kg b.wt. (p<0.05) up to 40 mg/kg b.wt. (p<0.001). The anticonvulsant activity of amlodipine is suggested to be due to its antagonistic effect on voltage-dependent calcium channels.

Keywords: dichlorvos, amlodipine, seizures, mice

Procedia PDF Downloads 306
3338 Effects of Aging on Thermal Properties of Some Improved Varieties of Cassava (Manihot Esculenta) Roots

Authors: K. O. Oriola, A. O. Raji, O. E. Akintola, O. T. Ismail

Abstract:

Thermal properties of the roots of three improved cassava varieties (TME 419, TMS 30572 and TMS 0326) were determined on samples harvested at 12, 15 and 18 months after planting (MAP) and conditioned to moisture contents of 50, 55, 60, 65 and 70% (wb). Thermal conductivity at 12, 15 and 18 MAP ranged from 0.4770 to 0.6052 W/m.K, from 0.4804 to 0.5530 W/m.K and from 0.3764 to 0.6102 W/m.K, respectively; thermal diffusivity ranged from 1.588 to 2.426 × 10⁻⁷ m²/s, from 1.290 to 2.010 × 10⁻⁷ m²/s and from 0.1692 to 4.464 × 10⁻⁷ m²/s; and specific heat capacity ranged from 2.3626 to 3.8991 kJ/kg.K, from 1.8110 to 3.9703 kJ/kg.K and from 1.7311 to 3.8830 kJ/kg.K, respectively, within the range of moisture contents studied across the varieties. None of the samples, over the ages studied, showed a trend similar to the others, or a definite trend, in variation with moisture content. However, second-order polynomial models fitted all the data. Age, on the other hand, had a significant effect on the three thermal properties studied for TME 419, but not on the thermal conductivity of TMS 30572 or the specific heat capacity of TMS 0326. The information obtained will provide better insight into the thermal processing of cassava roots into stable products.
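As a hedged sketch of the reported second-order polynomial fitting (the conductivity/moisture values below are placeholders chosen within the reported ranges, not the measured data), a property can be regressed on moisture content as follows.

```python
import numpy as np

# Hypothetical thermal-conductivity readings vs moisture content for one
# variety/age combination (placeholder values within the reported range).
moisture = np.array([50, 55, 60, 65, 70], dtype=float)   # % wb
k = np.array([0.48, 0.51, 0.55, 0.58, 0.60])              # W/m.K

# Second-order polynomial model k = a*mc^2 + b*mc + c, as reported to fit the data.
a, b, c = np.polyfit(moisture, k, 2)
k_hat = np.polyval([a, b, c], moisture)
r2 = 1 - np.sum((k - k_hat) ** 2) / np.sum((k - k.mean()) ** 2)
print(f"k = {a:.5f}*mc^2 + {b:.4f}*mc + {c:.3f}   (R^2 = {r2:.3f})")
```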

Keywords: thermal conductivity, thermal diffusivity, specific heat capacity, moisture content, tuber age

Procedia PDF Downloads 520
3337 Numerical Simulation of Wishart Diffusion Processes

Authors: Raphael Naryongo, Philip Ngare, Anthony Waititu

Abstract:

This paper deals with the numerical simulation of Wishart processes for a single-asset risky pricing model whose volatility is described by a Wishart affine diffusion process. The multi-factor specification of volatility makes the model flexible enough to fit stock market data at short or long maturities for better returns. The Wishart process is a stochastic process that is a positive semi-definite, matrix-valued generalization of the square-root process. The aim of the study is to model log asset stock returns under the double Wishart stochastic volatility model. The solution of the log-asset return dynamics for bi-Wishart processes is obtained through Euler-Maruyama discretization schemes. The numerical results on the asset returns are compared to the returns of existing models such as the Heston stochastic volatility model and the double Heston stochastic volatility model.
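A hedged sketch of a naive Euler-Maruyama step for a single Wishart volatility matrix is given below (the parameter matrices are illustrative assumptions, and an eigenvalue clip is used to keep the iterate positive semi-definite; this is not the paper's calibration or its full double-Wishart asset model).

```python
import numpy as np

rng = np.random.default_rng(42)

def sqrtm_psd(a):
    """Symmetric square root of a positive semi-definite matrix via eigendecomposition."""
    w, v = np.linalg.eigh(a)
    return (v * np.sqrt(np.clip(w, 0.0, None))) @ v.T

def euler_wishart(sigma0, omega, m, q, dt, n_steps):
    """Naive Euler-Maruyama for d Sigma = (Omega Omega' + M Sigma + Sigma M') dt
                                          + sqrt(Sigma) dW Q + Q' dW' sqrt(Sigma)."""
    d = sigma0.shape[0]
    sigma = sigma0.copy()
    path = [sigma0.copy()]
    for _ in range(n_steps):
        dw = rng.standard_normal((d, d)) * np.sqrt(dt)
        s = sqrtm_psd(sigma)
        drift = omega @ omega.T + m @ sigma + sigma @ m.T
        sigma = sigma + drift * dt + s @ dw @ q + q.T @ dw.T @ s
        w, v = np.linalg.eigh((sigma + sigma.T) / 2)   # symmetrize, then clip to PSD
        sigma = (v * np.clip(w, 0.0, None)) @ v.T
        path.append(sigma.copy())
    return np.array(path)

# Illustrative 2x2 parameters (assumed, not the paper's calibration).
omega = 0.2 * np.eye(2)
m = -0.5 * np.eye(2)      # mean reversion
q = 0.15 * np.eye(2)
path = euler_wishart(0.04 * np.eye(2), omega, m, q, dt=1 / 252, n_steps=252)
print("terminal volatility matrix:\n", path[-1])
```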

Keywords: euler schemes, log-asset return, infinitesimal generator, wishart diffusion affine processes

Procedia PDF Downloads 378
3336 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem.

The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance.

The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures.

The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times.

Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows.

Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy.

The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance.

Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 67
3335 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). By considering two data sets (raw data and simulated data) and two models of the GEV distribution (stationary and non-stationary), a return-level analysis is carried out. It was found that in the stationary model the return values are constant over time for the raw data, while for the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw data and the simulated data show an increasing trend with an upper bound. This clearly shows that, although temperatures in the tropics show signs of increasing in the future, there is a maximum temperature that is not exceeded. The results of this paper are very important for agricultural and environmental research.
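A hedged sketch of the block maxima / return-level workflow is shown below, using simulated placeholder temperatures rather than the CDC records: annual maxima are extracted, a stationary GEV is fitted, and return levels are read off as high quantiles of the fitted distribution.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Hypothetical daily maximum temperatures (deg C) over 30 years (placeholder data).
daily_tmax = 30 + 3 * rng.standard_normal((30, 365))
annual_maxima = daily_tmax.max(axis=1)        # block maxima, one block = one year

# Fit a stationary GEV (note: scipy's shape parameter c = -xi in the usual convention).
c, loc, scale = genextreme.fit(annual_maxima)

# Return level for a T-year return period: the (1 - 1/T) quantile of the fitted GEV.
for T in (10, 50, 100):
    z_T = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T}-year return level: {z_T:.2f} deg C")
```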

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 478
3334 The Effect of Phase Development on Micro-Climate Change of Urban Area

Authors: Tommy Lo

Abstract:

This paper presents the changes in temperature and air ventilation of an inner urban area at different development stages during 2002 to 2012, and for the high-rise buildings to be built in 2018. The 3D simulation models ENVI-met and Autodesk Falcon were used. The results indicate that the replacement of old residential buildings or open space with high-rise buildings will increase the air temperature of the inner urban area; the air temperature at the pedestrian level will increase more than that at the upper levels. The temperature of the inner street will in future be higher than it was in 2002, 2008 and 2012. This is attributed to heat being trapped in the street canyons, as the air permeability at the pedestrian level is lower. High-rise buildings with massive podiums will further reduce the air ventilation in the area. In addition, sufficient separation between buildings is essential in design. High-rise buildings aligned along the waterfront will obstruct the wind flowing into the inner urban area and accelerate the temperature increase both in the daytime and at night.

Keywords: micro-climate change, urban design, ENVI-met, construction engineering

Procedia PDF Downloads 282
3333 Red Meat Price Volatility and Its Relationship with Crude Oil and Exchange Rate

Authors: Melek Akay

Abstract:

Turkey's agricultural commodity prices have been prone to fluctuation over time. A considerable amount of literature examines the changes in these prices in relation to other commodities such as energy, and links between agricultural and energy markets have therefore been extensively investigated. Since red meat prices are becoming increasingly volatile in Turkey, this paper analyses the price volatility of veal and lamb and the relationship between red meat, crude oil and exchange rates by applying the generalized all-period unconstrained volatility model, which generalizes the GARCH (p, q) model, to weekly data covering the period from May 2006 to February 2017. The empirical results show that veal and lamb prices exhibited volatility during the last decade, particularly between 2009 and 2012. Moreover, oil prices, as well as their previous (lagged) values, have a significant effect on veal and lamb prices. Consequently, this research can help policy makers evaluate policy implementation appropriately and reduce the impact of oil prices by supporting producers.
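The paper's generalized all-period unconstrained volatility model is not specified in the abstract. As a hedged sketch of the baseline it generalizes, the snippet below fits a standard GARCH(1,1) to weekly returns with the arch package; the price series is a simulated placeholder, not the veal/lamb data, and the use of this particular package is an assumption.

```python
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(3)

# Placeholder weekly price series standing in for a red-meat price index.
dates = pd.date_range("2006-05-01", periods=560, freq="W")
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.001, 0.02, len(dates)))), index=dates)
returns = 100 * np.log(prices).diff().dropna()      # weekly log returns in %

# Baseline GARCH(1,1) with a constant mean; crude-oil or exchange-rate returns
# could enter the mean equation as exogenous regressors in a fuller specification.
am = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1, dist="normal")
res = am.fit(disp="off")
print(res.summary())
print(res.conditional_volatility.tail())
```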

Keywords: red meat price, volatility, crude oil, exchange rates, GARCH models, Turkey

Procedia PDF Downloads 122
3332 On the Importance of Quality, Liquidity Level and Liquidity Risk: A Markov-Switching Regime Approach

Authors: Tarik Bazgour, Cedric Heuchenne, Danielle Sougne

Abstract:

We examine time variation in the market betas of portfolios sorted on quality, liquidity level and liquidity-beta characteristics across stock market phases. Using US stock market data for the period 1970-2010, we find, first, that the US stock market was driven by four regimes. Second, during the crisis regime, low (high) quality, high (low) liquidity-beta and illiquid (liquid) stocks exhibit an increase (a decrease) in their market betas. This finding is consistent with the flight-to-quality and flight-to-liquidity phenomena. Third, we document the same pattern across stocks when market volatility is low. We argue that, during low-volatility times, investors shift their portfolios towards low-quality and illiquid stocks to seek portfolio gains. The pattern observed in the tranquil regime can therefore be explained by a flight to low quality and to illiquidity. Finally, our results reveal that the liquidity level is more important than the liquidity beta during the crisis regime.
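A hedged sketch of a regime-switching market model of this kind is shown below, using simulated placeholder returns and only two regimes for brevity (the paper identifies four); the intercept, market beta and variance are allowed to switch with the latent regime.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)

# Placeholder monthly excess returns: a market factor and one sorted portfolio.
n = 480
market = pd.Series(rng.normal(0.005, 0.045, n))
portfolio = 0.9 * market + rng.normal(0, 0.02, n)   # beta ~ 0.9 by construction

# Markov-switching market model: portfolio_t = a(S_t) + b(S_t) * market_t + e_t,
# with regime-dependent intercept, beta and variance.
mod = sm.tsa.MarkovRegression(portfolio, k_regimes=2, exog=market,
                              switching_variance=True)
res = mod.fit()
print(res.summary())
print(res.smoothed_marginal_probabilities.head())   # regime probabilities over time
```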

Keywords: financial crises, quality, liquidity, liquidity risk, regime-switching models

Procedia PDF Downloads 404
3331 Influence of Cationic Surfactant (TTAB) on the Rate of Dipeptide (Gly-DL-Asp) Ninhydrin Reaction in Absence and Presence of Organic Solvents

Authors: Mohd. Akram, A. A. M. Saeed

Abstract:

Surfactants are widely used in our daily life, either directly in household and personal care products or indirectly in industrial processes. The kinetics of the interaction of glycyl-DL-aspartic acid (Gly-DL-Asp) with ninhydrin has been investigated spectrophotometrically in aqueous and organic-solvent media in the absence and presence of the cationic surfactant tetradecyltrimethylammonium bromide (TTAB). The study was carried out under different experimental conditions. First-order and fractional-order kinetics were observed with respect to [Gly-DL-Asp] and [ninhydrin], respectively. The reaction was enhanced about four-fold by TTAB micelles. The effect of organic solvents was studied at a constant concentration of TTAB and showed an increase in the absorbance as well as in the rate constant for the formation of the product (Ruhemann's purple). The results obtained in micellar media are treated quantitatively in terms of the pseudo-phase and Piszkiewicz cooperativity models. The Arrhenius and Eyring equations are valid for the reaction over the range of temperatures used, and the activation parameters (Ea, ΔH#, ΔS#, and ΔG#) have been evaluated.
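A hedged sketch of how the Arrhenius and Eyring activation parameters are extracted from temperature-dependent rate constants is given below; the rate constants and temperatures are placeholders, not the TTAB measurements.

```python
import numpy as np

R = 8.314            # J/(mol K)
kB = 1.380649e-23    # J/K
h = 6.62607015e-34   # J s

# Hypothetical observed rate constants k [1/s] at several temperatures [K]
# (placeholders, not the measured data).
T = np.array([303.0, 308.0, 313.0, 318.0, 323.0])
k = np.array([1.2e-4, 1.8e-4, 2.7e-4, 3.9e-4, 5.6e-4])

# Arrhenius: ln k = ln A - Ea/(R T)  -> linear in 1/T
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R                          # J/mol
A = np.exp(intercept)

# Eyring: ln(k/T) = ln(kB/h) + dS#/R - dH#/(R T)
slope_e, intercept_e = np.polyfit(1.0 / T, np.log(k / T), 1)
dH = -slope_e * R                        # J/mol
dS = (intercept_e - np.log(kB / h)) * R  # J/(mol K)

print(f"Ea  = {Ea / 1000:.1f} kJ/mol, A = {A:.3e} 1/s")
print(f"dH# = {dH / 1000:.1f} kJ/mol, dS# = {dS:.1f} J/(mol K)")
```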

Keywords: glycyl-DL-aspartic acid, ninhydrin, organic solvents, TTAB

Procedia PDF Downloads 384
3330 Antitrypanosomal Activity of Stigmasterol: An in silico Approach

Authors: Mohammed Auwal Ibrahim, Aminu Mohammed

Abstract:

Stigmasterol has previously been reported to possess antitrypanosomal activity in in vitro and in vivo models. However, the mechanism of its antitrypanosomal activity is yet to be elucidated. In the present study, molecular docking was used to decipher the mode of interaction and binding affinity of stigmasterol to three known antitrypanosomal drug targets, viz., adenosine kinase, ornithine decarboxylase and triose phosphate isomerase. Stigmasterol was found to bind to the selected trypanosomal enzymes with minimum binding energies of -4.2, -6.5 and -6.6 kcal/mol for adenosine kinase, ornithine decarboxylase and triose phosphate isomerase, respectively. However, hydrogen bonding was not involved in the interaction of stigmasterol with any of the three enzymes; rather, hydrophobic interactions seemed to play a vital role in the binding, which was predicted to be a non-competitive-like type of inhibition. It was concluded that binding to the three selected enzymes, especially triose phosphate isomerase, might be involved in the antitrypanosomal activity of stigmasterol, but not through hydrogen-bond interactions.

Keywords: antitrypanosomal, in silico, molecular docking, stigmasterol

Procedia PDF Downloads 278
3329 Seismic Response Analysis of Frame Structures Based on Super Joint Element Model

Authors: Li Xu, Yang Hong, T. Zhao Wen

Abstract:

Experimental results of many RC beam-column subassemblages indicate that slippage of the longitudinal beam rebar within the joint and the shear deformation of the joint core have a significant influence on the seismic behavior of the subassemblage. However, a rigid-joint assumption has generally been used in the seismic response analysis of RC frames, in which these two kinds of inelastic joint deformation are ignored. Based on the OpenSees platform, a 'Super Joint Element Model' with a more detailed inelastic mechanism is used to simulate the inelastic response of the joints. Two finite element models of a typical RC plane frame, respectively considering or ignoring the inelastic deformation of the joints, were established and analyzed under seven strong earthquake records. The simulated global and local inelastic deformations of the RC plane frame are shown and discussed. The analyses also confirm the safety of an earthquake-resistant frame designed according to Chinese codes.

Keywords: frame structure, beam-column joint, longitudinal bar slippage, shear deformation, nonlinear analysis

Procedia PDF Downloads 409
3328 Integration of Fuzzy Logic in the Representation of Knowledge: Application in the Building Domain

Authors: Hafida Bouarfa, Mohamed Abed

Abstract:

The main objective of our work is the development and validation of a system called Fuzzy Vulnerability. Fuzzy Vulnerability uses a fuzzy representation in order to tolerate imprecision in the description of constructions. In the second phase, we evaluate the similarity between the vulnerability of a new construction and those of the set of historical cases. This similarity is evaluated on two levels: 1) individual similarity, based on fuzzy aggregation techniques; 2) global similarity, which uses regular increasing monotone (RIM) linguistic quantifiers to combine the various individual similarities between two constructions. The third phase of the Fuzzy Vulnerability process consists of using the vulnerabilities of historical constructions closely similar to the current construction to deduce its estimated vulnerability. We validated our system using 50 cases. We evaluated the performance of Fuzzy Vulnerability on the basis of two basic criteria: the precision of the estimates and the tolerance of imprecision throughout the estimation process. The comparison was made with estimates produced by tedious and lengthy models. The results are satisfactory.
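The abstract does not give the aggregation operator explicitly. One standard way to combine individual similarities with a RIM quantifier (assumed here purely as an illustration, not necessarily the authors' operator) is Yager's OWA operator, whose weights are derived from a quantifier Q(r) = r^alpha:

```python
import numpy as np

def rim_quantifier(r, alpha=2.0):
    """Regular increasing monotone quantifier Q(r) = r^alpha on [0, 1]."""
    return np.clip(r, 0.0, 1.0) ** alpha

def owa_aggregate(similarities, alpha=2.0):
    """OWA aggregation of individual similarities with weights from the RIM quantifier:
    w_i = Q(i/n) - Q((i-1)/n), applied to the similarities sorted in descending order."""
    s = np.sort(np.asarray(similarities, dtype=float))[::-1]
    n = len(s)
    i = np.arange(1, n + 1)
    w = rim_quantifier(i / n, alpha) - rim_quantifier((i - 1) / n, alpha)
    return float(np.dot(w, s))

# Individual similarities between a new construction and one historical case
# (hypothetical values, one per descriptive attribute).
individual = [0.9, 0.75, 0.6, 0.8, 0.4]
print("global similarity:", round(owa_aggregate(individual, alpha=2.0), 3))
```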

Keywords: case based reasoning, fuzzy logic, fuzzy case based reasoning, seismic vulnerability

Procedia PDF Downloads 292
3327 Numerical Solution of a Mathematical Model of Vortex Using Projection Method: Applications to Tornado Dynamics

Authors: Jagdish Prasad Maurya, Sanjay Kumar Pandey

Abstract:

An inadequate understanding of the complex nature of the flow features in a tornado vortex is a major problem in modelling tornadoes. Tornadoes are violent atmospheric phenomena that appear all over the world. Modelling tornadoes aims to reduce the loss of human lives and the material damage caused by tornadoes. The dynamics of a tornado are investigated by a numerical technique, an improved version of the projection method. In this paper, the authors solve the problem for an axisymmetric tornado vortex by the said method, which uses a finite difference approach to obtain an accurate and stable solution. The conclusions drawn are that a large radial inflow velocity occurs near the ground, which leads to an increase in the tangential velocity. The increased velocity occurs close to the boundary, and the absolute maximum wind is obtained near the vortex core. The results validate previous numerical and theoretical models.
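A minimal sketch of the projection idea is given below for a 2D periodic domain with a spectral pressure solve; it is only an illustration of the predictor/Poisson/corrector structure, not the authors' axisymmetric finite-difference formulation with boundaries.

```python
import numpy as np

# Minimal Chorin-style projection step for 2D incompressible flow on a periodic grid.
n, L, nu, dt = 64, 2 * np.pi, 1e-3, 1e-3
dx = L / n
kx = np.fft.fftfreq(n, d=dx) * 2 * np.pi
KX, KY = np.meshgrid(kx, kx, indexing="ij")
K2 = KX**2 + KY**2
K2[0, 0] = 1.0                      # avoid division by zero (pressure mean set to 0)

x = np.arange(n) * dx
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.sin(X) * np.cos(Y)           # divergence-free Taylor-Green initial velocity
v = -np.cos(X) * np.sin(Y)

def ddx(f, axis):
    return (np.roll(f, -1, axis) - np.roll(f, 1, axis)) / (2 * dx)

def lap(f):
    return (np.roll(f, -1, 0) + np.roll(f, 1, 0) + np.roll(f, -1, 1)
            + np.roll(f, 1, 1) - 4 * f) / dx**2

for _ in range(100):
    # 1) predictor: advance with advection + diffusion, ignoring pressure
    u_star = u + dt * (-(u * ddx(u, 0) + v * ddx(u, 1)) + nu * lap(u))
    v_star = v + dt * (-(u * ddx(v, 0) + v * ddx(v, 1)) + nu * lap(v))
    # 2) pressure Poisson equation in Fourier space: lap(p) = div(u*) / dt
    div_hat = (1j * KX * np.fft.fft2(u_star) + 1j * KY * np.fft.fft2(v_star)) / dt
    p_hat = -div_hat / K2
    # 3) corrector: subtract dt * grad(p) to project onto the divergence-free space
    u = u_star - dt * np.real(np.fft.ifft2(1j * KX * p_hat))
    v = v_star - dt * np.real(np.fft.ifft2(1j * KY * p_hat))

div = np.real(np.fft.ifft2(1j * KX * np.fft.fft2(u) + 1j * KY * np.fft.fft2(v)))
print("max |divergence| after projection:", np.abs(div).max())
```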

Keywords: computational fluid dynamics, mathematical model, Navier-Stokes equations, tornado

Procedia PDF Downloads 353
3326 Critical Review Whether Restricting Dietary Saturated Fat Can Reduce the Risk of Cardiovascular Disease

Authors: Obi Olor, Asu-Nnandi Judith, Ishiekwen Bridget

Abstract:

Despite the established observation that substituting saturated fat for carbohydrates or unsaturated fats raises low-density lipoprotein (LDL) cholesterol in humans and in animal models, the relationship of saturated fat intake to the risk of atherosclerotic cardiovascular disease (CVD) in humans remains controversial. Clinical trials that replaced saturated fat with polyunsaturated fat have, for the most part, shown a reduction in CVD events, although several studies showed no effect. An independent association of saturated fat intake with CVD risk has not been shown consistently in prospective epidemiologic studies, although some have found an increased risk in younger people and in women. Replacing saturated fat with polyunsaturated or monounsaturated fat lowers both LDL and HDL cholesterol. Given the diversity of these cardio-protective diets and their healthy components, one of the priorities in research should be to undertake more comparative trials. Such trials would determine patient acceptability, effects on surrogate markers of risk, and ultimately effects on morbidity and mortality.

Keywords: cardiovascular disease, dietary saturated fat, saturated fat, unsaturated fat

Procedia PDF Downloads 37
3325 Continuous Improvement Model for Creative Industries Development

Authors: Rolandas Strazdas, Jurate Cerneviciute

Abstract:

Creative industries are defined as those industries which produce tangible or intangible artistic and creative output and have a potential for income generation by exploiting cultural assets and producing knowledge-based goods and services (both traditional and contemporary). With the emergence of an entire sector of creative industries triggered by the development of creative products, managing creativity-based business processes becomes a critical issue. Diverse managerial practices and models for the effective management of creativity have been examined in the scholarly literature. Even though these studies suggest how creativity in organisations can be nourished, they do not sufficiently relate the proposed practices to the underlying business processes. The article analyses a range of business process improvement methods such as PDCA, DMAIC, DMADV and TOC. The strengths and weaknesses of these methods, aimed at improving the innovation development process, are identified. Based on the analysis of the existing improvement methods, a continuous improvement model was developed and is presented in the article.

Keywords: continuous improvement, creative industries, improvement model, process mapping

Procedia PDF Downloads 468
3324 A Review of Attractor Neural Networks and Their Use in Cognitive Science

Authors: Makenzy Lee Gilbert

Abstract:

This literature review explores the role of attractor neural networks (ANNs) in modeling psychological processes in artificial and biological systems. By synthesizing research from dynamical systems theory, psychology, and computational neuroscience, the review provides an overview of the current understanding of ANN function in memory formation, reinforcement, retrieval, and forgetting. Key mathematical foundations, including dynamical systems theory and energy functions, are discussed to explain the behavior and stability of these networks. The review also examines empirical applications of ANNs in cognitive processes such as semantic memory and episodic recall, and highlights the hippocampus's role in pattern separation and completion. The review addresses challenges such as catastrophic forgetting and the effects of noise on memory retrieval. By identifying gaps between theoretical models and empirical findings, it highlights the interdisciplinary nature of ANN research and suggests areas for future exploration.
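As a hedged, minimal illustration of the attractor dynamics the review discusses (a classic Hopfield network with Hebbian storage, not any specific model from the surveyed literature), the snippet below stores a few random patterns and retrieves one of them from a corrupted cue.

```python
import numpy as np

rng = np.random.default_rng(0)

n_units, n_patterns = 100, 5
patterns = rng.choice([-1, 1], size=(n_patterns, n_units))

# Hebbian storage: W = (1/N) * sum_p x_p x_p^T, with zero self-connections.
W = patterns.T @ patterns / n_units
np.fill_diagonal(W, 0.0)

def recall(cue, steps=20):
    """Asynchronous updates descend the energy function E(s) = -0.5 * s^T W s,
    so the state settles into one of the stored attractors."""
    s = cue.copy()
    for _ in range(steps):
        for i in rng.permutation(n_units):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern (flip 20% of units) and let the network retrieve it.
cue = patterns[0].copy()
flip = rng.choice(n_units, size=20, replace=False)
cue[flip] *= -1
retrieved = recall(cue)
print("overlap with stored pattern:", (retrieved @ patterns[0]) / n_units)
```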

Keywords: attractor neural networks, connectionism, computational modeling, cognitive neuroscience

Procedia PDF Downloads 28