Search results for: linear static analysis

27775 Trajectory Optimization for Autonomous Deep Space Missions

Authors: Anne Schattel, Mitja Echim, Christof Büskens

Abstract:

Trajectory planning for deep space missions has become a topic of great recent interest. Flying to space objects like asteroids poses two main challenges: one is to find rare earth elements, the other to gain scientific knowledge of the origin of the world. Due to the enormous spatial distances, such explorer missions have to be performed unmanned and autonomously. The mathematical fields of optimization and optimal control can be used to realize autonomous missions while conserving resources and making them safer. The resulting algorithms may also be applied to other, earth-bound applications such as deep-sea navigation and autonomous driving. The project KaNaRiA ('Kognitionsbasierte, autonome Navigation am Beispiel des Ressourcenabbaus im All') investigates the possibilities of cognitive autonomous navigation using the example of an asteroid mining mission, including the cruise phase and approach as well as the asteroid rendezvous, landing and surface exploration. To verify and test all methods, an interactive, real-time capable simulation using virtual reality is being developed within KaNaRiA. This paper focuses on the specific challenge of guidance during the cruise phase of the spacecraft, i.e. trajectory optimization and optimal control, including first solutions and results. In principle, there exist two ways to solve optimal control problems (OCPs), the so-called indirect and direct methods. Indirect methods have been studied for several decades, and their use requires advanced skills in optimal control theory. The main idea of direct approaches, also known as transcription techniques, is to transform the infinite-dimensional OCP into a finite-dimensional non-linear optimization problem (NLP) via discretization of states and controls. These direct methods are applied in this paper. The resulting high-dimensional NLP with constraints can be solved efficiently by special NLP methods, e.g. sequential quadratic programming (SQP) or interior point (IP) methods. The movement of the spacecraft due to the gravitational influences of the sun and other planets, as well as the thrust commands, is described through ordinary differential equations (ODEs). Competing mission aims, such as short flight times and low energy consumption, are considered by using a multi-criteria objective function. The resulting non-linear, high-dimensional optimization problems are solved using the software package WORHP ('We Optimize Really Huge Problems'), a software routine combining SQP at an outer level with IP to solve the underlying quadratic subproblems. An application-adapted model of impulsive thrusting, as well as a model of an electrically powered spacecraft propulsion system, is introduced. Different priorities and possibilities of a space mission regarding energy cost and flight time are investigated by choosing different weighting factors for the multi-criteria objective function. Varying mission trajectories are analyzed and compared, both aiming at different destination asteroids and using different propulsion systems. For the transcription, the robust method of full discretization is used. The results reinforce the need for trajectory optimization as a foundation for autonomous decision making during deep space missions. At the same time, they show the enormous increase in possible flight maneuvers gained by being able to consider different and opposing mission objectives.
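
The following is a minimal sketch of the full-discretization (direct transcription) idea described above, applied to a toy one-dimensional double-integrator transfer rather than the actual orbital dynamics, and solved with SciPy's SLSQP in place of WORHP. The dynamics, weights and horizon are illustrative assumptions, not the paper's model.

```python
# Direct transcription of a toy OCP: a 1-D double integrator must reach
# position 1 with zero velocity while spending little control energy.
# States and controls are discretized and the dynamics become equality constraints.
import numpy as np
from scipy.optimize import minimize

N, T = 40, 1.0          # number of intervals, fixed time horizon
h = T / N               # step length
w_u, w_x = 1.0, 100.0   # weights of the multi-criteria objective

def unpack(z):
    # decision vector holds states x=(pos, vel) at N+1 nodes and controls u at N nodes
    x = z[:2 * (N + 1)].reshape(N + 1, 2)
    u = z[2 * (N + 1):]
    return x, u

def objective(z):
    x, u = unpack(z)
    terminal_error = (x[-1, 0] - 1.0) ** 2 + x[-1, 1] ** 2
    return w_u * h * np.sum(u ** 2) + w_x * terminal_error

def defects(z):
    # explicit Euler collocation: x_{k+1} - x_k - h*f(x_k, u_k) = 0
    x, u = unpack(z)
    f = np.column_stack([x[:-1, 1], u])          # f = (velocity, acceleration)
    return (x[1:] - x[:-1] - h * f).ravel()

def initial_state(z):
    x, _ = unpack(z)
    return x[0]                                   # start at rest at the origin

z0 = np.zeros(2 * (N + 1) + N)
res = minimize(objective, z0, method="SLSQP",
               constraints=[{"type": "eq", "fun": defects},
                            {"type": "eq", "fun": initial_state}],
               options={"maxiter": 500})
x_opt, u_opt = unpack(res.x)
print("final state:", x_opt[-1], "control energy:", h * np.sum(u_opt ** 2))
```

Changing the weights w_u and w_x mimics the paper's idea of trading flight time or terminal accuracy against energy consumption in a multi-criteria objective.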

Keywords: deep space navigation, guidance, multi-objective, non-linear optimization, optimal control, trajectory planning.

Procedia PDF Downloads 412
27774 Neural Network in Fixed Time for Collision Detection between Two Convex Polyhedra

Authors: M. Khouil, N. Saber, M. Mestari

Abstract:

In this paper, a different architecture of a collision detection neural network (DCNN) is developed. This network, which has been examined in detail, has enabled us to solve the problem of collision detection between two convex polyhedra in fixed time (O(1)) with a new approach. We used two types of neurons, linear and threshold logic, which simplifies the actual implementation of all the networks proposed. The study of collision detection is divided into two parts: the collision between a point and a polyhedron, and then the collision between two convex polyhedra. The aim of this research is to determine, through the AMAXNET network, a minimax point in fixed time, which allows us to detect the presence of a potential collision.
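
Below is a hedged sketch of the point-versus-convex-polyhedron test that such a network parallelises: linear units evaluate the face inequalities and a threshold/max stage decides containment. It is plain NumPy, not the authors' DCNN/AMAXNET implementation, and the unit cube is an assumed example.

```python
import numpy as np

# Half-space representation A @ x <= b of the unit cube [0, 1]^3 (assumed polyhedron).
A = np.vstack([np.eye(3), -np.eye(3)])
b = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])

def linear_layer(x):
    """One 'linear neuron' per face: signed violation of each inequality."""
    return A @ x - b

def threshold_layer(violations, tol=1e-12):
    """Threshold-logic stage: fires 1 if the worst (maximum) violation is <= 0."""
    return int(np.max(violations) <= tol)

def point_in_polyhedron(x):
    # All face evaluations are independent, so a hardware network can perform
    # them in a fixed number of layers, i.e. in time independent of the query.
    return threshold_layer(linear_layer(np.asarray(x, dtype=float)))

print(point_in_polyhedron([0.5, 0.5, 0.5]))  # 1 -> inside
print(point_in_polyhedron([1.2, 0.5, 0.5]))  # 0 -> outside
```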

Keywords: collision identification, fixed time, convex polyhedra, neural network, AMAXNET

Procedia PDF Downloads 423
27773 A Generalization of the Secret Sharing Scheme Codes Over Certain Ring

Authors: Ibrahim Özbek, Erdoğan Mehmet Özkan

Abstract:

In this study, we generalize the (k,n) threshold secret sharing scheme of Özbek and Siap to codes over the ring Fq + αFq. In this way, we show that the method obtained in that article can also be used for codes over rings, and that new advantages can be obtained. The method of securely sharing a key in cryptography, which Shamir first systematized and Massey carried over to codes, thereby became usable for all error-correcting codes. The security of this scheme is based on the hardness of the syndrome decoding problem. An open study area is also left for those working on other rings and code classes. All error-correcting codes fall within the scope of this method.
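
As a point of reference, the following is a sketch of the classical Shamir (k, n) threshold scheme over a prime field, the baseline that the abstract's construction generalises; it works over GF(p) rather than over codes on the ring Fq + αFq, and the modulus chosen here is an illustrative assumption.

```python
# Classical Shamir (k, n) threshold secret sharing over a prime field.
import random

P = 2**127 - 1  # a Mersenne prime used as the field modulus (illustrative choice)

def share(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(123456789, k=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares suffice
```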

Keywords: secret sharing scheme, linear codes, algebra, finite rings

Procedia PDF Downloads 75
27772 Quadrotor in Horizontal Motion Control and Maneuverability

Authors: Ali Oveysi Sarabi

Abstract:

In this paper, controller design for the attitude and altitude dynamics of an outdoor quadrotor, constructed with low-cost actuators and drivers, is addressed. Before designing the controllers, the quadrotor is modeled mathematically in the Matlab-Simulink environment. To control the attitude dynamics, linear quadratic regulator (LQR) based controllers are designed, simulated and applied to the system. Two different proportional-integral-derivative (PID) controllers are designed to control the yaw and altitude dynamics. During the implementation of the designed controllers, different test setups are used. The designed controllers are implemented and tuned on the real system using xPC Target. Tests show that these basic control structures are successful in controlling the attitude and altitude dynamics.
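
A minimal sketch of the LQR design step mentioned above, applied to a linearised single-axis attitude model (angle and rate). The inertia value and the Q/R weights are illustrative assumptions, not the paper's identified model.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

J = 0.02                      # assumed moment of inertia about one axis [kg m^2]
A = np.array([[0.0, 1.0],     # state: [angle, angular rate]
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0 / J]])
Q = np.diag([10.0, 1.0])      # penalise angle error more than rate
R = np.array([[0.1]])         # control effort weight

P = solve_continuous_are(A, B, Q, R)        # Riccati solution
K = np.linalg.solve(R, B.T @ P)             # LQR state-feedback gain, u = -K x

closed_loop_eigs = np.linalg.eigvals(A - B @ K)
print("gain K:", K)
print("closed-loop poles:", closed_loop_eigs)   # should lie in the left half-plane
```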

Keywords: helicopter balance, flight dynamics, autonomous landing, control robotics

Procedia PDF Downloads 510
27771 Modeling Spatio-Temporal Variation in Rainfall Using a Hierarchical Bayesian Regression Model

Authors: Sabyasachi Mukhopadhyay, Joseph Ogutu, Gundula Bartzke, Hans-Peter Piepho

Abstract:

Rainfall is a critical component of climate governing vegetation growth and production, forage availability and quality for herbivores. However, reliable rainfall measurements are not always available, making it necessary to predict rainfall values for particular locations through time. Predicting rainfall in space and time can be a complex and challenging task, especially where the rain gauge network is sparse and measurements are not recorded consistently for all rain gauges, leading to many missing values. Here, we develop a flexible Bayesian model for predicting rainfall in space and time and apply it to Narok County, situated in southwestern Kenya, using data collected at 23 rain gauges from 1965 to 2015. Narok County encompasses the Maasai Mara ecosystem, the northern-most section of the Mara-Serengeti ecosystem, famous for its diverse and abundant large mammal populations and spectacular migration of enormous herds of wildebeest, zebra and Thomson's gazelle. The model incorporates geographical and meteorological predictor variables, including elevation, distance to Lake Victoria and minimum temperature. We assess the efficiency of the model by comparing it empirically with the established Gaussian process, Kriging, simple linear and Bayesian linear models. We use the model to predict total monthly rainfall and its standard error for all 5 * 5 km grid cells in Narok County. Using the Monte Carlo integration method, we estimate seasonal and annual rainfall and their standard errors for 29 sub-regions in Narok. Finally, we use the predicted rainfall to predict large herbivore biomass in the Maasai Mara ecosystem on a 5 * 5 km grid for both the wet and dry seasons. We show that herbivore biomass increases with rainfall in both seasons. The model can handle data from a sparse network of observations with many missing values and performs at least as well as or better than four established and widely used models, on the Narok data set. The model produces rainfall predictions consistent with expectation and in good agreement with the blended station and satellite rainfall values. The predictions are precise enough for most practical purposes. The model is very general and applicable to other variables besides rainfall.
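
The sketch below illustrates one of the benchmark models mentioned in the abstract, a Gaussian-process (Kriging-type) regression, fitted to synthetic monthly rainfall with longitude, latitude and elevation as predictors. The hierarchical Bayesian model of the paper is not reproduced here, and all data are simulated.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([rng.uniform(35.0, 36.5, n),    # longitude (deg)
                     rng.uniform(-1.9, -0.8, n),    # latitude (deg)
                     rng.uniform(1500, 2900, n)])   # elevation (m)
# synthetic monthly rainfall: smooth spatial trend plus gauge noise
y = 80 + 30 * np.sin(X[:, 0]) + 0.02 * X[:, 2] + rng.normal(0, 10, n)

kernel = 1.0 * RBF(length_scale=[1.0, 1.0, 500.0]) + WhiteKernel(noise_level=25.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# predict total monthly rainfall and its standard error at a grid-cell centre
x_new = np.array([[35.8, -1.4, 2000.0]])
mean, std = gp.predict(x_new, return_std=True)
print(f"predicted rainfall: {mean[0]:.1f} mm +/- {std[0]:.1f}")
```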

Keywords: non-stationary covariance function, gaussian process, ungulate biomass, MCMC, maasai mara ecosystem

Procedia PDF Downloads 294
27770 Linguistic Analysis of the Concept ‘Relation’ in Russian and English Languages

Authors: Nadezhda Obvintceva

Abstract:

The article gives an analysis of the concept ‘relation’ from the point of view of its realization in the Russian and English languages on the basis of dictionary articles. The analysis reveals the main difference in the representation of this concept in the two languages: the number of lexemes that express its general meanings. At the end of the article, the author gives an explanation of the possible causes of this difference and touches upon the issue of analytical phenomena in the vocabulary.

Keywords: concept, comparison, lexeme, meaning, relation, semantics

Procedia PDF Downloads 498
27769 Failure Analysis of a 304 Stainless Steel Flange Crack at Pipeline Transportation of Ethylene

Authors: Parisa Hasanpour, Bahram Borooghani, Vahid Asadi

Abstract:

In the current research, the catastrophic failure of a 304 stainless steel flange in an ethylene transportation pipeline at a petrochemical refinery was studied. Cracking was found in the flange after about 78,840 h of service. Through chemical analysis and tensile tests, in addition to microstructural analysis such as optical microscopy and scanning electron microscopy (SEM) of the failed part, it was found that fatigue was responsible for the fracture of the flange, which originated from bumps and depressions on the outer surface and propagated under the vibration caused by the working conditions.

Keywords: failure analysis, 304 stainless steel, fatigue, flange, petrochemical refinery

Procedia PDF Downloads 73
27768 Boosting Crime Scene Investigations Capabilities through Crime Script Analysis

Authors: Benoit Leclerc

Abstract:

The concept of scripts and the role that crime scripts have played in criminology during the last decade are reviewed. Particularly illuminating is the potential use of scripts not only to understand and disrupt offender scripts (commonly referred to as crime scripts) but also to capture victim and guardian scripts to increase the likelihood of preventing crime. Similarly, the concept of scripts is applied to forensic science – another field that can benefit from script analysis. First, similar to guardian scripts, script analysis can illuminate the process of completing crime scene investigations for those who investigate (crime scene investigators or other professionals involved in crime scene investigations) and, as a result, provide a range of intervention points to improve the success of these investigations. Second, script analysis can also provide valuable information on offenders’ crime-commission processes for crime scene investigators and highlight a number of ‘contact points’ that could be targeted during investigations.

Keywords: crime scripts, crime scene investigation, script analysis, situational crime prevention

Procedia PDF Downloads 275
27767 Modal Analysis of Power System with a Microgrid

Authors: Burak Yildirim, Muhsin Tunay Gençoğlu

Abstract:

A microgrid (MG) is a small power grid composed of localized medium- or low-level power generation, storage systems, and loads. In this paper, the effects of an MG on power system voltage stability are shown. The MG model, designed to demonstrate these effects, was applied to the IEEE 14-bus power system, which is widely used in power system stability studies. Eigenvalue and modal analysis methods were used in the simulation studies. The results show that MGs affect system voltage stability positively by increasing the voltage instability limit for the buses of the power system in which the MG is placed.
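
A minimal sketch of the eigenvalue/modal analysis used in such studies: given a small-signal state matrix A of the linearised system (here an assumed 3×3 example, not the IEEE 14-bus model), the eigenvalues give the modes together with their damping ratios and oscillation frequencies.

```python
import numpy as np

A = np.array([[-0.5,  10.0,  0.0],     # illustrative linearised state matrix
              [-10.0, -0.5,  2.0],
              [  0.0, -1.0, -4.0]])

eigvals, eigvecs = np.linalg.eig(A)
for lam in eigvals:
    sigma, omega = lam.real, lam.imag
    freq_hz = abs(omega) / (2 * np.pi)
    damping = -sigma / np.hypot(sigma, omega) if abs(lam) > 0 else 1.0
    print(f"mode {lam:.3f}: f = {freq_hz:.3f} Hz, damping ratio = {damping:.3f}")
# Stability requires all eigenvalue real parts to be negative; poorly damped
# modes (small damping ratio) indicate the buses most at risk of instability.
```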

Keywords: eigenvalue analysis, microgrid, modal analysis, voltage stability

Procedia PDF Downloads 372
27766 Performance of the CMIP5 Models in Simulation of the Present and Future Precipitation over the Lake Victoria Basin

Authors: M. A. Wanzala, L. A. Ogallo, F. J. Opijah, J. N. Mutemi

Abstract:

The usefulness and limitations of climate information are due to the uncertainty inherent in the climate system. For any given region to achieve sustainable development, it is important to incorporate climate information into its socio-economic strategic plans. The overall objective of the study was to assess the performance of the Coupled Model Inter-comparison Project (CMIP5) models over the Lake Victoria Basin. The datasets used included observed point station data, gridded rainfall data from the Climate Research Unit (CRU) and hindcast data from eight CMIP5 models. The methodology included trend analysis, spatial analysis, correlation analysis, Principal Component Analysis (PCA), regression analysis, and categorical statistical skill scores. Analysis of the trends in the observed rainfall records indicated an increase in rainfall variability both in space and time for all seasons. The spatial patterns of the individual model outputs from MPI, MIROC, EC-EARTH and CNRM were closest to the observed rainfall patterns.
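
A hedged sketch of the PCA step named in the methodology: principal components of seasonal rainfall anomalies over a set of stations. The rainfall matrix is simulated here, not the CRU or station data used in the study.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_years, n_stations = 35, 20
common_signal = rng.normal(size=(n_years, 1))                 # basin-wide wet/dry signal
rain = 500 + 80 * common_signal + rng.normal(0, 40, size=(n_years, n_stations))

anomalies = rain - rain.mean(axis=0)          # remove each station's climatology
pca = PCA(n_components=3).fit(anomalies)

print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
pc1 = pca.transform(anomalies)[:, 0]          # leading principal-component time series
print("PC1 for the first five seasons:", np.round(pc1[:5], 1))
```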

Keywords: categorical statistics, coupled model inter-comparison project, principal component analysis, statistical downscaling

Procedia PDF Downloads 368
27765 Rheological Modeling for Shape-Memory Thermoplastic Polymers

Authors: H. Hosseini, B. V. Berdyshev, I. Iskopintsev

Abstract:

This paper presents a rheological model for producing shape-memory thermoplastic polymers. Shape-memory occurs as a result of the internal rearrangement of the structural elements of a polymer. A non-linear viscoelastic model was developed that allows qualitative and quantitative prediction of the stress-strain behavior of shape-memory polymers during heating. This research was done to develop a technique to determine the maximum possible change in size of heat-shrinkable products during heating. The rheological model used in this work is particularly suitable for defining the process parameters and the design parameters of the processing equipment.

Keywords: elastic deformation, heating, shape-memory polymers, stress-strain behavior, viscoelastic model

Procedia PDF Downloads 323
27764 Part Variation Simulations: An Industrial Case Study with an Experimental Validation

Authors: Narendra Akhadkar, Silvestre Cano, Christophe Gourru

Abstract:

Injection-molded parts are widely used in power system protection products. One of the biggest challenges in an injection molding process is shrinkage and warpage of the molded parts. All these geometrical variations may have an adverse effect on the quality of the product, functionality, cost, and time-to-market. The situation becomes more challenging in the case of intricate shapes and in mass production using multi-cavity tools. To control the effects of shrinkage and warpage, it is very important to correctly find out the input parameters that could affect the product performance. With the advances in the computer-aided engineering (CAE), different tools are available to simulate the injection molding process. For our case study, we used the MoldFlow insight tool. Our aim is to predict the spread of the functional dimensions and geometrical variations on the part due to variations in the input parameters such as material viscosity, packing pressure, mold temperature, melt temperature, and injection speed. The input parameters may vary during batch production or due to variations in the machine process settings. To perform the accurate product assembly variation simulation, the first step is to perform an individual part variation simulation to render realistic tolerance ranges. In this article, we present a method to simulate part variations coming from the input parameters variation during batch production. The method is based on computer simulations and experimental validation using the full factorial design of experiments (DoE). The robustness of the simulation model is verified through input parameter wise sensitivity analysis study performed using simulations and experiments; all the results show a very good correlation in the material flow direction. There exists a non-linear interaction between material and the input process variables. It is observed that the parameters such as packing pressure, material, and mold temperature play an important role in spread on functional dimensions and geometrical variations. This method will allow us in the future to develop accurate/realistic virtual prototypes based on trusted simulated process variation and, therefore, increase the product quality and potentially decrease the time to market.
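
The following is a minimal sketch of the full-factorial DoE logic used to probe how input-process variations propagate to a functional dimension. The factor names, coded levels and the response function are illustrative stand-ins, not the Moldflow simulations or measurements of the study.

```python
import itertools
import numpy as np

factors = {                       # coded low/high levels (assumed values)
    "packing_pressure": (-1, +1),
    "mold_temperature": (-1, +1),
    "melt_temperature": (-1, +1),
}
runs = list(itertools.product(*factors.values()))   # 2^3 full factorial

def simulated_dimension(x):
    # toy response: nominal 52.00 mm plus linear effects and one interaction
    p, tmold, tmelt = x
    return 52.00 + 0.030 * p + 0.020 * tmold + 0.008 * tmelt + 0.010 * p * tmold

y = np.array([simulated_dimension(r) for r in runs])
X = np.array(runs, dtype=float)

# main effect = average response at the +1 level minus average at the -1 level
for name, col in zip(factors, X.T):
    effect = y[col > 0].mean() - y[col < 0].mean()
    print(f"main effect of {name}: {effect:+.3f} mm")
print(f"spread of the dimension over the DoE: {y.max() - y.min():.3f} mm")
```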

Keywords: correlation, molding process, tolerance, sensitivity analysis, variation simulation

Procedia PDF Downloads 178
27763 Investigating the Influence of Activation Functions on Image Classification Accuracy via Deep Convolutional Neural Network

Authors: Gulfam Haider, Sana Danish

Abstract:

Convolutional Neural Networks (CNNs) have emerged as powerful tools for image classification, and the choice of optimizers profoundly affects their performance. The study of optimizers and their adaptations remains a topic of significant importance in machine learning research. While numerous studies have explored and advocated for various optimizers, the efficacy of these optimization techniques is still subject to scrutiny. This work aims to address the challenges surrounding the effectiveness of optimizers by conducting a comprehensive analysis and evaluation. The primary focus of this investigation lies in examining the performance of different optimizers when employed in conjunction with the popular activation function, Rectified Linear Unit (ReLU). By incorporating ReLU, known for its favorable properties in prior research, the aim is to bolster the effectiveness of the optimizers under scrutiny. Specifically, we evaluate the adjustment of these optimizers with both the original Softmax activation function and the modified ReLU activation function, carefully assessing their impact on overall performance. To achieve this, a series of experiments are conducted using a well-established benchmark dataset for image classification tasks, namely the Canadian Institute for Advanced Research dataset (CIFAR-10). The selected optimizers for investigation encompass a range of prominent algorithms, including Adam, Root Mean Squared Propagation (RMSprop), Adaptive Learning Rate Method (Adadelta), Adaptive Gradient Algorithm (Adagrad), and Stochastic Gradient Descent (SGD). The performance analysis encompasses a comprehensive evaluation of the classification accuracy, convergence speed, and robustness of the CNN models trained with each optimizer. Through rigorous experimentation and meticulous assessment, we discern the strengths and weaknesses of the different optimization techniques, providing valuable insights into their suitability for image classification tasks. By conducting this in-depth study, we contribute to the existing body of knowledge surrounding optimizers in CNNs, shedding light on their performance characteristics for image classification. The findings gleaned from this research serve to guide researchers and practitioners in making informed decisions when selecting optimizers and activation functions, thus advancing the state-of-the-art in the field of image classification with convolutional neural networks.
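
A hedged sketch of the experimental setup described above: a small ReLU CNN on CIFAR-10 trained with several optimizers. The architecture and the three-epoch budget are illustrative assumptions, not the exact network or schedule used in the study.

```python
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

def build_model():
    # small ReLU CNN with a softmax output layer
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(32, 32, 3)),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

optimizers = ["adam", "rmsprop", "adadelta", "adagrad", "sgd"]
for name in optimizers:
    model = build_model()
    model.compile(optimizer=name, loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=3, batch_size=128, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    print(f"{name:>8s}: test accuracy = {acc:.3f}")
```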

Keywords: deep neural network, optimizers, RMSprop, ReLU, stochastic gradient descent

Procedia PDF Downloads 125
27762 Characteristics and Drivers of Greenhouse Gas (GHG) Emissions from China’s Manufacturing Industry: A Threshold Analysis

Authors: Rong Yuan, Zhao Tao

Abstract:

Only a handful of studies have used non-linear models to investigate the influencing factors of greenhouse gas (GHG) emissions in China’s manufacturing sectors, and the mechanism of the correlation between economic development and GHG emissions has rarely been investigated quantitatively and systematically while considering the inherent differences among manufacturing sub-sectors. Considering sectoral characteristics, the varying impacts of output on GHG emissions across manufacturing sub-sectors may be explained by the different development modes of each sub-sector, such as investment scale, technology level and the level of international competition. In order to assess the environmental impact associated with any specific level of economic development and explore the factors that affect GHG emissions in China’s manufacturing industry during the process of economic growth, this paper uses the threshold Stochastic Impacts by Regression on Population, Affluence and Technology (STIRPAT) model to investigate the drivers of GHG emissions in China’s manufacturing sectors at different stages of economic development. A data set covering 28 manufacturing sectors over an 18-year period was used. The results demonstrate that output per capita and investment scale contribute to increasing GHG emissions, while energy efficiency, R&D intensity and FDI mitigate GHG emissions. The results also verify the non-linear effect of output per capita on emissions: (1) the Environmental Kuznets Curve (EKC) hypothesis is supported once the threshold point of RMB 31.19 million is surpassed; (2) the driving strength of output per capita on GHG emissions becomes stronger with increasing investment scale; (3) a threshold exists for energy efficiency, with a positive coefficient first and a negative coefficient later; (4) the coefficient of output per capita on GHG emissions decreases as R&D intensity increases; and (5) FDI shows a reduction in elasticity once its threshold is surpassed.
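
The sketch below illustrates the threshold-regression idea behind a threshold STIRPAT model: the elasticity of emissions with respect to output per capita is allowed to switch once a threshold variable crosses an estimated threshold, found here by a grid search over candidate thresholds. The data are simulated, not the 28-sector Chinese panel used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
log_output = rng.uniform(0, 4, n)                        # log(output per capita), arbitrary units
threshold_var = log_output                               # output itself acts as the threshold variable
true_tau = 2.0
beta = np.where(threshold_var <= true_tau, 0.9, 0.4)     # elasticity falls beyond the threshold
log_ghg = 1.0 + beta * log_output + rng.normal(0, 0.2, n)

def sse_for_threshold(tau):
    regime = (threshold_var > tau).astype(float)
    # regressors: intercept, log_output, and log_output interacted with the regime dummy
    X = np.column_stack([np.ones(n), log_output, log_output * regime])
    coef, *_ = np.linalg.lstsq(X, log_ghg, rcond=None)
    fitted = X @ coef
    return np.sum((log_ghg - fitted) ** 2), coef

candidates = np.quantile(threshold_var, np.linspace(0.15, 0.85, 71))
best_tau, (_, best_coef) = min(
    ((tau, sse_for_threshold(tau)) for tau in candidates), key=lambda t: t[1][0])
print(f"estimated threshold: {best_tau:.2f} (true {true_tau})")
print(f"elasticity below/above threshold: {best_coef[1]:.2f} / {best_coef[1] + best_coef[2]:.2f}")
```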

Keywords: China, GHG emissions, manufacturing industry, threshold STIRPAT model

Procedia PDF Downloads 428
27761 Exploration and Evaluation of the Effect of Multiple Countermeasures on Road Safety

Authors: Atheer Al-Nuaimi, Harry Evdorides

Abstract:

Every day many people die or are disabled or injured on roads around the world, which necessitates more specific treatments for transportation safety issues. The International Road Assessment Program (iRAP) model is one of the comprehensive road safety models that account for many factors affecting road safety in a cost-effective way in low- and middle-income countries. In the iRAP model, road safety is divided into five star ratings, from 1 star (the lowest level) to 5 stars (the highest level). These star ratings are based on a star rating score, which is calculated by the iRAP methodology from road attributes, traffic volumes and operating speeds. The outcomes of the iRAP methodology are the treatments that can be used to improve road safety and reduce the numbers of fatalities and serious injuries (FSI). These countermeasures can be applied separately as a single countermeasure or combined as multiple countermeasures for a location. There is general agreement that the effectiveness of a countermeasure is liable to consistent losses when it is used in combination with other countermeasures; that is, the crash reduction estimates of individual countermeasures cannot simply be added together. The iRAP model therefore makes use of multiple-countermeasure adjustment factors to predict reductions in the effectiveness of road safety countermeasures when more than one countermeasure is chosen. A multiple-countermeasure correction factor is calculated for every 100-meter segment and for every accident type. However, limitations of this methodology include a probable over-estimation of the predicted crash reduction. This study aims to adjust this correction factor by developing new models to calculate the effect of using multiple countermeasures on the number of fatalities for a location or an entire road. Regression models have been used to establish relationships between crash frequencies and the factors that affect their rates. Multiple linear regression, negative binomial regression, and Poisson regression techniques were used to develop models that can address the effectiveness of using multiple countermeasures. Analyses conducted using R (The R Project for Statistical Computing) showed that a model developed with the negative binomial regression technique could give more reliable predictions of the number of fatalities after the implementation of multiple road safety countermeasures than the results from the iRAP model. The results also showed that the negative binomial regression approach gives more precise results than the multiple linear and Poisson regression techniques because of overdispersion and standard error issues.
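
Below is a hedged sketch of a negative binomial crash-frequency model of the kind favoured in the abstract, fitted with Python's statsmodels on simulated segment data (fatality counts versus the number of countermeasures and traffic volume). It is not the authors' fitted model, the iRAP data, or the R code used in the study.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 800
n_countermeasures = rng.integers(0, 5, n)          # countermeasures applied on a segment
log_aadt = rng.normal(9.0, 0.5, n)                 # log annual average daily traffic

# simulate over-dispersed counts: each extra countermeasure lowers expected fatalities
mu = np.exp(-6.0 + 0.8 * log_aadt - 0.25 * n_countermeasures)
y = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))

X = sm.add_constant(np.column_stack([log_aadt, n_countermeasures]))
model = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(model.summary())
# exp(coefficient) on the countermeasure term approximates the multiplicative
# change in expected fatalities per additional countermeasure
print("fatality multiplier per countermeasure:", np.exp(model.params[2]))
```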

Keywords: international road assessment program, negative binomial, road multiple countermeasures, road safety

Procedia PDF Downloads 240
27760 Study of Electron Cyclotron Resonance Acceleration by Cylindrical TE₀₁₁ Mode

Authors: Oswaldo Otero, Eduardo A. Orozco, Ana M. Herrera

Abstract:

In this work, we present results from analytical and numerical studies of electron acceleration by a TE₀₁₁ cylindrical microwave mode in a static homogeneous magnetic field under the electron cyclotron resonance (ECR) condition. The stability of the orbits is analyzed using particle orbit theory. In order to get a better understanding of the wave-particle interaction, we decompose the azimuthal electric field component as the superposition of right- and left-hand circularly polarized standing waves. The trajectory, energy and phase shift of the electron are found through a numerical solution of the relativistic Newton-Lorentz equation using the finite-difference Boris method. It is shown that an electron longitudinally injected with an energy of 7 keV at a radial position r=Rc/2, where Rc is the cavity radius, is accelerated up to an energy of 90 keV by an electric field strength of 14 kV/cm and a frequency of 2.45 GHz. This energy can be used to produce X-rays for medical imaging. These results can be used as a starting point for studying the acceleration of electrons in a magnetic field changing slowly in time (GYRAC), which has some important applications such as the electron cyclotron resonance ion proton accelerator (ECR-IPAC) for cancer therapy and the control of plasma bunches with relativistic electrons.
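
The following is a hedged sketch of the relativistic Boris pusher named in the abstract, applied to an electron in a uniform axial magnetic field with a rotating RF electric field at 2.45 GHz. The field amplitudes, initial conditions and step count are illustrative; this is not the full TE₀₁₁ cavity field of the paper.

```python
import numpy as np

q, m, c = -1.602e-19, 9.109e-31, 2.998e8        # electron charge, mass, speed of light
f_rf = 2.45e9                                   # RF frequency [Hz]
omega = 2 * np.pi * f_rf
B0 = m * omega / abs(q)                         # resonant magnetic field (~0.0875 T)
E0 = 1.4e5                                      # electric field amplitude [V/m] (assumed)
dt = 1.0 / (f_rf * 100)                         # 100 steps per RF period

def fields(t):
    E = E0 * np.array([np.cos(omega * t), np.sin(omega * t), 0.0])  # rotating E field
    B = np.array([0.0, 0.0, B0])
    return E, B

x = np.zeros(3)
u = np.zeros(3)                                 # u = gamma * v
for step in range(20000):
    E, B = fields(step * dt)
    # half electric kick
    u_minus = u + (q * dt / (2 * m)) * E
    gamma = np.sqrt(1.0 + np.dot(u_minus, u_minus) / c**2)
    # magnetic rotation (Boris rotation vectors)
    t_vec = (q * dt / (2 * m * gamma)) * B
    s_vec = 2 * t_vec / (1.0 + np.dot(t_vec, t_vec))
    u_prime = u_minus + np.cross(u_minus, t_vec)
    u_plus = u_minus + np.cross(u_prime, s_vec)
    # second half electric kick
    u = u_plus + (q * dt / (2 * m)) * E
    gamma = np.sqrt(1.0 + np.dot(u, u) / c**2)
    x = x + (u / gamma) * dt

kinetic_keV = (gamma - 1.0) * m * c**2 / 1.602e-16
print(f"kinetic energy after {step + 1} steps: {kinetic_keV:.1f} keV")
```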

Keywords: Boris method, electron cyclotron resonance, finite difference method, particle orbit theory, X-ray

Procedia PDF Downloads 159
27759 Miniaturized PVC Sensors for Determination of Fe2+, Mn2+ and Zn2+ in Buffalo-Cows’ Cervical Mucus Samples

Authors: Ahmed S. Fayed, Umima M. Mansour

Abstract:

Three polyvinyl chloride membrane sensors were developed for the electrochemical evaluation of ferrous, manganese and zinc ions. The sensors were used for assaying metal ions in cervical mucus (CM) of Egyptian river buffalo-cows (Bubalus bubalis) as their levels vary dependent on cyclical hormone variation during different phases of estrus cycle. The presented sensors are based on using ionophores, β-cyclodextrin (β-CD), hydroxypropyl β-cyclodextrin (HP-β-CD) and sulfocalix-4-arene (SCAL) for sensors 1, 2 and 3 for Fe2+, Mn2+ and Zn2+, respectively. Dioctyl phthalate (DOP) was used as the plasticizer in a polymeric matrix of polyvinylchloride (PVC). For increasing the selectivity and sensitivity of the sensors, each sensor was enriched with a suitable complexing agent, which enhanced the sensor’s response. For sensor 1, β-CD was mixed with bathophenanthroline; for sensor 2, porphyrin was incorporated with HP-β-CD; while for sensor 3, oxine was the used complexing agent with SCAL. Linear responses of 10-7-10-2 M with cationic slopes of 53.46, 45.01 and 50.96 over pH range 4-8 were obtained using coated graphite sensors for ferrous, manganese and zinc ionic solutions, respectively. The three sensors were validated, according to the IUPAC guidelines. The obtained results by the presented potentiometric procedures were statistically analyzed and compared with those obtained by atomic absorption spectrophotometric method (AAS). No significant differences for either accuracy or precision were observed between the two techniques. Successful application for the determination of the three studied cations in CM, for the purpose to determine the proper time for artificial insemination (AI) was achieved. The results were compared with those obtained upon analyzing the samples by AAS. Proper detection of estrus and correct time of AI was necessary to maximize the production of buffaloes. In this experiment, 30 multi-parous buffalo-cows were in second to third lactation and weighting 415-530 kg, and were synchronized with OVSynch protocol. Samples were taken in three times around ovulation, on day 8 of OVSynch protocol, on day 9 (20 h before AI) and on day 10 (1 h before AI). Beside analysis of trace elements (Fe2+, Mn2+ and Zn2+) in CM using the three sensors, the samples were analyzed for the three cations and also Cu2+ by AAS in the CM samples and blood samples. The results obtained were correlated with hormonal analysis of serum samples and ultrasonography for the purpose of determining of the optimum time of AI. The results showed significant differences and powerful correlation with Zn2+ composition of CM during heat phase and the ovulation time, indicating that the parameter could be used as a tool to decide optimal time of AI in buffalo-cows.

Keywords: PVC Sensors, buffalo-cows, cyclodextrins, atomic absorption spectrophotometry, artificial insemination, OVSynch protocol

Procedia PDF Downloads 219
27758 Stability of the Wellhead in the Seabed in One of the Marine Reservoirs of Iran

Authors: Mahdi Aghaei, Saeid Jamshidi, Mastaneh Hajipour

Abstract:

The factors affecting mechanical wellbore stability are divided into two categories: (1) controllable factors and (2) uncontrollable factors. The purpose of the geo-mechanical modeling of wells is to determine the limits within which the controllable parameters may change, based on the stress regime at each point, by solving the governing equations of the poro-elastic environment around the well. In this research, a mechanical analysis of wellbore stability was carried out for the Soroush oilfield. For this purpose, the geo-mechanical model of the field was built using the available data. This model provides the necessary parameters for obtaining the distribution of stress around the wellbore. Initially, a basic model was designed in Abaqus software to perform the various analyses based on the obtained data. All subsequent sensitivity analyses, such as those on porosity and permeability, were performed on the same basic model. The results of these analyses show that, with the geo-mechanical parameters held constant, the sensitivity analyses on porosity and permeability are ineffective; accordingly, the most important parameters affecting wellbore stability and instability are the geo-mechanical parameters.

Keywords: wellbore stability, movement, stress, instability

Procedia PDF Downloads 203
27757 Optimization Model for Support Decision for Maximizing Production of Mixed Fruit Tree Farms

Authors: Andrés I. Ávila, Patricia Aros, César San Martín, Elizabeth Kehr, Yovana Leal

Abstract:

We consider a linear programming model to help farmers decide whether it is convenient to choose among three kinds of export fruits for their future investment. We consider area, investment, water, minimal productivity unit, and harvest restrictions, and a monthly based model to compute the average income over five years. Conditions on the field, such as area, water availability and initial investment, are also required. Using Chilean costs and the dollar-peso exchange rate, we can simulate several scenarios to understand the possible risks associated with this market.
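
A minimal sketch of the kind of linear program described above: choose how many hectares of three export fruit crops to plant, maximising projected income subject to land, water and investment limits. All crop names and coefficients are invented placeholders, not the Chilean cost data used in the model.

```python
import numpy as np
from scipy.optimize import linprog

# decision variables: hectares of [cherries, blueberries, hazelnuts] (assumed crops)
income = np.array([12000.0, 15000.0, 9000.0])      # expected 5-year income per ha
c = -income                                         # linprog minimises, so negate

A_ub = [[1.0, 1.0, 1.0],          # total land (ha)
        [6000, 8000, 4000],       # water use per ha (m^3)
        [25000, 40000, 15000]]    # initial investment per ha (USD)
b_ub = [20.0, 120000.0, 500000.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
print("hectares per crop:", np.round(res.x, 2))
print("maximum projected income:", -res.fun)
```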

Keywords: mixed integer problem, fruit production, support decision model, fruit tree farms

Procedia PDF Downloads 457
27756 On Pooling Different Levels of Data in Estimating Parameters of Continuous Meta-Analysis

Authors: N. R. N. Idris, S. Baharom

Abstract:

A meta-analysis may be performed using aggregate data (AD) or individual patient data (IPD). In practice, studies may be available at both the IPD and AD levels. In this situation, both the IPD and AD should be utilised in order to maximize the available information. The statistical advantages of combining studies from different levels have not been fully explored. This study aims to quantify the statistical benefits of including available IPD when conducting a conventional summary-level meta-analysis. Simulated meta-analyses were used to assess the influence of the levels of data on the overall meta-analysis estimates based on IPD-only, AD-only and the combination of IPD and AD (mixed data, MD), under different study scenarios. The percentage relative bias (PRB), root mean-square error (RMSE) and coverage probability were used to assess the efficiency of the overall estimates. The results demonstrate that available IPD should always be included in a conventional meta-analysis using summary-level data, as it would significantly increase the accuracy of the estimates. On the other hand, if more than 80% of the available data are at the IPD level, including the AD does not provide significant differences in terms of the accuracy of the estimates. Additionally, combining the IPD and AD has a moderating effect on the bias of the estimates of the treatment effects, as the IPD tends to overestimate the treatment effects, while the AD has the tendency to produce underestimated effect estimates. These results may provide some guidance in deciding whether a significant benefit is gained by pooling the two levels of data when conducting a meta-analysis.
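
A hedged sketch of how the accuracy measures named above (percentage relative bias, RMSE and coverage probability) can be computed for a simulated meta-analysis estimator; the data-generating values are illustrative and the IPD/AD pooling models themselves are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)
true_effect = 0.5
n_simulations, n_studies = 2000, 10

estimates, ses = [], []
for _ in range(n_simulations):
    study_se = rng.uniform(0.1, 0.3, n_studies)
    study_effects = rng.normal(true_effect, study_se)
    w = 1.0 / study_se**2                       # fixed-effect inverse-variance weights
    pooled = np.sum(w * study_effects) / np.sum(w)
    estimates.append(pooled)
    ses.append(np.sqrt(1.0 / np.sum(w)))

estimates, ses = np.array(estimates), np.array(ses)
prb = 100 * (estimates.mean() - true_effect) / true_effect
rmse = np.sqrt(np.mean((estimates - true_effect) ** 2))
covered = np.abs(estimates - true_effect) <= 1.96 * ses    # 95% CI contains the truth
print(f"PRB = {prb:.2f}%  RMSE = {rmse:.4f}  coverage = {covered.mean():.3f}")
```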

Keywords: aggregate data, combined-level data, individual patient data, meta-analysis

Procedia PDF Downloads 375
27755 Fatigue Tests of New Assembly Bolt Connections for Perspective Temporary Steel Railway Bridges

Authors: Marcela Karmazínová, Michal Štrba, Miln Pilgr

Abstract:

The paper deals with the actual behavior, failure mechanism and load-carrying capacity of a special bolt connection developed and intended for the assembly connections of the truss main girders of perspective temporary steel railway bridges. Within the framework of this problem, several types of structural details of assembly joints have been considered in the conceptual structural design. Based on a preliminary evaluation of their advantages and disadvantages, two basic structural configurations, the so-called “tooth” and “splice-plate” connections, have been selected for subsequent detailed investigation. This investigation is mainly based on experimental verification of the actual behavior, strain and failure mechanism and the corresponding strength of the connection, and on its numerical modeling using FEM. This paper focuses only on the results of the cyclic loading (fatigue) tests of the “splice-plate” connections and their evaluation, which have already been finished. Simultaneously with the fatigue tests, static loading tests have been carried out as well, but these, together with the FEM numerical modeling, are not the subject of this paper.

Keywords: bolt assembly connection, cyclic loading, failure mechanisms, fatigue strength, steel structure, structural detail category, temporary railway bridge

Procedia PDF Downloads 444
27754 Use of the SWEAT Analysis Approach to Determine the Effectiveness of a School's Implementation of Its Curriculum

Authors: Prakash Singh

Abstract:

The focus of this study is on the use of the SWEAT analysis approach to determine how effectively a school, as an organization, has implemented its curriculum. To gauge the feelings of the teaching staff, unstructured interviews were employed in this study, asking the participants for their ideas and opinions on each of the three identified aspects of the school: instructional materials, media and technology; teachers’ professional competencies; and the curriculum. This investigation was based on the five key components of the SWEAT model: strengths, weaknesses, expectations, abilities, and tensions. The findings of this exploratory study evoke the significance of the SWEAT achievement model as a tool for strategic analysis to be undertaken in any organization. The findings further affirm the usefulness of this analytical tool for human resource development. Employees have expectations, but competency gaps in their professional abilities may hinder them from fulfilling their tasks in terms of their job description. Also, tensions in the working environment can contribute to their experiences of tobephobia (fear of failure). The SWEAT analysis approach detects such shortcomings in any organization and can therefore culminate in the development of programmes to address such concerns. The strategic SWEAT analysis process can provide a clear distinction between success and failure, and between mediocrity and excellence in organizations. However, more research needs to be done on the effectiveness of the SWEAT analysis approach as a strategic analytical tool.

Keywords: SWEAT analysis, strategic analysis, tobephobia, competency gaps

Procedia PDF Downloads 507
27753 Effect of Infills in Influencing the Dynamic Responses of Multistoried Structures

Authors: Rahmathulla Noufal E.

Abstract:

Investigating the dynamic responses of high-rise structures under the effect of seismic ground motion is extremely important for the proper analysis and design of multistoried structures. Since the presence of infilled walls strongly influences the behaviour of frame systems in multistoried buildings, there is an increased need for developing guidelines for the analysis and design of infilled frames under dynamic loads for the safe and proper design of buildings. In this manuscript, we evaluate the natural frequencies and natural periods of single-bay, single-storey frames considering the effect of infill walls by using eigenvalue analysis and validating the results with SAP 2000 (free vibration analysis). Various parameters obtained from the diagonal strut model used for the free vibration analysis are then compared with the finite element model, where the infill is modeled with four-noded shell elements. We also evaluate the effect of various parameters on the natural periods of vibration obtained by free vibration analysis in SAP 2000, comparing them with those obtained from the empirical expressions presented in IS 1893 (Part I)-2002.
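
A hedged sketch of the eigenvalue analysis described above, for a two-storey shear-frame idealisation: adding the stiffness contributed by an equivalent diagonal strut (the infill) raises the lateral stiffness and shortens the natural periods. The mass and stiffness values are assumed round numbers, not those of the studied frames.

```python
import numpy as np

m = 20000.0                      # storey mass [kg]
k_bare = 8.0e6                   # bare-frame storey stiffness [N/m]
k_strut = 6.0e6                  # extra stiffness from the equivalent diagonal strut [N/m]

def natural_periods(k):
    M = np.diag([m, m])
    K = np.array([[2 * k, -k],   # classic 2-storey shear-building stiffness matrix
                  [-k,     k]])
    omega2 = np.linalg.eigvals(np.linalg.solve(M, K))   # K phi = omega^2 M phi
    omega = np.sqrt(np.sort(omega2.real))
    return 2 * np.pi / omega                            # fundamental period listed first

T_bare = natural_periods(k_bare)
T_infilled = natural_periods(k_bare + k_strut)
print("bare frame periods [s]:    ", np.round(T_bare, 3))
print("infilled frame periods [s]:", np.round(T_infilled, 3))
```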

Keywords: infilled frame, eigen value analysis, free vibration analysis, diagonal strut model, finite element model, SAP 2000, natural period

Procedia PDF Downloads 330
27752 Bioeconomic Modeling for the Sustainable Exploitation of Three Key Marine Species in Morocco

Authors: I. Ait El Harch, K. Outaaoui, Y. El Foutayeni

Abstract:

This study aims to deepen the understanding of fishing activity in Morocco and to optimize it by holistically integrating biological and economic aspects. We develop a biological equilibrium model in which the three competing species exhibit natural logistic growth, taking into account density and the competition between them. The integration of human intervention adds a realistic dimension to our model: a company specifically targets the three species, thus influencing the population dynamics through its fishing activities. The aim of this work is to determine the fishing effort that maximizes the company’s profit, taking into account the constraints associated with conserving the equilibrium of the ecosystem.
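
Below is a hedged sketch of the biological core of such a model: three species with logistic growth, pairwise competition and a harvesting term proportional to fishing effort. The growth rates, competition coefficients, catchabilities and prices are illustrative, not the calibrated values for the Moroccan fishery.

```python
import numpy as np

r = np.array([0.8, 0.6, 0.5])          # intrinsic growth rates
K = np.array([1.0, 1.0, 1.0])          # carrying capacities (normalised biomass)
alpha = np.array([[0.0, 0.2, 0.1],     # effect of species j on species i
                  [0.3, 0.0, 0.2],
                  [0.1, 0.2, 0.0]])
q = np.array([0.5, 0.4, 0.3])          # catchability coefficients
E = 0.4                                 # fishing effort applied by the company

def dxdt(x):
    competition = alpha @ x
    return r * x * (1.0 - (x + competition) / K) - q * E * x

# simple forward-Euler integration towards the exploited equilibrium
x = np.array([0.5, 0.5, 0.5])
dt = 0.01
for _ in range(20000):
    x = np.maximum(x + dt * dxdt(x), 0.0)

price, cost = np.array([3.0, 2.5, 2.0]), 1.0
profit = np.sum(price * q * E * x) - cost * E
print("equilibrium biomasses:", np.round(x, 3), " profit:", round(profit, 3))
```

Sweeping E over a grid (subject to the biomasses staying positive) then gives the effort that maximizes profit under the conservation constraint.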

Keywords: bioeconomical modeling, optimization techniques, linear complementarity problem LCP, biological equilibrium, maximizing profits

Procedia PDF Downloads 27
27751 The Artificial Intelligence Technologies Used in PhotoMath Application

Authors: Tala Toonsi, Marah Alagha, Lina Alnowaiser, Hala Rajab

Abstract:

This report is about the Photomath app, an AI application that uses image recognition technology, specifically optical character recognition (OCR) algorithms. The OCR algorithm translates the image into a mathematical equation, and the app automatically provides a step-by-step solution. The application supports decimals, basic arithmetic, fractions, linear equations, and multiple functions such as logarithms. Testing was conducted to examine the usage of this app, and results were collected by surveying ten participants and then analyzed. This paper seeks to answer the question: to what level are the artificial intelligence features accurate, and how fast is the processing in this app? It is hoped this study will inform users about the efficiency of the AI in Photomath.

Keywords: photomath, image recognition, app, OCR, artificial intelligence, mathematical equations.

Procedia PDF Downloads 171
27750 An Explanatory Study Approach Using Artificial Intelligence to Forecast Solar Energy Outcome

Authors: Agada N. Ihuoma, Nagata Yasunori

Abstract:

Artificial intelligence (AI) techniques play a crucial role in predicting the expected energy outcome and in the performance analysis, modeling, and control of renewable energy. Renewable energy is becoming more popular for economic and environmental reasons. In the face of global energy consumption and the increased depletion of most fossil fuels, the world is faced with the challenge of meeting ever-increasing energy demands. Therefore, incorporating artificial intelligence to predict solar radiation outcomes from intermittent sunlight is crucial to enable a balance between the supply and demand of energy on loads, predict the performance and outcome of solar energy, enhance production planning and energy management, and ensure the proper sizing of parameters when generating clean energy. However, one of the major problems of forecasting lies in the algorithms used to control, model, and predict the performance of energy systems, which are complicated and involve large computing power, differential equations, and time series. Also, having unreliable (poor quality) data for solar radiation over a geographical location, as well as insufficiently long series, can be a bottleneck to actualization. To overcome these problems, this study employs the Anaconda Navigator (Jupyter Notebook) for machine learning, which can combine large amounts of data with fast, iterative processing and intelligent algorithms, allowing the software to learn automatically from patterns or features to predict the performance and outcome of solar energy, which in turn enables the balance of supply and demand on loads as well as enhanced production planning and energy management.
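
A hedged sketch of the pipeline suggested by the keywords (linear regression with backward elimination) on simulated irradiance data; the actual notebook, features and data set used in the study are not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 500
temperature = rng.normal(28, 4, n)
humidity = rng.uniform(30, 90, n)
cloud_cover = rng.uniform(0, 1, n)
wind_speed = rng.normal(3, 1, n)                      # assumed irrelevant predictor
irradiance = (900 - 600 * cloud_cover + 4 * temperature - 1.5 * humidity
              + rng.normal(0, 40, n))

X = np.column_stack([temperature, humidity, cloud_cover, wind_speed])
names = ["temperature", "humidity", "cloud_cover", "wind_speed"]

# backward elimination: repeatedly drop the least significant predictor (p > 0.05)
while True:
    model = sm.OLS(irradiance, sm.add_constant(X)).fit()
    pvalues = model.pvalues[1:]                       # skip the intercept
    worst = int(np.argmax(pvalues))
    if pvalues[worst] <= 0.05 or X.shape[1] == 1:
        break
    print("dropping", names.pop(worst))
    X = np.delete(X, worst, axis=1)

print("retained predictors:", names)
print("R^2 of the final model:", round(model.rsquared, 3))
```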

Keywords: artificial Intelligence, backward elimination, linear regression, solar energy

Procedia PDF Downloads 157
27749 Transition from Linear to Circular Business Models with Service Design Methodology

Authors: Minna-Maari Harmaala, Hanna Harilainen

Abstract:

Estimates of the economic value of transitioning to circular economy models vary, but the transition has been estimated to represent $1 trillion worth of new business for the global economy. In Europe alone, estimates claim that adopting circular-economy principles could not only have environmental and social benefits but also generate a net economic benefit of €1.8 trillion by 2030. Proponents of a circular economy argue that it offers a major opportunity to increase resource productivity, decrease resource dependence and waste, and increase employment and growth. A circular system could improve competitiveness and unleash innovation. Yet most companies are not capturing these opportunities, and thus even abundant circular opportunities remain uncaptured although they would seem inherently profitable. Service design, in broad terms, relates to developing an existing or a new service or service concept with emphasis and focus on the customer experience from the onset of the development process. Service design may even mean starting from scratch and co-creating the service concept entirely with the help of customer involvement. Service design methodologies provide a structured way of incorporating customer understanding and involvement in the process of designing better services with better resonance to customer needs. A business model is a depiction of how the company creates, delivers, and captures value, i.e. how it organizes its business. The process of business model development and adjustment or modification is also called business model innovation. Innovating business models has become a part of business strategy. Our hypothesis is that, in addition to linear models still being easier to adopt and often having lower threshold costs, companies lack an understanding of how circular models can be adopted into their business and of how willing and ready customers are to adopt the new circular business models. In our research, we use a robust service design methodology to develop circular economy solutions with two case study companies. The aim of the process is not only to develop the service concepts and portfolio but also to demonstrate that the willingness to adopt circular solutions exists in the customer base. In addition to service design, we employ business model innovation methods to develop, test, and validate the new circular business models further. The results clearly indicate that among the customer groups there are specific customer personas that are willing to adopt circular solutions and, in fact, expect the companies to take a leading role in the transition towards a circular economy. At the same time, there is a group of indifferents, to whom the idea of circularity provides no added value. In addition, the case studies clearly show what changes the adoption of circular economy principles brings to the existing business model and how they can be integrated.

Keywords: business model innovation, circular economy, circular economy business models, service design

Procedia PDF Downloads 135
27748 Environmental Resilience in Sustainability Outcomes of Spatial-Economic Model Structure on the Topology of Construction Ecology

Authors: Moustafa Osman Mohammed

Abstract:

The resilient and sustainable of construction ecology is essential to world’s socio-economic development. Environmental resilience is crucial in relating construction ecology to topology of spatial-economic model. Sustainability of spatial-economic model gives attention to green business to comply with Earth’s System for naturally exchange patterns of ecosystems. The systems ecology has consistent and periodic cycles to preserve energy and materials flow in Earth’s System. When model structure is influencing communication of internal and external features in system networks, it postulated the valence of the first-level spatial outcomes (i.e., project compatibility success). These instrumentalities are dependent on second-level outcomes (i.e., participant security satisfaction). These outcomes of model are based on measuring database efficiency, from 2015 to 2025. The model topology has state-of-the-art in value-orientation impact and correspond complexity of sustainability issues (e.g., build a consistent database necessary to approach spatial structure; construct the spatial-economic model; develop a set of sustainability indicators associated with model; allow quantification of social, economic and environmental impact; use the value-orientation as a set of important sustainability policy measures), and demonstrate environmental resilience. The model is managing and developing schemes from perspective of multiple sources pollutants through the input–output criteria. These criteria are evaluated the external insertions effects to conduct Monte Carlo simulations and analysis for using matrices in a unique spatial structure. The balance “equilibrium patterns” such as collective biosphere features, has a composite index of the distributed feedback flows. These feedback flows have a dynamic structure with physical and chemical properties for gradual prolong of incremental patterns. While these structures argue from system ecology, static loads are not decisive from an artistic/architectural perspective. The popularity of system resilience, in the systems structure related to ecology has not been achieved without the generation of confusion and vagueness. However, this topic is relevant to forecast future scenarios where industrial regions will need to keep on dealing with the impact of relative environmental deviations. The model attempts to unify analytic and analogical structure of urban environments using database software to integrate sustainability outcomes where the process based on systems topology of construction ecology.

Keywords: system ecology, construction ecology, industrial ecology, spatial-economic model, systems topology

Procedia PDF Downloads 20
27747 Major Sucking Pests of Rose and Their Seasonal Abundance in Bangladesh

Authors: Md Ruhul Amin

Abstract:

This study was conducted in the experimental field of the Department of Entomology, Bangabandhu Sheikh Mujibur Rahman Agricultural University, Gazipur, Bangladesh from November 2017 to May 2018 with a view to understanding the seasonal abundance of the major sucking pests, namely thrips, aphid and red spider mite, on rose. The findings showed that the thrips started to build up their population from the middle of January with an abundance of 1.0 leaf⁻¹, increased continuously, reached the peak level (2.6 leaf⁻¹) in the middle of February and then declined. Aphids started to build up their population from the second week of November with an abundance of 6.0 leaf⁻¹, increased continuously, reached the peak level (8.4 leaf⁻¹) in the last week of December and then declined. Mites started to build up their population from the first week of December with an abundance of 0.8 leaf⁻¹, increased continuously, reached the peak level (8.2 leaf⁻¹) in the second week of March and then declined. Thrips and mites prevailed until the last week of April, and aphids showed their abundance until the last week of May. The daily mean temperature, relative humidity, and rainfall had an insignificant negative correlation with thrips abundance and a significant negative correlation with aphid abundance. The daily mean temperature had a significant positive, relative humidity an insignificant positive, and rainfall an insignificant negative correlation with mite abundance. The multiple linear regression analysis showed that the weather parameters together accounted for 38.1, 41.0 and 8.9% of the abundance of thrips, aphid and mite on rose, respectively, and the equations were insignificant.

Keywords: aphid, mite, thrips, weather factors

Procedia PDF Downloads 162
27746 Analysis of Expression Data Using Unsupervised Techniques

Authors: M. A. I Perera, C. R. Wijesinghe, A. R. Weerasinghe

Abstract:

This study was conducted to review and identify the unsupervised techniques that can be employed to analyze gene expression data in order to identify better subtypes of tumors. Identifying subtypes of cancer helps in improving the efficacy and reducing the toxicity of treatments by providing clues for finding targeted therapeutics. The process of gene expression data analysis is described in three steps: preprocessing, clustering, and cluster validation. Feature selection is important since genomic data are high dimensional, with a large number of features compared to samples. Hierarchical clustering and K-means are often used in the analysis of gene expression data. There are several cluster validation techniques used to validate the clusters. Heatmaps are an effective external validation method that allows comparison of the identified classes with clinical variables and visual analysis of the classes.
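
A hedged sketch of the unsupervised pipeline outlined above: select variable genes, cluster the samples with K-means, and check the clusters with silhouette scores. The expression matrix is simulated; real data would come from a normalised expression study.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(6)
n_samples, n_genes = 60, 2000
expr = rng.normal(0, 1, size=(n_samples, n_genes))
expr[:30, :50] += 2.5              # plant two artificial tumour subtypes
expr[30:, 50:100] += 2.5

# feature selection: keep the most variable genes (high dimensionality vs. few samples)
top = np.argsort(expr.var(axis=0))[::-1][:200]
X = expr[:, top]

for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k = {k}: silhouette = {silhouette_score(X, labels):.3f}")
# A heatmap of X with samples ordered by the chosen labels (e.g. seaborn.clustermap)
# then serves as the visual/external validation step mentioned in the abstract.
```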

Keywords: cancer subtypes, gene expression data analysis, clustering, cluster validation

Procedia PDF Downloads 149