Search results for: real time kernel preemption
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20468

14798 Exchange Rate Forecasting by Econometric Models

Authors: Zahid Ahmad, Nosheen Imran, Nauman Ali, Farah Amir

Abstract:

The objective of the study is to forecast the US Dollar and Pak Rupee exchange rate by using time series models. For this purpose, daily exchange rates for the period January 01, 2007 - June 2, 2017, are employed. The data set is divided into in-sample and out-of-sample portions, where the in-sample data are used to estimate and forecast the models, whereas the out-of-sample data set is used to forecast the exchange rate. The ADF test and PP test are used to make the time series stationary. To forecast the exchange rate, ARIMA and GARCH models are applied. Among the different Autoregressive Integrated Moving Average (ARIMA) models, the best model is selected on the basis of selection criteria. Due to the volatility clustering and ARCH effect, the GARCH (1, 1) model is also applied. Results of the analysis showed that ARIMA (0, 1, 1) and GARCH (1, 1) are the most suitable models to forecast the future exchange rate. Further, the GARCH (1, 1) model captured the volatility, with non-constant conditional variance in the exchange rate, and showed good forecasting performance. This study is very useful for researchers, policymakers, and businesses for making decisions through accurate and timely forecasting of the exchange rate and helps them in devising their policies.
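The GARCH (1, 1) recursion used in such forecasting studies is compact enough to sketch directly. Below is a minimal, illustrative pure-Python version of the conditional-variance update; the parameter values are hypothetical and are not the estimates fitted in this study.

```python
def garch11_variance(returns, omega, alpha, beta, var0):
    """Conditional variance series: sigma2_t = omega + alpha*eps2_{t-1} + beta*sigma2_{t-1}."""
    variances = [var0]
    for r in returns[:-1]:
        # today's variance depends on yesterday's squared shock and variance
        variances.append(omega + alpha * r * r + beta * variances[-1])
    return variances

# A shock at t=1 raises the conditional variance at t=2 (volatility clustering).
vs = garch11_variance([0.0, 0.05, 0.0], omega=1e-6, alpha=0.10, beta=0.85, var0=1e-4)
```

In a real application, omega, alpha, and beta would be estimated by maximum likelihood from the return series rather than chosen by hand.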

Keywords: exchange rate, ARIMA, GARCH, PAK/USD

Procedia PDF Downloads 544
14797 Hydrodynamics of Dual Hybrid Impeller of Stirred Reactor Using Radiotracer

Authors: Noraishah Othman, Siti K. Kamarudin, Norinsan K. Othman, Mohd S. Takriff, Masli I. Rosli, Engku M. Fahmi, Mior A. Khusaini

Abstract:

The present work describes the mixing hydrodynamics of two dual hybrid impellers, each consisting of a radial and an axial impeller, using a radiotracer technique. In the Type A mixer, a Rushton turbine is mounted above a Pitched Blade Turbine (PBT) on a common shaft; in the Type B mixer, the Rushton turbine is mounted below the PBT. The objectives of this paper are to investigate the residence time distribution (RTD) of the two hybrid mixers and to represent the respective mixers by RTD models. Each mixer underwent five radiotracer experiments using Tc-99m as the tracer, with NaI(Tl) scintillation detectors used for tracer detection. The results showed that both the mixers-in-parallel model and the mixers-in-series-with-exchange model can represent the flow in mixer A, whereas only the mixers-in-parallel model represents the Type B mixer better than the other models. In conclusion, the Type A arrangement, with the Rushton impeller above the PBT, reduced the presence of dead zones in the mixer significantly more than Type B.
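As a point of reference for such RTD models, the classic tanks-in-series model is easy to write down. The sketch below (with made-up n and tau, not values fitted from the radiotracer data) checks numerically that its mean residence time equals n * tau.

```python
import math

def rtd_tanks_in_series(t, n, tau):
    """E(t) for n equal ideal mixers in series, each with mean time tau."""
    return t**(n - 1) * math.exp(-t / tau) / (math.factorial(n - 1) * tau**n)

# Numerical check: the mean residence time integral(t * E(t) dt) should equal n * tau.
n, tau, dt = 2, 1.0, 0.001
ts = [i * dt for i in range(1, 30000)]
mean_t = sum(t * rtd_tanks_in_series(t, n, tau) for t in ts) * dt  # ~ 2.0
```

Fitting such a model to measured tracer-response curves is what lets the detector data discriminate between parallel and series-with-exchange flow structures.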

Keywords: hybrid impeller, residence time distribution (RTD), radiotracer experiments, RTD model

Procedia PDF Downloads 337
14796 Blade-Coating Deposition of Semiconducting Polymer Thin Films: Light-To-Heat Converters

Authors: M. Lehtihet, S. Rosado, C. Pradère, J. Leng

Abstract:

Poly(3,4-ethylene dioxythiophene) polystyrene sulfonate (PEDOT:PSS) is a polymer mixture well known for its semiconducting properties; it is widely used in the coating industry for its visible transparency and high electronic conductivity (up to 4600 S/cm), as a transparent non-metallic electrode and in organic light-emitting diodes (OLED). It also possesses strong absorption in the Near Infra-Red (NIR) range (λ between 900 nm and 2.5 µm). In the present work, we take advantage of this absorption to explore its potential use as a transparent light-to-heat converter. PEDOT:PSS aqueous dispersions are deposited onto a glass substrate using a blade-coating technique in order to produce uniform coatings with controlled thicknesses ranging from ≈ 400 nm to 2 µm. The blade-coating technique gives good control of the deposit thickness and uniformity through the tuning of several experimental conditions (blade velocity, evaporation rate, temperature, etc.). This liquid coating technique is a well-known, inexpensive way to realize thin-film coatings on various substrates. For coatings on glass substrates intended for solar insulation applications, the ideal coating would be made of a material able to transmit the whole visible range while reflecting the NIR range perfectly, but the materials that come closest to these properties still have unsatisfactory opacity in the visible (for example, titanium dioxide nanoparticles). NIR-absorbing thin films are a more realistic alternative for such an application. Under solar illumination, PEDOT:PSS thin films heat up due to absorption of NIR light and thus act as planar heaters while maintaining good transparency in the visible range. While they screen some NIR radiation, they also generate heat, which is conducted into the substrate; the substrate then re-emits this energy by thermal emission in every direction.
In order to quantify the heating power of these coatings, a sample (coating on glass) is placed in a black enclosure and illuminated with a solar simulator, a lamp emitting a calibrated radiation very similar to the solar spectrum. The temperature of the rear face of the substrate is measured in real time using thermocouples, and a black-painted Peltier sensor measures the total entering flux (the sum of the transmitted and re-emitted fluxes). The heating power density of the thin films is estimated from a model of the thin film/glass substrate system, and we estimate the Solar Heat Gain Coefficient (SHGC) to quantify the light-to-heat conversion efficiency of such systems. Eventually, the effects of additives such as dimethyl sulfoxide (DMSO) or optical scatterers (particles) on the performance are also studied, as the first can drastically alter the IR absorption properties of PEDOT:PSS and the second can increase the apparent optical path of light within the thin-film material.
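The SHGC estimated here combines direct solar transmission with the fraction of absorbed heat re-emitted inward. A minimal sketch of that textbook definition follows, with invented optical values rather than the measured PEDOT:PSS data.

```python
def shgc(transmittance, absorptance, inward_fraction):
    """SHGC = directly transmitted fraction + inward-flowing share of absorbed flux."""
    return transmittance + inward_fraction * absorptance

# E.g. a film transmitting 60% and absorbing 30% of the solar flux,
# with half of the absorbed heat flowing inward (all values hypothetical):
g = shgc(0.60, 0.30, 0.5)  # 0.75
```

The inward fraction is the quantity the black-enclosure experiment effectively resolves, since the Peltier sensor captures transmitted plus re-emitted flux together.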

Keywords: PEDOT:PSS, blade-coating, heat, thin film, solar spectrum

Procedia PDF Downloads 146
14795 Designing a Method to Control and Determine the Financial Performance of the Real Cost Sub-System in the Information Management System of Construction Projects

Authors: Alireza Ghaffari, Hassan Saghi

Abstract:

Project management is more complex than managing the day-to-day affairs of an organization. When the project dimensions are broad and multiple projects have to be monitored in different locations, integrated management becomes even more complicated. One of the main concerns of project managers is integrated project management, a concern mainly rooted in the lack of accurate and accessible information from the different projects in their various locations. The collection of dispersed information from various parts of the network, its integration, and finally the selective reporting of this information are among the goals of integrated information systems. This can help resolve the main problem, which is bridging the information gap between executives and senior managers in the organization. Therefore, the main objective of this study is to design and implement an important subset of a project management information system in order to successfully control the cost of construction projects, so that its results can be used to design raw software forms and proposed relationships between different project units for the collection of necessary information.

Keywords: financial performance, cost subsystem, PMIS, project management

Procedia PDF Downloads 92
14794 The Mechanisms of Peer-Effects in Education: A Frame-Factor Analysis of Instruction

Authors: Pontus Backstrom

Abstract:

In the educational literature on peer effects, attention has been drawn to the fact that the mechanisms creating peer effects remain to a large extent hidden in obscurity. The hypothesis in this study is that frame factor theory can be used to explain these mechanisms. At the heart of the theory is the concept of the "time needed" for students to learn a certain curricular unit. The relation between class-aggregated time needed and the actual time available steers and constrains the actions possible for the teacher. Further, the theory predicts that the timing and pacing of the teacher's instruction are governed by a "criterion steering group" (CSG), namely the pupils in the 10th-25th percentile of the aptitude distribution in the class. The class composition hereby sets the possibilities and limitations for instruction, creating peer effects on individual outcomes. To test whether the theory can be applied to the issue of peer effects, the study employs multilevel structural equation modelling (M-SEM) on Swedish TIMSS 2015 data (Trends in International Mathematics and Science Study; students N=4090, teachers N=200). Using confirmatory factor analysis (CFA) in the SEM framework in MPLUS, latent variables such as "limitations of instruction" are specified according to the theory from TIMSS survey items. The results indicate a good fit of the measurement model to the data. Research is still in progress, but preliminary results from initial M-SEM models verify a strong relation between the mean level of the CSG and the latent variable of limitations on instruction, a variable which in turn has a great impact on individual students' test results. Further analysis is required, but so far the analysis indicates a confirmation of the predictions derived from frame factor theory and reveals that one of the important mechanisms creating peer effects in student outcomes is the effect the class composition has upon the teacher's instruction in class.
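The criterion steering group is defined purely by rank position, so it can be sketched in a few lines; the class scores below are invented for illustration.

```python
def criterion_steering_group(scores, lo=0.10, hi=0.25):
    """Pupils between the lo-th and hi-th percentile of the class aptitude ranking."""
    ranked = sorted(scores)
    n = len(ranked)
    return ranked[int(lo * n):int(hi * n)]

scores = list(range(1, 21))             # a 20-student class, aptitudes 1..20
csg = criterion_steering_group(scores)  # the pupils ranked 3rd-5th from the bottom
csg_mean = sum(csg) / len(csg)
```

In the theory, it is this group mean (not the class mean) that is hypothesized to govern the pacing of instruction.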

Keywords: compositional effects, frame factor theory, peer effects, structural equation modelling

Procedia PDF Downloads 123
14793 Generalized Up-downlink Transmission using Black-White Hole Entanglement Generated by Two-level System Circuit

Authors: Muhammad Arif Jalil, Xaythavay Luangvilay, Montree Bunruangses, Somchat Sonasang, Preecha Yupapin

Abstract:

Black and white holes form the entangled pair ⟨BH│WH⟩, where a white hole occurs when the particle moves at the same speed as light. The entangled black-white hole pair is at the center, with the radian between the gap. When the particle moves slower than light, the black hole is gravitational (positive gravity), and the white hole is smaller than the black hole. On the downstream side, the black hole outside the gap grows until the white hole disappears, which is the emptiness paradox. On the upstream side, when moving faster than light, white holes form time tunnels, with the black holes becoming smaller; moving faster and further still, the black hole disappears and becomes a wormhole (singularity) that is only a white hole in emptiness. This research studies the use of black and white holes generated by a two-level system circuit as carriers for communication transmission, from which high data transmission ability and capacity can be obtained. The black and white hole pair can be generated by the two-level system circuit when the speed of a particle on the circuit is equal to the speed of light. The black hole forms when the particle speed increases from slower than to equal to the light speed, while the white hole is established when the particle comes back down from faster than light. They are bound by the entangled pair, signal and idler, ⟨Signal│Idler⟩, with the virtual ones for the white hole, which has an angular displacement of half of π radian. A two-level system is made from an electronic circuit to create black and white holes bound by entangled bits that are immune to cloning by thieves. It starts by creating wave-particle behavior: when the speed equals that of light, the black hole is in the middle of the entangled pair, which is the two-bit gate. The required information can be input into the system and wrapped by the black hole carrier.
A time tunnel occurs when the wave-particle speed is faster than light, at which point the entangled pair collapses. The transmitted information is kept safely in the time tunnel. The required time and space can be modulated via the input for the downlink operation. The downlink is established when the particle speed, given in frequency (energy) form, comes down and enters the entangled gap, where this time the white hole is established. The information, with the required destination, is wrapped by the white hole and retrieved by the clients at the destination. The black and white holes then disappear, and the information can be recovered and used.

Keywords: cloning free, time machine, teleportation, two-level system

Procedia PDF Downloads 58
14792 Computer Aided Diagnostic System for Detection and Classification of a Brain Tumor through MRI Using Level Set Based Segmentation Technique and ANN Classifier

Authors: Atanu K Samanta, Asim Ali Khan

Abstract:

Due to the acquisition of huge amounts of brain tumor magnetic resonance images (MRI) in clinics, it is very difficult for radiologists to manually interpret and segment these images within a reasonable span of time. Computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of radiologists and reduce the time required for accurate diagnosis. An intelligent computer-aided technique for automatic detection of a brain tumor through MRI is presented in this paper. The technique uses the following computational methods: the level set method for segmentation of the brain tumor from other brain parts, extraction of features from the segmented tumor portion using the gray-level co-occurrence matrix (GLCM), and an Artificial Neural Network (ANN) to classify brain tumor images according to their respective types. The entire work is carried out on 50 images covering five types of brain tumor. The overall classification accuracy using this method is found to be 98%, which is significantly good.
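The GLCM step can be illustrated on a toy image: count how often gray level a appears immediately to the left of gray level b. This sketch uses a single horizontal offset; real CAD pipelines typically aggregate several offsets and then derive features (contrast, energy, homogeneity) from the matrix.

```python
def glcm(image, levels):
    """Gray-level co-occurrence counts for the (0, 1) horizontal-neighbor offset."""
    m = [[0] * levels for _ in range(levels)]
    for row in image:
        for a, b in zip(row, row[1:]):  # each horizontally adjacent pixel pair
            m[a][b] += 1
    return m

img = [[0, 0, 1],
       [1, 2, 2],
       [0, 1, 2]]
M = glcm(img, levels=3)  # M[a][b] = count of level a followed by level b
```

Each GLCM would then be reduced to scalar texture features that feed the ANN classifier.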

Keywords: brain tumor, computer-aided diagnostic (CAD) system, gray-level co-occurrence matrix (GLCM), tumor segmentation, level set method

Procedia PDF Downloads 493
14791 A Stable Method for Determination of the Number of Independent Components

Authors: Yuyan Yi, Jingyi Zheng, Nedret Billor

Abstract:

Independent component analysis (ICA) is one of the most commonly used blind source separation (BSS) techniques for signal pre-processing, such as noise reduction and feature extraction. The main parameter in the ICA method is the number of independent components (ICs). Although there have been several methods for determining the number of ICs, this important parameter has not been given sufficient attention. In this study, we review the most used methods for determining the number of ICs and discuss their advantages and disadvantages. Further, we propose an improved version of the column-wise ICAbyBlock method for determining the number of ICs. To assess the performance of the proposed method, we compare column-wise ICAbyBlock with several existing methods across different ICA algorithms, using simulated and real signal data. Results show that the proposed column-wise ICAbyBlock is an effective and stable method for determining the optimal number of components in ICA. The method is simple, and its results can be demonstrated intuitively with good visualizations.
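The block-consistency intuition behind ICAbyBlock-style methods is that components recovered from different data blocks should match up (correlate strongly) when the number of ICs is chosen correctly. Below is a toy sketch of the correlation step only, on made-up component vectors; it is not the authors' implementation.

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Two nearly identical components recovered from two different data blocks:
block1 = [0.10, 0.90, 0.20, 0.80, 0.15]
block2 = [0.12, 0.88, 0.21, 0.79, 0.16]
r = pearson(block1, block2)   # close to 1 when the component is stable
```

When the assumed number of ICs is too large, spurious components fail to reproduce across blocks and such correlations drop.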

Keywords: independent component analysis, optimal number, column-wise, correlation coefficient, cross-validation, ICAbyBlock

Procedia PDF Downloads 83
14790 Evaluation of Robust Feature Descriptors for Texture Classification

Authors: Jia-Hong Lee, Mei-Yi Wu, Hsien-Tsung Kuo

Abstract:

Texture is an important characteristic in real and synthetic scenes. Texture analysis plays a critical role in inspecting surfaces and provides important techniques in a variety of applications. Although several descriptors have been presented to extract texture features, the development of object recognition is still a difficult task due to the complex aspects of texture. Recently, many robust and scale-invariant image features such as SIFT, SURF, and ORB have been successfully used in image retrieval and object recognition. In this paper, we compare the performance of these feature descriptors for texture classification using k-means clustering. Different classifiers, including k-NN, Naive Bayes, Back-Propagation Neural Network, Decision Tree, and KStar, were applied to three texture image sets: UIUCTex, KTH-TIPS, and Brodatz. Experimental results reveal SIFT as the holder of the best average accuracy rate on UIUCTex and KTH-TIPS, while SURF has the advantage on the Brodatz texture set. The BP neural network performs best in test set classification among all the classifiers used.
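The descriptor-plus-k-means pipeline reduces each image to a bag-of-visual-words histogram that the classifiers then consume. A minimal sketch with toy 2-D descriptors standing in for 128-D SIFT vectors (the centroids here are given, not learned):

```python
def bovw_histogram(descriptors, centroids):
    """Histogram of nearest-centroid assignments (bag of visual words)."""
    hist = [0] * len(centroids)
    for d in descriptors:
        # squared Euclidean distance to each cluster center
        dists = [sum((a - b) ** 2 for a, b in zip(d, c)) for c in centroids]
        hist[dists.index(min(dists))] += 1
    return hist

centroids = [(0.0, 0.0), (1.0, 1.0)]
descs = [(0.1, 0.0), (0.9, 1.1), (1.0, 0.8), (0.2, -0.1)]
h = bovw_histogram(descs, centroids)   # one image -> one fixed-length vector
```

The resulting histograms are what k-NN, Naive Bayes, and the other classifiers would actually compare.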

Keywords: texture classification, texture descriptor, SIFT, SURF, ORB

Procedia PDF Downloads 350
14789 Modeling and Optimization of Algae Oil Extraction Using Response Surface Methodology

Authors: I. F. Ejim, F. L. Kamen

Abstract:

Aims: In this experiment, algae oil extraction with a combination of n-hexane and ethanol was investigated. The effects of extraction solvent concentration, extraction time, and temperature on the yield and quality of oil were studied using Response Surface Methodology (RSM). Experimental Design: A Box-Behnken design was used to generate 17 experimental runs in a three-factor, three-level design, where oil yield, specific gravity, acid value, and saponification value were evaluated as the responses. Result: A minimum oil yield of 17% and a maximum of 44% were realized. The optimum values for yield, specific gravity, acid value, and saponification value from the overlay plot were 40.79%, 0.8788, 0.5056 mg KOH/g, and 180.78 mg KOH/g, respectively, with a desirability of 0.801. The maximum-point prediction was a yield of 40.79% at a solvent concentration of 66.68 n-hexane, a temperature of 40.0°C, and an extraction time of 4 hrs. Analysis of Variance (ANOVA) results showed that the linear and quadratic coefficients were all significant at p<0.05. The experiment was validated, and the results obtained were consistent with the predicted values. Conclusion: Algae oil extraction was successfully optimized using RSM, and its quality indicated it is suitable for many industrial uses.
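For reference, the Box-Behnken construction for three factors is mechanical: ±1 on each pair of factors with the third held at the center, plus replicated center points. With 5 center points this reproduces the 17 runs mentioned above (a standard construction sketch, not the authors' code).

```python
from itertools import combinations, product

def box_behnken(n_factors, n_center):
    """Coded (-1/0/+1) run matrix for a Box-Behnken design."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            point = [0] * n_factors   # third factor stays at its center level
            point[i], point[j] = a, b
            runs.append(point)
    runs.extend([[0] * n_factors for _ in range(n_center)])
    return runs

design = box_behnken(3, n_center=5)   # 12 edge-midpoint runs + 5 center runs = 17
```

Each coded level would then be mapped to the physical factor ranges (solvent concentration, time, temperature) before running the extractions.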

Keywords: algae oil, response surface methodology, optimization, Box-Behnken, extraction

Procedia PDF Downloads 317
14788 Decision Support Tool for Green Roofs Selection: A Multicriteria Analysis

Authors: I. Teotónio, C.O. Cruz, C.M. Silva, M. Manso

Abstract:

Diverse stakeholders show different concerns when choosing green roof systems, and green roof solutions vary in their cost and performance. Therefore, decision-makers continually face the difficult task of balancing benefits against green roof costs. Decision analysis methods, such as multicriteria analysis, can be used when the decision-making process includes different perspectives, multiple objectives, and uncertainty. The present study adopts a multicriteria decision model to evaluate the installation of green roofs in buildings, determining the solution with the best trade-off between costs and benefits in agreement with the preferences of the users/investors. This methodology was applied to a real decision problem, assessing the preferences between different green roof systems for an existing building in Lisbon. This approach supports the decision-making process on green roofs and enables robust and informed decisions on urban planning while optimizing building retrofitting.
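The simplest multicriteria aggregation consistent with this setup is a weighted sum over normalized criteria. The options, criteria, and weights below are invented for illustration and are not the Lisbon case data.

```python
def weighted_score(values, weights):
    """Weighted-sum score of one option over normalized (0-1) criterion values."""
    assert abs(sum(weights) - 1.0) < 1e-9   # stakeholder weights must sum to 1
    return sum(v * w for v, w in zip(values, weights))

weights = [0.5, 0.3, 0.2]   # e.g. cost, thermal benefit, biodiversity (hypothetical)
extensive = weighted_score([0.9, 0.5, 0.4], weights)   # cheap, modest benefits
intensive = weighted_score([0.3, 0.9, 0.9], weights)   # costly, high benefits
best = "extensive" if extensive > intensive else "intensive"
```

Richer multicriteria methods differ mainly in how they elicit the weights and handle incomparability, but the trade-off logic is the same.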

Keywords: decision making, green roofs, investors preferences, multicriteria analysis, sustainable development

Procedia PDF Downloads 167
14787 A Two Level Load Balancing Approach for Cloud Environment

Authors: Anurag Jain, Rajneesh Kumar

Abstract:

Cloud computing is an outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation, and task migration. Various parameters used to analyze the performance of a load balancing approach are response time, cost, data processing time, and throughput. This paper demonstrates a two-level load balancer that combines the join-idle-queue and join-shortest-queue approaches. The authors used the CloudAnalyst simulator to test the proposed two-level load balancer. The results are analyzed and compared with existing algorithms and, as observed, the proposed work is one step ahead of existing techniques.
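The two-level dispatch rule combining join-idle-queue with join-shortest-queue can be sketched in a few lines. The queue lengths are hypothetical; this shows the routing idea, not the paper's CloudAnalyst setup.

```python
def dispatch(queue_lengths):
    """Index of the server a new task joins: an idle server first, else the shortest queue."""
    for i, q in enumerate(queue_lengths):
        if q == 0:                    # level 1: join-idle-queue
            return i
    return queue_lengths.index(min(queue_lengths))   # level 2: join-shortest-queue

choice_idle = dispatch([3, 0, 2])    # server 1 is idle, so it wins
choice_busy = dispatch([3, 1, 2])    # no idle server: shortest queue (server 1)
```

The appeal of the hybrid is that the level-1 check is cheap and avoids queue-length polling whenever any server is idle.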

Keywords: cloud analyst, cloud computing, join idle queue, join shortest queue, load balancing, task scheduling

Procedia PDF Downloads 414
14786 The Effectiveness of Using Dramatic Conventions as the Teaching Strategy on Self-Efficacy for Children With Autism Spectrum Disorder

Authors: Tso Sheng-Yang, Wang Tien-Ni

Abstract:

Introduction and Purpose: Previous researchers have documented that children with ASD (Autism Spectrum Disorder) tend to escape internal and external private events when they face difficult conditions they cannot control or do not like. In particular, when children with ASD need to learn challenging tasks, such as Chinese language, their inappropriate behaviors appear markedly. Recently, researchers have applied positive behavior support strategies to children with ASD to enhance their self-efficacy and thereby reduce these adverse behaviors. Thus, the purpose of this research was to design a series of lectures based on art therapy and to evaluate their effectiveness on the child's self-efficacy. Method: This research was a single-case design study that recruited a high school boy with ASD. The research can be separated into three conditions. First, in the baseline condition, before each class started and ended, the researcher collected the participant's self-efficacy scores every session. In the intervention condition, the researcher used dramatic conventions to teach the child Chinese language twice a week. When the data were stable across three data points, the study entered the maintenance condition. In the maintenance condition, the researcher only collected the self-efficacy scores, without other interventions, five times a month to assess the maintenance of the effect. The time and frequency of data collection were identical among the three conditions. Concerning art therapy, the common approach (e.g., music, drama, or painting) is to use an art medium as the independent variable. Due to the visual cues of the art medium, children with ASD can more easily achieve joint attention with teachers. In addition, children with ASD have difficulties understanding abstract objectives. Thus, using dramatic conventions is helpful for them to construct the environment and understand the context of Classical Chinese.
Through real enactment, it can help the child understand the context and construct prior knowledge. Result: Based on a 10-point Likert scale, we obtained the following results. (a) In the baseline condition, the average self-efficacy score is 1.12 points, ranging from 1 to 2 points, and the level change is 0 points. (b) In the intervention condition, the average self-efficacy score is 7.66 points, ranging from 7 to 9 points, and the level change is 1 point. (c) In the maintenance condition, the average self-efficacy score is 6.66 points, ranging from 6 to 7 points, and the level change is 1 point. Concerning the immediacy of change between the baseline and intervention conditions, the difference is 5 points. No overlaps were found between these two conditions. Conclusion: According to the results, using dramatic conventions as a teaching strategy for children with ASD is effective. The self-efficacy score rose immediately when the dramatic conventions commenced. Thus, we suggest that teachers can use this approach, adjusted to the student's traits, to teach children with ASD on difficult tasks.
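The "no overlaps" finding reported here corresponds to a percentage of non-overlapping data (PND) of 100%, a common single-case effect metric. The sketch below uses invented scores shaped like the reported ranges, not the actual session data.

```python
def pnd(baseline, intervention):
    """Percent of intervention points exceeding the best baseline point."""
    best_baseline = max(baseline)
    above = sum(1 for s in intervention if s > best_baseline)
    return 100.0 * above / len(intervention)

baseline = [1, 1, 2, 1]          # hypothetical baseline scores (range 1-2)
intervention = [7, 8, 7, 9, 8]   # hypothetical intervention scores (range 7-9)
p = pnd(baseline, intervention)  # 100.0: no overlap between conditions
```

PND above 90% is conventionally read as a highly effective intervention in single-case research.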

Keywords: dramatic conventions, autism spectrum disorder, self-efficacy, teaching strategy

Procedia PDF Downloads 70
14785 The Design of Acoustic Horns for Ultrasonic Aided Tube Double Side Flange Making

Authors: Kuen-Ming Shu, Jyun-Wei Chen

Abstract:

Encapsulated O-rings are specifically designed to address the problem of sealing the most hostile chemicals and extreme-temperature applications. Ultrasonic vibration hot embossing and ultrasonic welding techniques provide a fast and reliable method to fabricate encapsulated O-rings. This paper presents the design and analysis of acoustic horns with double extrusion to process both flanges of a tube simultaneously. It deals with a Finite Element Method (FEM) study of the ultrasonic stepped horns used to process an encapsulated O-ring; the theoretical dimensions of the horns and their natural frequencies and amplitudes are obtained through simulations in COMSOL software. Furthermore, real horns were fabricated, tested, and verified to prove their practical utility.
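Before FEM refinement, horn design usually starts from the half-wavelength rule for a uniform bar, L = c / (2f) with bar wave speed c = sqrt(E / rho). The sketch below uses generic steel-like properties and 20 kHz, which are assumed values, not the paper's design data.

```python
import math

def half_wave_length(youngs_modulus, density, frequency):
    """First-cut resonant length of a uniform ultrasonic horn (half wavelength)."""
    c = math.sqrt(youngs_modulus / density)   # longitudinal bar wave speed, m/s
    return c / (2.0 * frequency)

L = half_wave_length(200e9, 7850.0, 20e3)     # roughly 0.126 m for steel at 20 kHz
```

FEM simulation then corrects this estimate for the stepped profile, which the simple bar formula cannot capture.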

Keywords: encapsulated O-rings, ultrasonic vibration hot embossing, flange making, acoustic horn, finite element analysis

Procedia PDF Downloads 307
14784 Material Properties Evolution Affecting Demisability for Space Debris Mitigation

Authors: Chetan Mahawar, Sarath Chandran, Sridhar Panigrahi, V. P. Shaji

Abstract:

The ever-growing advancement in space exploration has led to an alarming concern about space debris, as it restricts further launch operations and adventurous space missions; hence, numerous studies have come up with technologies for re-entry prediction and material selection processes for mitigating space debris. The selection of material and operating conditions is determined with the objectives of a lightweight structure and the ability to demise faster, subject to spacecraft survivability during its mission. Since the demisability of spacecraft depends on evolving thermal material properties such as emissivity, specific heat capacity, thermal conductivity, and radiation intensity, this paper presents an analysis of the evolving thermal material properties that affect the demisability process, and estimates the demise time using a demisability model that incorporates evolving thermal properties for sensible heating followed by the complete or partial break-up of the spacecraft. The demisability analysis thus identifies the most suitable spacecraft material on the basis of the least estimated demise time, fulfilling the criteria of design-for-survivability as well as design-for-demisability.

Keywords: demisability, emissivity, lightweight, re-entry, survivability

Procedia PDF Downloads 98
14783 Aberrant Acetylation/Methylation of Homeobox (HOX) Family Genes in Cumulus Cells of Infertile Women with Polycystic Ovary Syndrome (PCOS)

Authors: P. Asiabi, M. Shahhoseini, R. Favaedi, F. Hassani, N. Nassiri, B. Movaghar, L. Karimian, P. Eftekhariyazdi

Abstract:

Introduction: Polycystic Ovary Syndrome (PCOS) is a common gynecologic disorder. Many factors, including environment, metabolism, hormones, and genetics, are involved in the etiopathogenesis of PCOS. Among the genes with altered expression in human reproductive system disorders are the HOX family genes, which act as transcription factors in the regulation of cell proliferation, differentiation, adhesion, and migration. Since recent evidence points to epigenetic factors as causative mechanisms of PCOS, evaluating the association between the known epigenetic marks of acetylation/methylation of histone 3 (H3K9ac/me) and the regulatory regions of these genes can provide better insight into PCOS. In the current study, cumulus cells (CCs), which have critical roles during folliculogenesis, oocyte maturation, ovulation, and fertilization, were used to monitor epigenetic alterations of HOX genes. Material and methods: CCs were collected from 20 PCOS patients and 20 fertile women (18-36 years) with male-factor infertility referred to the Royan Institute for ICSI under a GnRH antagonist protocol. Informed consent was obtained from the participants. Thirty-six hours after hCG injection, ovaries were punctured and cumulus-oocyte complexes were dissected. Soluble chromatin was extracted from the CCs, and Chromatin Immunoprecipitation (ChIP) coupled with Real-Time PCR was performed to quantify the epigenetic marks of histone H3K9 acetylation/methylation (H3K9ac/me) on the regulatory regions of 15 members of the HOX A-D subfamilies. Results: The data showed a significant increase of the H3K9ac epigenetic mark on the regulatory regions of HOXA1, HOXB2, HOXC4, HOXD1, HOXD3 and HOXD4 (P < 0.01) and HOXC5 (P < 0.05), and a significant decrease of H3K9ac on the regulatory regions of HOXA2, HOXA4, HOXA5, HOXB1 and HOXB5 (P < 0.01) and HOXB3 (P < 0.05), in PCOS patients vs. the control group.
On the other side, there was a significant decrease in the incorporation of H3K9me on the regulatory regions of HOXA2, HOXA3, HOXA4, HOXA5, HOXB3 and HOXC4 (P ≤ 0.01) and HOXB5 (P < 0.05) in PCOS patients vs. the control group. This epigenetic mark (H3K9me2) showed a significant increase on the regulatory regions of HOXB1, HOXB2, HOXC5, HOXD1, HOXD3 and HOXD4 (P ≤ 0.01) and HOXB4 (P < 0.05) in patients vs. the control group. There were no significant changes in the acetylation/methylation levels of H3K9 on the regulatory regions of the other studied genes. Conclusion: The current study suggests that epigenetic alterations of HOX genes can be correlated with PCOS and, consequently, female infertility. This finding might offer additional definitions of PCOS and eventually provide insight for novel treatments of this disease with epidrugs.

Keywords: epigenetic, HOX genes, PCOS, female infertility

Procedia PDF Downloads 306
14782 Experimental Field for the Study of Soil-Atmosphere Interaction in Soft Soils

Authors: Andres Mejia-Ortiz, Catalina Lozada, German R. Santos, Rafael Angulo-Jaramillo, Bernardo Caicedo

Abstract:

The interaction between atmospheric variables and soil properties is a determining factor when evaluating the flow of water through the soil. This interaction situation directly determines the behavior of the soil and greatly influences the changes that occur in it. The atmospheric variations such as changes in the relative humidity, air temperature, wind velocity and precipitation, are the external variables that reflect a greater incidence in the changes that are generated in the subsoil, as a consequence of the water flow in descending and ascending conditions. These environmental variations have a major importance in the study of the soil because the conditions of humidity and temperature in the soil surface depend on them. In addition, these variations control the thickness of the unsaturated zone and the position of the water table with respect to the surface. However, understanding the relationship between the atmosphere and the soil is a somewhat complex aspect. This is mainly due to the difficulty involved in estimating the changes that occur in the soil from climate changes; since this is a coupled process where act processes of mass transfer and heat. In this research, an experimental field was implemented to study in-situ the interaction between the atmosphere and the soft soils of the city of Bogota, Colombia. The soil under study consists of a 60 cm layer composed of two silts of similar characteristics at the surface and a deep soft clay deposit located under the silky material. It should be noted that the vegetal layer and organic matter were removed to avoid the evapotranspiration phenomenon. Instrumentation was carried on in situ through a field disposal of many measuring devices such as soil moisture sensors, thermocouples, relative humidity sensors, wind velocity sensor, among others; which allow registering the variations of both the atmospheric variables and the properties of the soil. 
With the information collected through field monitoring, water balances were computed using the Hydrus-1D software to determine the flow conditions that developed in the soil during the study. The moisture profile for different periods and time intervals was also determined from the balance supplied by Hydrus-1D and validated against experimental measurements. As a boundary condition, the actual evaporation rate was included using the semi-empirical equations proposed by different authors. During rainy periods, a descending flow governed by the infiltration capacity of the soil was obtained. During dry periods, on the other hand, an increase in the actual evaporation of the soil induced an upward flow of water, increasing suction as the moisture content decreased. Cracks also developed, accelerating the evaporation process. This experimental-field approach to studying soil-atmosphere interaction is a very useful tool, since it allows all the factors and parameters of the soil to be considered in their natural state, together with real values of the different environmental conditions.
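The wet-season/dry-season flow regime described above (infiltration-governed descending flow when rain dominates, evaporation-driven drawdown when it does not) can be sketched as a simple bucket-type daily water balance. The numbers below are purely illustrative, not measured values from the Bogota site, and the single-bucket model is a simplification of what Hydrus-1D actually solves.

```python
def soil_water_balance(storage_mm, capacity_mm, rain_mm, actual_evap_mm):
    """One daily step of a bucket-type soil water balance.
    When rain exceeds evaporation, the net flux infiltrates (descending flow);
    when evaporation exceeds rain, storage is drawn down (ascending flow,
    i.e. increasing suction). Water above field capacity drains downward."""
    storage = storage_mm + rain_mm - actual_evap_mm
    drainage = max(0.0, storage - capacity_mm)
    storage = min(max(storage, 0.0), capacity_mm)
    return storage, drainage

# Illustrative rainy day: storage fills to capacity, excess drains
s_wet, d_wet = soil_water_balance(120.0, 150.0, 40.0, 2.0)
# Illustrative dry day: no rain, evaporation draws storage down
s_dry, d_dry = soil_water_balance(120.0, 150.0, 0.0, 5.0)
```

Running the two steps gives `(150.0, 8.0)` for the rainy day and `(115.0, 0.0)` for the dry day, reproducing the two flow directions the study observed.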

Keywords: field monitoring, soil-atmosphere, soft soils, soil-water balance

Procedia PDF Downloads 124
14781 A Practical Construction Technique to Enhance the Performance of Rock Bolts in Tunnels

Authors: Ojas Chaudhari, Ali Nejad Ghafar, Giedrius Zirgulis, Marjan Mousavi, Tommy Ellison, Sandra Pousette, Patrick Fontana

Abstract:

In Swedish tunnel construction, a critical issue that has been repeatedly acknowledged is corrosion and, consequently, failure of the rock bolts in rock support systems. Defective installation of rock bolts results in the formation of cavities in the cement mortar that is regularly used to fill the area under the dome plates. These voids allow water ingress into the rock bolt assembly, which results in corrosion of rock bolt components and eventually failure. In addition, the current installation technique consists of several manual steps with labor-intensive work that is usually done in uncomfortable and exhausting conditions, e.g., under the roof of the tunnels. Such demanding tasks also lead to considerable waste of materials and execution errors. Moreover, adequate quality control of the execution is hardly possible with the current technique. To overcome these issues, a non-shrinking/expansive cement-based mortar filled into paper packaging has been developed in this study, which fills the area under the dome plates with few or no remaining cavities, ultimately diminishing the potential for corrosion. This article summarizes the development process and the experimental evaluation of this technique for the installation of rock bolts. In the development process, the cementitious mortar was first developed using specific cement and shrinkage-reducing/expansive additives. The mechanical and flow properties of the mortar were then evaluated using compressive strength, density, and slump flow measurements. In addition, isothermal calorimetry and shrinkage/expansion measurements were used to elucidate the hydration and durability attributes of the mortar. After obtaining the desired properties in both fresh and hardened conditions, the developed dry mortar was filled into specific permeable paper packaging and then submerged in a water bath for specific intervals before installation.
The tests were enhanced progressively by optimizing different parameters such as the shape and size of the packaging, the characteristics of the paper used, the immersion time in water, and even some minor characteristics of the mortar. Finally, the developed prototype was tested in a lab-scale rock bolt assembly at various angles to analyze the efficiency of the method in a real-life scenario. The results showed that the new technique improves the performance of the rock bolts by reducing material wastage, improving environmental performance, facilitating and accelerating the labor work, and finally enhancing the durability of the whole system. Accordingly, this approach provides an efficient alternative to the traditional way of tunnel bolt installation, with considerable advantages for the Swedish tunneling industry.

Keywords: corrosion, durability, mortar, rock bolt

Procedia PDF Downloads 96
14780 Implementation of Computer-Based Technologies into Foreign Language Teaching Process

Authors: Golovchun Aleftina, Dabyltayeva Raikhan

Abstract:

Nowadays, in a world of widely developing cross-cultural interactions and rapidly changing demands of the global labor market, foreign language teaching and learning has taken on a special role not only in school education but also in everyday life. The Cognitive Lingua-Cultural Methodology of Foreign Language Teaching, which originated in Kazakhstan, brings a communicative approach to the forefront of foreign language teaching, giving rise to a variety of techniques that turn language learning into real communication. One of these techniques is Computer Assisted Language Learning. In our article, we aim to: demonstrate the learning benefits students are likely to gain when teachers implement computer-based technologies in the foreign language teaching process; show that a technology-based classroom serves as the best tool for interactive and efficient language learning; and give examples of efficient classroom organization with computer-based activities.

Keywords: computer assisted language learning, learning benefits, foreign language teaching process, implementation, communicative approach

Procedia PDF Downloads 454
14779 Evaluating the Total Costs of a Ransomware-Resilient Architecture for Healthcare Systems

Authors: Sreejith Gopinath, Aspen Olmsted

Abstract:

This paper is based on our previous work that proposed a risk-transference-based architecture for healthcare systems to store sensitive data outside the system boundary, rendering the system unattractive to would-be bad actors. This architecture also allows a compromised system to be abandoned and a new system instance spun up in place to ensure business continuity without paying a ransom or engaging with a bad actor. This paper delves into the details of various attacks we simulated against the prototype system. In the paper, we discuss at length the time and computational costs associated with storing and retrieving data in the prototype system, abandoning a compromised system, and setting up a new instance with existing data. Lastly, we simulate some analytical workloads over the data stored in our specialized data storage system and discuss the time and computational costs associated with running analytics over data in a specialized storage system outside the system boundary. In summary, this paper discusses the total costs of data storage, access, and analytics incurred with the proposed architecture.

Keywords: cybersecurity, healthcare, ransomware, resilience, risk transference

Procedia PDF Downloads 119
14778 Re-Evaluation of Field X Located in Northern Lake Albert Basin to Refine the Structural Interpretation

Authors: Calorine Twebaze, Jesca Balinga

Abstract:

Field X is located on the eastern shores of Lake Albert, Uganda, on the rift flank where the gross sedimentary fill is typically less than 2,000 m. The field was discovered in 2006 and encountered about 20.4 m of net pay across three (3) stratigraphic intervals within the discovery well. The field covers an area of 3 km2, with the structural configuration comprising a 3-way dip-closed hanging-wall anticline that seals against the basement to the southeast along the bounding fault. Field X had been mapped on reprocessed 3D seismic data, originally acquired in 2007 and reprocessed in 2013. The seismic data quality is good across the field, and the reprocessing work reduced the uncertainty in the location of the bounding fault and enhanced the lateral continuity of reservoir reflectors. The current study was a re-evaluation of Field X to refine the fault interpretation and understand the structural uncertainties associated with the field. The seismic data and three (3) well datasets were used during the study. The evaluation followed standard workflows using Petrel software and structural attribute analysis. The process spanned seismic-to-well tie, structural interpretation, and structural uncertainty analysis. Analysis of the well ties generated for the 3 wells provided a geophysical interpretation that was consistent with geological picks. The generated time-depth curves showed a general increase in velocity with burial depth; however, the separation in curve trends observed below 1,100 m was mainly attributed to lateral variation in velocity between the wells. In addition to attribute analysis, three velocity modeling approaches were evaluated: the time-depth curve method, the Vo + kZ method, and the average velocity method. The generated models were calibrated at well locations using well tops to obtain the best velocity model for Field X.
The time-depth method resulted in more reliable depth surfaces, with good structural coherence between the TWT and depth maps and minimal error of 2 to 5 m at well locations. Both the NNE-SSW rift border fault and the minor faults in the existing interpretation were re-evaluated. The new interpretation, however, delineated an E-W trending fault in the northern part of the field that had not been interpreted before. The fault was interpreted at all stratigraphic levels; it thus propagates from the basement to the surface and is an active fault today. It was also noted that the field is only lightly faulted overall, with more faults in its deeper part. The major structural uncertainties defined included: 1) the time horizons, owing to reduced data quality, especially in the deeper parts of the structure, for which an error equal to one-third of the reflection time thickness was assumed; 2) check-shot analysis showed varying velocities within the wells and thus varying depth values for each well; and 3) very few average-velocity points, due to the limited number of wells, produced a pessimistic average velocity model.
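The Vo + kZ approach mentioned above assumes an instantaneous velocity that increases linearly with depth, V(z) = V0 + k·z, which integrates to a closed-form time-to-depth relation. The sketch below shows that conversion; the constants are illustrative placeholders, not the calibrated Field X values.

```python
import math

def depth_from_twt(twt_s, v0, k):
    """Depth conversion for a linear velocity function V(z) = v0 + k*z.
    Integrating dz/dt = v0 + k*z over one-way time t = twt/2 gives
    z(t) = (v0 / k) * (exp(k * t) - 1)."""
    t_oneway = twt_s / 2.0
    return (v0 / k) * (math.exp(k * t_oneway) - 1.0)

# Illustrative parameters (hypothetical, not the Field X calibration):
# v0 = 1800 m/s at datum, k = 0.4 1/s, horizon picked at 2.0 s TWT
z = depth_from_twt(2.0, v0=1800.0, k=0.4)  # about 2213 m
```

In practice v0 and k are fitted to check-shot or sonic data at the wells, and the resulting depth surface is tied back to the well tops, as described in the abstract.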

Keywords: 3D seismic data interpretation, structural uncertainties, attribute analysis, velocity modelling approaches

Procedia PDF Downloads 38
14777 Exploratory Tests on Structures Resistance during Forest Fires

Authors: Luis M. Ribeiro, Jorge Raposo, Ricardo Oliveira, David Caballero, Domingos X. Viegas

Abstract:

Under the scope of the European project WUIWATCH, a set of experimental tests on house vulnerability was performed in order to assess the resistance of selected house components during the passage of a forest fire. Among the individual elements most affected by the passage of a wildfire, windows are the ones with the greatest exposure. In this sense, a set of exploratory experimental tests was designed to assess some particular aspects related to the vulnerability of windows and blinds. At the same time, the importance of leaving them closed (as well as the doors inside a house) during a wildfire was explored in order to give some scientific background to guidelines for homeowners. Three sets of tests were performed: 1. Windows and blinds resistance to heat. Three types of protective blinds were tested (aluminium, PVC, and wood) on two types of windows (single and double pane). The objective was to assess the structures' resistance. 2. The influence of air flow on the transport of burning embers inside a house. A room was built to scale and placed inside a wind tunnel, with one window and one door on opposite sides. The objective was to assess the importance of leaving an inside door open on the probability of burning embers entering the room. 3. The influence of the dimension of openings in a window or door on the probability of ignition inside a house. The objective was to assess the influence of different window openings on the amount of burning particles that can enter a house. The main results were: 1. The purely radiative heat source imposes 1.5 kW/m² of heat flux on the structure, while the real fire generates 10 kW/m². When protected by the blind, the single-pane window reaches 30 °C on both sides, and the double-pane window shows a differential of 10 °C between the side facing the heat (30 °C) and the opposite side (40 °C). The unprotected window's temperature increases continuously until the end of the test.
Window blinds reach considerably higher temperatures; PVC loses its consistency above 150 °C and melts. 2. Leaving the inside door closed results in a positive pressure differential of +1 Pa from the outside to the inside, inhibiting the air flow. Opening the door halfway or fully reverses the pressure differential to −6 and −8 times that value, respectively, favouring the air flow from the outside to the inside. The number of particles entering the house follows the same tendency. 3. As the bottom opening in a window increases from 0.5 cm to 4 cm, the number of particles that enter the house per second also increases greatly. From 5 cm up to 80 cm there is no substantial increase in the number of entering particles. This set of exploratory tests proved to be of added value in supporting guidelines for homeowners regarding self-protection in WUI areas.

Keywords: forest fire, wildland urban interface, house vulnerability, house protective elements

Procedia PDF Downloads 272
14776 Association between a Forward Lag of Historical Total Accumulated Gasoline Lead Emissions and Contemporary Autism Prevalence Trends in California, USA

Authors: Mark A. S. Laidlaw, Howard W. Mielke

Abstract:

In California between the late 1920s and 1986, lead concentrations in urban soils and dust climbed rapidly following the deposition of more than 387,000 tonnes of lead emitted from gasoline. Previous research indicates that when children are exposed to lead, around 90% of it is retained in their bones and teeth due to the substitution of lead for calcium. Lead in children's bones has been shown to accumulate over time and is highest in inner-city urban areas, lower in suburban areas, and lowest in rural areas. It is also known that women's bones demineralize during pregnancy due to the foetus's high demand for calcium. Lead accumulates in women's bones during childhood, and the accumulated lead is subsequently released during pregnancy, a lagged response. This causes calcium, along with lead, to enter the bloodstream and cross the placenta, exposing the foetus to lead. In 1970 in the United States, the average age of a first-time mother was about 21; in 2008, the average age was 25.1. In this study, it is demonstrated that in California there is a forward-lagged relationship between the accumulated emissions of lead from vehicle fuel additives and later autism prevalence trends between the 1990s and the current time period. Regression analysis between a 24-year forward lag of accumulated lead emissions and autism prevalence trends in California shows a strong association (R² = 0.95, p = 1.27 × 10⁻⁹). It is hypothesized that autism in genetically susceptible children may stem from vehicle fuel lead emission exposures of their mothers during childhood, and that the release of stored lead during subsequent pregnancy resulted in lead exposure of foetuses during a critical developmental period. It is furthermore hypothesized that the 24-year forward lag occurs because that period approximates the average time for exposed girls to reach childbearing age.
To test the hypothesis that lead in mothers' bones is associated with autism, it is proposed that retrospective case-control studies would show an association between lead in mothers' bones and autism. Furthermore, it is hypothesized that the forward-lagged relationship between accumulated historical vehicle fuel lead emissions (or air lead concentrations) and autism prevalence trends will be similar in cities at the national and international scale. If further epidemiological studies indicate a strong relationship between accumulated vehicle fuel lead emissions (or accumulated air lead concentrations), lead in mothers' bones, and autism rates, then urban areas may require extensive soil intervention to prevent the development of autism in children.
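The forward-lag regression described above amounts to shifting the accumulated-emissions series forward by 24 years and regressing contemporary prevalence against it. The sketch below shows the mechanics on a synthetic series built to contain exactly such a lag; it is not the California data, and the coefficients are arbitrary.

```python
import numpy as np

def lagged_r2(emissions, prevalence, lag_years):
    """R^2 between accumulated emissions lagged forward by `lag_years`
    and contemporary prevalence. Both series are annual and aligned on year 0."""
    x = emissions[:-lag_years]   # emissions accumulated up to year t
    y = prevalence[lag_years:]   # prevalence observed in year t + lag
    r = np.corrcoef(x, y)[0, 1]
    return r ** 2

# Synthetic illustration: prevalence tracks emissions 24 years earlier
years = 80
rng = np.random.default_rng(0)
emissions = np.cumsum(rng.uniform(0.0, 1.0, years))   # accumulated tonnes (toy units)
prevalence = np.zeros(years)
prevalence[24:] = 0.01 * emissions[:-24] + rng.normal(0.0, 0.02, years - 24)

r2 = lagged_r2(emissions, prevalence, 24)             # close to 1 by construction
```

Scanning `lag_years` over a range and picking the lag with maximum R² is the usual way such a 24-year offset would be identified rather than assumed.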

Keywords: autism, bones, lead, gasoline, petrol, prevalence

Procedia PDF Downloads 286
14775 Comparison Methyl Orange and Malachite Green Dyes Removal by GO, rGO, MWCNT, MWCNT-COOH, and MWCNT-SH as Adsorbents

Authors: Omid Moradi, Mostafa Rajabi

Abstract:

Graphene oxide (GO), reduced graphene oxide (rGO), multi-walled carbon nanotubes (MWCNT), carboxyl-functionalized multi-walled carbon nanotubes (MWCNT-COOH), and thiol-functionalized multi-walled carbon nanotubes (MWCNT-SH) were used as efficient adsorbents for the rapid removal of two dyes, methyl orange (MO) and malachite green (MG), from the aqueous phase. The impact of several influential parameters such as initial dye concentration, contact time, temperature, and initial solution pH was studied and optimized. The optimal contact times for the adsorption of methyl orange on the GO, rGO, MWCNT, MWCNT-COOH, and MWCNT-SH surfaces were determined to be 100, 100, 60, 25, and 60 min, respectively, and for malachite green 100, 100, 60, 15, and 60 min, respectively. The maximum removal efficiency for methyl orange on the GO, rGO, MWCNT, MWCNT-COOH, and MWCNT-SH surfaces occurred at optimized pH values of 3, 3, 6, 2, and 6, respectively, and for malachite green at pH 3, 3, 6, 9, and 6, respectively. The effect of temperature showed that the adsorption of both dyes on the GO, rGO, MWCNT, and MWCNT-SH surfaces was endothermic, while the adsorption of both dyes on the MWCNT-COOH surface was exothermic. With increasing initial concentration of methyl orange, the adsorption capacity decreased on the GO surface and increased on the rGO, MWCNT, MWCNT-COOH, and MWCNT-SH surfaces; with increasing initial concentration of malachite green, the adsorption capacity increased on all five surfaces.

Keywords: adsorption, graphene oxide, reduced graphene oxide, multi-walled carbon nanotubes, methyl orange, malachite green, removal

Procedia PDF Downloads 361
14774 A New Method to Estimate the Low Income Proportion: Monte Carlo Simulations

Authors: Encarnación Álvarez, Rosa M. García-Fernández, Juan F. Muñoz

Abstract:

Estimation of a proportion has many applications in economics and social studies. A common application is the estimation of the low income proportion, which gives the proportion of people in a population classified as poor. In this paper, we present this poverty indicator and propose the logistic regression estimator for the problem of estimating the low income proportion. Various sampling designs are presented. Using a real data set obtained from the European Survey on Income and Living Conditions, Monte Carlo simulation studies are carried out to analyze the empirical performance of the logistic regression estimator under the various sampling designs considered in this paper. Results derived from the Monte Carlo simulation studies indicate that the logistic regression estimator can be more accurate than the customary estimator under the various sampling designs considered. The stratified sampling design can also provide more accurate results.
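A minimal version of the comparison described above can be sketched as follows: simulate a finite population with a known poverty indicator, draw a simple random sample, and compare the customary estimator (the plain sample proportion) with a logistic-regression estimator that exploits an auxiliary variable known for every population unit. This is an illustrative reconstruction under invented parameters, not the authors' code or the EU-SILC data; a full Monte Carlo study would repeat the sampling step many times and compare mean squared errors.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_fit(x, y, iters=500, lr=0.5):
    """Fit p(poor | x) = sigmoid(b0 + b1*x) by gradient ascent on the log-likelihood."""
    X = np.column_stack([np.ones_like(x), x])
    w = np.zeros(2)
    for _ in range(iters):
        w += lr * X.T @ (y - sigmoid(X @ w)) / len(y)
    return w

rng = np.random.default_rng(42)
N, n = 10_000, 500
x_pop = rng.normal(0.0, 1.0, N)                                     # auxiliary variable, known for all units
y_pop = rng.binomial(1, sigmoid(-1.0 + 0.8 * x_pop)).astype(float)  # 1 = below the poverty line
true_prop = y_pop.mean()

idx = rng.choice(N, n, replace=False)      # one simple random sample without replacement
w = logistic_fit(x_pop[idx], y_pop[idx])
logistic_est = sigmoid(np.column_stack([np.ones(N), x_pop]) @ w).mean()
customary_est = y_pop[idx].mean()          # the customary estimator: sample proportion
```

The logistic estimator averages predicted probabilities over the whole population, so it borrows strength from the auxiliary variable; stratified designs would replace the `rng.choice` draw with per-stratum samples.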

Keywords: poverty line, risk of poverty, auxiliary variable, ratio method

Procedia PDF Downloads 439
14773 Harnessing Artificial Intelligence for Early Detection and Management of Infectious Disease Outbreaks

Authors: Amarachukwu B. Isiaka, Vivian N. Anakwenze, Chinyere C. Ezemba, Chiamaka R. Ilodinso, Chikodili G. Anaukwu, Chukwuebuka M. Ezeokoli, Ugonna H. Uzoka

Abstract:

Infectious diseases continue to pose significant threats to global public health, necessitating advanced and timely detection methods for effective outbreak management. This study explores the integration of artificial intelligence (AI) in the early detection and management of infectious disease outbreaks. Leveraging vast datasets from diverse sources, including electronic health records, social media, and environmental monitoring, AI-driven algorithms are employed to analyze patterns and anomalies indicative of potential outbreaks. Machine learning models, trained on historical data and continuously updated with real-time information, contribute to the identification of emerging threats. The implementation of AI extends beyond detection, encompassing predictive analytics for disease spread and severity assessment. Furthermore, the paper discusses the role of AI in predictive modeling, enabling public health officials to anticipate the spread of infectious diseases and allocate resources proactively. Machine learning algorithms can analyze historical data, climatic conditions, and human mobility patterns to predict potential hotspots and optimize intervention strategies. The study evaluates the current landscape of AI applications in infectious disease surveillance and proposes a comprehensive framework for their integration into existing public health infrastructures. The implementation of an AI-driven early detection system requires collaboration between public health agencies, healthcare providers, and technology experts. Ethical considerations, privacy protection, and data security are paramount in developing a framework that balances the benefits of AI with the protection of individual rights. The synergistic collaboration between AI technologies and traditional epidemiological methods is emphasized, highlighting the potential to enhance a nation's ability to detect, respond to, and manage infectious disease outbreaks in a proactive and data-driven manner. 
The findings of this research underscore the transformative impact of harnessing AI for early detection and management, offering a promising avenue for strengthening the resilience of public health systems in the face of evolving infectious disease challenges. This paper advocates for the integration of artificial intelligence into the existing public health infrastructure for early detection and management of infectious disease outbreaks. The proposed AI-driven system has the potential to revolutionize the way we approach infectious disease surveillance, providing a more proactive and effective response to safeguard public health.
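One of the simplest concrete instances of the anomaly detection the abstract refers to is a moving-baseline threshold on weekly case counts, flagging weeks that exceed the recent mean by several standard deviations. The sketch below uses synthetic counts and is only a toy stand-in for the machine learning models discussed; window size and threshold are arbitrary choices.

```python
import statistics

def flag_outbreak_weeks(cases, window=8, z_threshold=3.0):
    """Flag week indices whose count exceeds the trailing-window mean
    by more than z_threshold standard deviations (a crude early-warning rule)."""
    flagged = []
    for t in range(window, len(cases)):
        baseline = cases[t - window:t]
        mu = statistics.mean(baseline)
        sd = statistics.pstdev(baseline) or 1.0  # guard against a flat baseline
        if (cases[t] - mu) / sd > z_threshold:
            flagged.append(t)
    return flagged

# Synthetic weekly counts with a spike starting at week 12
weekly = [10, 12, 9, 11, 10, 13, 11, 10, 12, 11, 10, 11, 48, 52, 12]
print(flag_outbreak_weeks(weekly))  # prints [12]
```

Note how week 13 is not flagged even though it is also elevated: the spike at week 12 has already inflated the trailing baseline's variance, which is one reason real systems prefer robust baselines or model-based expectations over a plain z-score.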

Keywords: artificial intelligence, early detection, disease surveillance, infectious diseases, outbreak management

Procedia PDF Downloads 47
14772 Bayesian Network and Feature Selection for Rank Deficient Inverse Problem

Authors: Kyugneun Lee, Ikjin Lee

Abstract:

Parameter estimation via inverse problems often suffers from unfavorable conditions in the real world. Uninformative data and many input parameters make the problem complicated or insoluble. Data refinement and reformulation of the problem can resolve such difficulties. In this research, a method to solve the rank deficient inverse problem is suggested. A multi-physics system with rank deficiency caused by response correlation is treated. Impeding information is removed, and the problem is reformulated into sequential estimations using a Bayesian network (BN) and subset groups. First, subset grouping of the responses is performed; feature selection with singular value decomposition (SVD) is used for the grouping. Next, BN inference is used for sequential conditional estimation according to the group hierarchy. The directed acyclic graph (DAG) structure is organized to maximize the estimation ability. The response-to-noise variance ratio is used to pair the estimable parameters with each response.
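The SVD-based grouping step can be illustrated on a toy sensitivity matrix: correlated responses collapse the effective rank below the number of responses, which is the signal for subset grouping. The matrix below is hypothetical and much smaller than any real multi-physics Jacobian.

```python
import numpy as np

# Rows = responses, columns = input parameters (toy sensitivity/Jacobian matrix).
# Response 2 is an exact copy of response 0, so the system is rank deficient.
J = np.array([
    [1.0, 2.0, 0.0],
    [0.0, 1.0, 3.0],
    [1.0, 2.0, 0.0],
])

s = np.linalg.svd(J, compute_uv=False)                # singular values, descending
tol = s.max() * max(J.shape) * np.finfo(float).eps    # standard numerical-rank tolerance
effective_rank = int((s > tol).sum())                 # 2, not 3: responses are correlated

# Group responses by row similarity (cosine similarity near 1 -> same subset)
norms = np.linalg.norm(J, axis=1)
cos = (J @ J.T) / np.outer(norms, norms)
```

Here `effective_rank == 2` flags the deficiency, and `cos[0, 2] == 1.0` indicates that responses 0 and 2 belong in the same subset, so only one of them carries independent information for the sequential BN estimation.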

Keywords: Bayesian network, feature selection, rank deficiency, statistical inverse analysis

Procedia PDF Downloads 298
14771 The Effect of Connections Form on Seismic Behavior of Portal Frames

Authors: Kiavash Heidarzadeh

Abstract:

The seismic behavior of portal frames depends mainly on the shape of their joints. In these structures, vertical and inclined connections are the two general forms of connection, and the shape of the connections can make a difference in the seismic response of portal frames. Hence, in this paper, as a first step, the non-linear performance of portal frames with vertical and inclined connections has been investigated by monotonic analysis. The effect of section size is also considered in this analysis. For comparison, hysteresis curves have been evaluated for two model frames with different forms of connections. Each model has three different column and beam sizes; other geometrical parameters have been kept constant. In the second step, for every model, an appropriate section size has been selected from the previous step, and the seismic behavior of each model has been analyzed by the time history method under three near-fault earthquake records. The finite element software ABAQUS is used for the simulation and analysis of the samples. The outputs show that connection form can affect the reaction forces of portal frames under earthquake loads. It is also found that the load capacity of frames with vertical connections is greater than that of frames with inclined connections.

Keywords: inclined connections, monotonic, portal frames, seismic behavior, time history, vertical connections

Procedia PDF Downloads 215
14770 Thermal Regions for Unmanned Aircraft Systems Route Planning

Authors: Resul Fikir

Abstract:

Unmanned Aircraft Systems (UAS) have become indispensable parts of modern air power as force multipliers. One of the main advantages of UAS is long endurance. UAS have to carry extra payloads to accomplish different missions, but these payloads decrease the endurance of the aircraft because of increased drag. Research to increase the capability of UAS is ongoing. In nature there are vertical thermal air currents that can provide climb and thereby increase endurance. Birds and gliders use thermals to gain altitude with no effort, and UAS with wide wings can exploit thermals in the same way. Thermal regions, typically 2,000-3,000 meters (about 1 NM) across, exist all around the world; they are a free and clean source of energy. This study analyzes whether thermal regions can be adopted and implemented as an assistant tool for UAS route planning. The first and second parts of the study contain information about thermal regions and current UAS applications in aviation, including climbing performance with a real example. The remaining parts analyze the contribution of thermal regions to UAS endurance. This contribution is important because the declaration of UAS navigation rules is planned for 2015.
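The endurance gain from thermals described above can be estimated with simple glider arithmetic: the altitude gained while circling in a thermal, divided by the aircraft's still-air sink rate, is engine-off flight time bought for free. The numbers below are illustrative assumptions, not measurements from any specific UAS.

```python
def thermal_endurance_gain(updraft_ms, sink_rate_ms, time_in_thermal_s):
    """Extra engine-off flight time bought by one thermal climb.
    Net climb rate = updraft - still-air sink rate; the altitude gained
    is then traded back as glide time at the still-air sink rate."""
    net_climb = updraft_ms - sink_rate_ms
    if net_climb <= 0:
        return 0.0  # the thermal is too weak to out-climb the aircraft's sink
    altitude_gain_m = net_climb * time_in_thermal_s
    return altitude_gain_m / sink_rate_ms

# Illustrative: a 3 m/s thermal, a 1 m/s sink rate, 5 minutes of circling
gain_s = thermal_endurance_gain(3.0, 1.0, 300.0)  # 600 m gained -> 600 s of glide
```

A route planner would weigh such per-thermal gains against the detour cost of reaching each thermal region along the route.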

Keywords: airways, thermals, UAS, UAS roadmap

Procedia PDF Downloads 409
14769 Increasing Added-Value of Salak Fruit by Freezing Frying to Improve the Welfare of Farmers: Case Study of Sleman Regency, Yogyakarta-Indonesia

Authors: Sucihatiningsih Dian Wisika Prajanti, Himawan Arif Susanto

Abstract:

Fruits are perishable products with relatively low prices, especially at harvest time. Generally, farmers sell their products shortly after harvest without any processing. Farmers also act only as price takers, leaving them little power to set prices, and they are sometimes exploited by middlemen, especially during abundant harvests. Therefore, effort is required to process fruits and to innovate so that they last longer and have higher economic value. The purpose of this research is to show how to increase the added value of fruits that have high economic value. The research involved 60 Salak fruit farmers as the sample, and descriptive analysis was used to analyze the data. The results showed that the selling price of Salak fruit is very low. Hence, to increase the added value of the fruit, processing is carried out by freezing frying, which makes the fruit last longer. In addition to increasing added value, the products can be held for further processing without the crops rotting or going unsold.

Keywords: fruits processing, Salak fruit, freezing frying, farmer’s welfare, Sleman, Yogyakarta

Procedia PDF Downloads 334