Search results for: project lead time
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24425

16925 Monitoring Saltwater Corrosion on Steel Samples Using Coda Wave Interferometry in MHz Frequencies

Authors: Maxime Farin, Emmanuel Moulin, Lynda Chehami, Farouk Benmeddour, Pierre Campistron

Abstract:

Assessing corrosion is crucial in the petrochemical and marine industries. Usual ultrasonic methods based on guided waves can inspect large areas for corrosion but lack precision. We propose a complementary and sensitive ultrasonic method (~10 MHz) based on coda wave interferometry to detect and quantify corrosion at the surface of a steel sample. The method relies on a single piezoelectric transducer that excites the sample and measures the scattered coda signals at different instants in time. A laboratory experiment is conducted with a steel sample immersed in salt water for 60 h, with parallel coda and temperature measurements to correct for the coda's dependence on temperature variations. Micrometric changes to the sample surface caused by corrosion are detected in the late coda signals, allowing precise corrosion detection. Moreover, a good correlation is found between a parameter quantifying the temperature-corrected stretching of the coda over time, relative to a corrosion-free reference, and the corroded surface area of the sample recorded with a camera.
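
The stretching parameter used above can be made concrete with a short sketch. The following is a generic, minimal implementation of the coda wave interferometry stretching technique (not code from the study): it grid-searches for the dilation factor that best aligns a stretched reference coda with the current one. The function and variable names and the search range are assumptions.

```python
import numpy as np

def stretching_coefficient(t, reference, current, window,
                           epsilons=np.linspace(-0.01, 0.01, 201)):
    """Return the dilation factor (and its correlation) that best aligns the
    time-stretched reference coda with the current coda inside `window`,
    a boolean mask selecting the late-coda samples."""
    best_eps, best_cc = 0.0, -np.inf
    for eps in epsilons:
        # resample the reference on the dilated time axis t * (1 + eps)
        stretched = np.interp(t, t * (1.0 + eps), reference)
        r, c = stretched[window], current[window]
        cc = np.mean((r - r.mean()) * (c - c.mean())) / (r.std() * c.std())
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc
```

Tracking the best dilation factor over successive acquisitions, after removing the component explained by temperature, gives a corrosion-sensitive parameter of the kind the abstract correlates with the imaged corroded area.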

Keywords: coda wave interferometry, nondestructive evaluation, corrosion, ultrasonics

Procedia PDF Downloads 218
16924 Effect of Smoking on Tear Break-Up Time and Basal Tear Secretion

Authors: Kalsoom Rani

Abstract:

Tobacco contains nicotine, which is addictive, and tobacco smoke is composed of almost 7,000 active chemicals that are very harmful to human health as well as to eye health. People consume tobacco worldwide in the form of smoking, chewing, and sniffing, and inhalation of tobacco smoke and fumes can accelerate and cause many blinding eye diseases. The relationship between smoking and dry eye has not been covered extensively in research; more studies are required to unveil it. This study was conducted to determine the quantity and quality of tears in smokers. 60 subjects aged 15-50 years participated in the study and were divided into two groups on the basis of cigarette consumption per day, with age-matched non-smokers as controls. All participants went through a study-based questionnaire, an eye examination, and diagnostic 'dry eye tests' for evaporative tear evaluation and measurement of basal tear secretion. Inclusion criteria required at least 10 cigarettes per day for a minimum duration of 1 year; passive smokers were excluded from the control group. The study was carried out in the ophthalmology department of Medina Teaching Hospital, Faisalabad, Pakistan, over a duration of 8 months. In the right eye (RE), the mean tear break-up time (TBUT) was 10 sec (SD ±3.74) in the control group, 5 sec (SD ±2.32) in smokers, and 4 sec (SD ±3.77) in heavy smokers; in the left eye (LE), it was 10.35 sec (SD ±3.88) in controls, 5 sec (SD ±2.3) in smokers, and a much reduced 3.85 sec (SD ±2.20) in heavy smokers. Smoking has a very strong association with TBUT, with a significance of p = .00 in both eyes. The mean Schirmer-I value of the subjects was 12.6 mm (SD ±8.37) in the RE and 12.59 mm (SD ±8.96) in the LE. The mean Schirmer-II values in the right and left eyes were 20.23 mm (SD ±8.93) and 20.75 mm (SD ±8.84), respectively, for controls, 9.90 mm (SD ±5.74) and 10.07 mm (SD ±6.98) in smokers, and 7.7 mm (SD ±3.22) and 6.9 mm (SD ±3.50) in heavy smokers; the association with smoking showed p = .001 in the RE and p = .003 in the LE. Smoking has a deteriorating effect on both evaporative tears and aqueous tear secretion, causing the dry eye symptoms of burning, itching, redness, and watering, with epithelial cell damage.

Keywords: tear break-up time, basal tear secretion, smokers, dry eye

Procedia PDF Downloads 117
16923 Modelling Impacts of Global Financial Crises on Stock Volatility of Nigerian Banks

Authors: Maruf Ariyo Raheem, Patrick Oseloka Ezepue

Abstract:

This research aimed to determine the most appropriate heteroscedastic model for predicting the volatility of 10 major Nigerian banks: Access, United Bank for Africa (UBA), Guaranty Trust, Skye, Diamond, Fidelity, Sterling, Union, ETI, and Zenith, using daily closing stock prices of each of the banks from 2004 to 2014. The models employed include ARCH(1), GARCH(1,1), EGARCH(1,1), and TARCH(1,1). The results show that all the banks' returns are highly leptokurtic, significantly skewed, and thus non-normal across the four periods, except for Fidelity bank during the financial crisis; these findings are similar to those of other global markets. There is also strong evidence for the presence of heteroscedasticity, and volatility persistence during the crisis was higher than before the crisis across the 10 banks, with UBA taking the lead at about 11 times higher during the crisis. Findings further revealed that asymmetric GARCH models became dominant, especially during the financial crisis and post-crisis, when the second reforms were introduced into the banking industry by the Central Bank of Nigeria (CBN). Generally, one could say that Nigerian banks' returns are volatility persistent during and after the crisis, and characterised by leverage effects of negative and positive shocks during these periods.
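
As a hedged illustration of how such models are typically fitted in practice (this is not the authors' code), the sketch below uses the Python `arch` package to fit the four specifications named above to one bank's daily log returns and compare them by AIC; the input series name is an assumption.

```python
import numpy as np
import pandas as pd
from arch import arch_model

def compare_volatility_models(prices: pd.Series) -> dict:
    """Fit ARCH/GARCH/EGARCH/TARCH(1,1) to daily log returns (in %)
    and return each model's AIC for comparison."""
    returns = 100 * np.log(prices).diff().dropna()
    specs = {
        "ARCH(1)":     dict(vol="ARCH", p=1),
        "GARCH(1,1)":  dict(vol="GARCH", p=1, q=1),
        "EGARCH(1,1)": dict(vol="EGARCH", p=1, q=1),
        # o=1 adds the asymmetry (leverage) term; power=1.0 gives TARCH
        "TARCH(1,1)":  dict(vol="GARCH", p=1, o=1, q=1, power=1.0),
    }
    return {name: arch_model(returns, mean="Constant", **spec).fit(disp="off").aic
            for name, spec in specs.items()}
```

A significantly negative asymmetry coefficient in the EGARCH/TARCH fits is the usual statistical evidence for the leverage effect reported in the abstract.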

Keywords: global financial crisis, leverage effect, persistence, volatility clustering

Procedia PDF Downloads 511
16922 Structural Performance of a Bridge Pier on Dubious Deep Foundation

Authors: Víctor Cecilio, Roberto Gómez, J. Alberto Escobar, Héctor Guerrero

Abstract:

The study of the structural behavior of a support/pier of an elevated viaduct in Mexico City is presented. Detection of foundation piles with uncertain integrity prompted the review of possible situations that could jeopardize the structural safety of the pier. The objective of this paper is to evaluate the structural condition of the support, taking into account the type of anomaly reported and the depth at which it is located, the position of the pile with uncertain integrity in the foundation system, the stratigraphy of the surrounding soil, and the geometry and structural characteristics of the pier. To carry out the above, dynamic analyses, both modal spectral and step-by-step, were performed with elastic and inelastic material models. Results were evaluated in accordance with the standards used for the design of the original structural project and with the Construction Regulations for Mexico's Federal District (RCDF-2017). Comments on the response of the analyzed models are issued, and conclusions are presented from a structural point of view.

Keywords: dynamic analysis, inelastic models, dubious foundation, bridge pier

Procedia PDF Downloads 123
16921 Static Priority Approach to Under-Frequency Based Load Shedding Scheme in Islanded Industrial Networks: Using the Case Study of Fatima Fertilizer Company Ltd - FFL

Authors: S. H. Kazmi, T. Ahmed, K. Javed, A. Ghani

Abstract:

In this paper, a static scheme of under-frequency-based load shedding is considered for chemical and petrochemical industries with islanded distribution networks that rely heavily on the primary commodity, to ensure minimum production loss, plant downtime, or critical equipment shutdown. A simple methodology is proposed for in-house implementation of this scheme using under-frequency relays, and a step-by-step guide is provided, including the techniques to calculate maximum percentage overloads, frequency decay rates, time-based frequency response, and frequency-based time response of the system. A case study of the FFL electrical system is utilized, presenting the actual system parameters and the employed load shedding settings following the same series of steps. These settings are then verified for the worst overload condition (loss of a generation source in this case), and the comprehensive system response is investigated.
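
To make the overload and frequency-decay calculations concrete, here is a minimal, hedged sketch (not from the paper) of the classic swing-equation estimate of the initial frequency decay after loss of a generation source; the nominal frequency and the aggregate inertia constant are assumed values.

```python
# Aggregate swing equation: df/dt = -dP * f_n / (2 * H), where dP is the
# per-unit overload and H the system inertia constant (s). Assumed values:
F_NOM = 50.0      # Hz, nominal frequency
H_SYS = 4.0       # s, aggregate inertia constant on system MVA base

def overload_fraction(load_mw: float, gen_mw: float) -> float:
    """Per-unit overload after losing a generation source."""
    return (load_mw - gen_mw) / gen_mw

def initial_decay_rate(dp_pu: float) -> float:
    """Initial rate of change of frequency in Hz/s (negative = falling)."""
    return -dp_pu * F_NOM / (2.0 * H_SYS)

# Example: 120 MW of load on 100 MW of remaining generation -> 20% overload
dp = overload_fraction(120.0, 100.0)
print(f"df/dt = {initial_decay_rate(dp):.2f} Hz/s")   # -1.25 Hz/s
```

The decay rate, together with the relay pickup frequencies, is what fixes the time available for each shedding stage in a static scheme of this kind.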

Keywords: islanding, under-frequency load shedding, frequency rate of change, static UFLS

Procedia PDF Downloads 474
16920 Improving Students’ Writing Skill by Using the Brainstorming Technique

Authors: M. Z. Abdul Rofiq Badril Rizal

Abstract:

This research aims to determine the improvement in students' English writing skill from using the brainstorming technique. The technique helps students who have difficulty generating ideas, leads them to arrange their ideas well, and keeps them focused on the topic developed in the writing. The research method used is classroom action research. The data sources of the research are an English teacher, who acts as an observer, and the students of class X.MIA5, consisting of 35 students. The test results and observations are collected as the data in this research. Based on the research result in cycle one, the percentage of students who reached the minimum accomplishment criteria (MAC) was 76.31%. This meant the cycle had to be continued to cycle two, because the aim of the research had not been accomplished: not all of the students' scores had reached the MAC. After continuing the research to cycle two and improving on the weaknesses, the process of teaching and learning ran better. On the test conducted at the end of the learning process in cycle two, all of the students reached the minimum score of 76 or above, per the minimum accomplishment criteria. This means the research was successful, and the percentage of students who reached the minimum accomplishment criteria was 100%. Therefore, the writer concludes that the brainstorming technique is able to improve the students' English writing skill at the tenth grade of SMAN 2 Jember.

Keywords: brainstorming technique, improving, writing skill, knowledge and innovation engineering

Procedia PDF Downloads 355
16919 The Determinants of Country Corruption: Unobserved Heterogeneity and Individual Choice - An Empirical Application with Finite Mixture Models

Authors: Alessandra Marcelletti, Giovanni Trovato

Abstract:

Corruption in public offices is found to be a reflection of country-specific features; however, the exact magnitude and statistical significance of the effects of its determinants have not yet been identified. This paper proposes an estimation method to measure the impact of country fundamentals on corruption, showing that covariates can affect the extent of corruption differently across countries. Thus, we exploit a model able to take into account the different factors affecting the incentive to ask or be asked for a bribe, coherently with the use of the Corruption Perception Index. We assume that the discordant results achieved in the literature may be explained by omitted hidden factors affecting the agents' decision process. Moreover, assuming a homogeneous covariate effect may lead to unreliable conclusions, since the country-specific environment is not accounted for. We apply a finite mixture model with concomitant variables to 129 countries from 1995 to 2006, accounting for the impact of initial conditions in the socio-economic structure on corruption patterns. Our findings confirm the hypothesis that the decision process of accepting or asking for a bribe varies with specific country fundamentals.

Keywords: corruption, finite mixture models, concomitant variables, countries classification

Procedia PDF Downloads 252
16918 Exchange Rate Forecasting by Econometric Models

Authors: Zahid Ahmad, Nosheen Imran, Nauman Ali, Farah Amir

Abstract:

The objective of the study is to forecast the US Dollar/Pak Rupee exchange rate using time series models. For this purpose, daily US Dollar/Pak Rupee exchange rates for the period January 1, 2007 - June 2, 2017 are employed. The data are divided into in-sample and out-of-sample sets, where the in-sample data are used to estimate the models and the out-of-sample set is used to evaluate the exchange rate forecasts. The ADF test and PP test are used to check whether the time series is stationary. To forecast the exchange rate, ARIMA and GARCH models are applied. Among the different Autoregressive Integrated Moving Average (ARIMA) models, the best model is selected on the basis of selection criteria. Due to volatility clustering and the ARCH effect, a GARCH(1,1) model is also applied. The results of the analysis showed that ARIMA(0,1,1) and GARCH(1,1) are the most suitable models for forecasting the future exchange rate. Further, the GARCH(1,1) model captured the non-constant conditional variance in the exchange rate with good forecasting performance. This study is very useful for researchers, policymakers, and businesses for making decisions through accurate and timely forecasting of the exchange rate, and it helps them in devising their policies.
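
A minimal, hedged sketch of this workflow in Python (not the authors' code) using `statsmodels`: an ADF test decides the differencing order, the ARIMA(0,1,1) specification from the abstract is fitted, and an out-of-sample forecast is produced. The series name, the 5% threshold, and the horizon are assumptions.

```python
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.arima.model import ARIMA

def forecast_exchange_rate(rate: pd.Series, horizon: int = 30) -> pd.Series:
    """Fit ARIMA(0,d,1) to a daily exchange rate series and forecast ahead."""
    p_value = adfuller(rate.dropna())[1]     # ADF test on the level series
    d = 1 if p_value > 0.05 else 0           # difference once if non-stationary
    model = ARIMA(rate, order=(0, d, 1)).fit()
    return model.forecast(steps=horizon)
```

In practice the in-sample fit would be scored against the held-out data (e.g., by RMSE) before the model is used for real forecasting, as the abstract describes.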

Keywords: exchange rate, ARIMA, GARCH, PAK/USD

Procedia PDF Downloads 546
16917 Hydrodynamics of Dual Hybrid Impeller of Stirred Reactor Using Radiotracer

Authors: Noraishah Othman, Siti K. Kamarudin, Norinsan K. Othman, Mohd S. Takriff, Masli I. Rosli, Engku M. Fahmi, Mior A. Khusaini

Abstract:

The present work describes the mixing hydrodynamics of two dual hybrid impellers, each consisting of a radial and an axial impeller, using a radiotracer technique. In the Type A mixer, a Rushton turbine is mounted above a pitched blade turbine (PBT) on a common shaft; in the Type B mixer, the Rushton turbine is mounted below the PBT. The objectives of this paper are to investigate the residence time distribution (RTD) of the two hybrid mixers and to represent each mixer by an RTD model. Five radiotracer experiments were carried out for each type of mixer, using Tc-99m as the tracer and NaI(Tl) scintillation detectors for tracer detection. The results showed that both the mixers-in-parallel model and the mixers-in-series-with-exchange model can represent the flow in mixer A, whereas only the mixers-in-parallel model represents the Type B mixer better than the other models. In conclusion, the Type A configuration, with the Rushton impeller above the PBT, reduced the presence of dead zones in the mixer significantly more than Type B.
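
As a hedged illustration of the mixers-in-parallel RTD model mentioned above (a generic formulation, not the authors' code): the exit-age distribution of two perfectly mixed zones in parallel can be fitted directly to the detector data. Variable names and initial guesses are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def parallel_mixers_rtd(t, f, tau1, tau2):
    """RTD of two perfectly mixed tanks in parallel: a fraction f of the flow
    passes through a zone with mean residence time tau1, the rest through a
    zone with mean residence time tau2."""
    return f / tau1 * np.exp(-t / tau1) + (1 - f) / tau2 * np.exp(-t / tau2)

# t_data, e_data: measured exit-age distribution E(t) from the detectors
# (f, tau1, tau2), _ = curve_fit(parallel_mixers_rtd, t_data, e_data,
#                                p0=[0.5, 10.0, 60.0])
```

A small fitted fraction routed through a zone with a very long residence time is the model's signature of a dead zone, which is how the two impeller configurations can be compared quantitatively.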

Keywords: hybrid impeller, residence time distribution (RTD), radiotracer experiments, RTD model

Procedia PDF Downloads 342
16916 Design and Implementation of Automated Car Anti-Collision System Device Using Distance Sensor

Authors: Mehrab Masayeed Habib, Tasneem Sanjana, Ahmed Amin Rumel

Abstract:

Automated car anti-collision is a trending automotive safety technology. The aim of this paper is to describe the design of a car anti-collision system device that reduces the severity of accidents. The purpose of the device is to prevent collisions between cars and objects and thereby reduce accidental human deaths. The project provides an overview of a secure and smooth car journey as well as greater certainty for human life. The system is controlled by a PIC microcontroller. A Sharp distance sensor is used to detect any object within the danger range. A crystal oscillator produces the oscillation and generates the clock pulse for the microcontroller. An LCD gives information about the safe distance, and a buzzer is used as an alarm. An actuator serves as the automatic brake; inside the actuator, there is a motor driver that runs the actuator. 'mikroC PRO for PIC' was used for coding, and 'Proteus Design Suite version 8' software was used for simulation.

Keywords: Sharp distance sensor, microcontroller, mikroC PRO for PIC, Proteus, actuator, automobile anti-collision system

Procedia PDF Downloads 462
16915 The Mechanisms of Peer-Effects in Education: A Frame-Factor Analysis of Instruction

Authors: Pontus Backstrom

Abstract:

In the educational literature on peer effects, attention has been drawn to the fact that the mechanisms creating peer effects remain to a large extent hidden in obscurity. The hypothesis in this study is that the frame factor theory can be used to explain these mechanisms. At the heart of the theory is the concept of "time needed" for students to learn a certain curricular unit. The relation between class-aggregated time needed and the actual time available steers and constrains the actions possible for the teacher. Further, the theory predicts that the timing and pacing of the teacher's instruction are governed by a "criterion steering group" (CSG), namely the pupils in the 10th-25th percentile of the aptitude distribution in class. The class composition thereby sets the possibilities and limitations for instruction, creating peer effects on individual outcomes. To test whether the theory can be applied to the issue of peer effects, the study employs multilevel structural equation modelling (M-SEM) on Swedish TIMSS 2015 data (Trends in International Mathematics and Science Study; students N=4090, teachers N=200). Using confirmatory factor analysis (CFA) in the SEM framework in MPLUS, latent variables such as "limitations of instruction" are specified according to the theory from TIMSS survey items. The results indicate a good fit of the measurement model to the data. Research is still in progress, but preliminary results from initial M-SEM models verify a strong relation between the mean level of the CSG and the latent variable of limitations on instruction, a variable which in turn has a great impact on individual students' test results. Further analysis is required, but so far the analysis confirms the predictions derived from the frame factor theory and reveals that one of the important mechanisms creating peer effects in student outcomes is the effect the class composition has upon the teacher's instruction in class.
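
The CSG construct itself is easy to make concrete. Below is a small, hedged sketch (not from the study) that computes each class's criterion steering group mean, i.e., the mean aptitude of pupils between the 10th and 25th percentiles of their class; the column names are assumptions.

```python
import pandas as pd

# df: one row per student, with columns "class_id" and "aptitude" (hypothetical)
def csg_mean_per_class(df: pd.DataFrame) -> pd.Series:
    """Mean aptitude of the criterion steering group (10th-25th percentile)
    of each class, per the frame factor theory."""
    def csg_mean(group: pd.DataFrame) -> float:
        lo, hi = group["aptitude"].quantile([0.10, 0.25])
        in_csg = group["aptitude"].between(lo, hi)
        return group.loc[in_csg, "aptitude"].mean()
    return df.groupby("class_id").apply(csg_mean)
```

A class-level variable of this kind is what enters the M-SEM as the predictor of the "limitations of instruction" latent variable described in the abstract.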

Keywords: compositional effects, frame factor theory, peer effects, structural equation modelling

Procedia PDF Downloads 124
16914 Generalized Up-downlink Transmission using Black-White Hole Entanglement Generated by Two-level System Circuit

Authors: Muhammad Arif Jalil, Xaythavay Luangvilay, Montree Bunruangses, Somchat Sonasang, Preecha Yupapin

Abstract:

Black and white holes form the entangled pair ⟨BH│WH⟩, where a white hole occurs when the particle moves at the same speed as light. The entangled black-white hole pair is at the center, with the radian between the gap. When the particle moves slower than light, the black hole is gravitational (positive gravity), and the white hole is smaller than the black hole. On the downstream side, the black hole outside the gap grows until the white hole disappears, which is the emptiness paradox. On the upstream side, when moving faster than light, the white hole forms a time tunnel and the black hole becomes smaller; as the motion becomes faster still, the black hole disappears and becomes a wormhole (singularity) that is only a white hole in emptiness. This research studies the use of black and white holes generated by a two-level system circuit as communication transmission carriers, from which high data transmission capability and capacity can be obtained. The black-white hole pair can be generated by the two-level system circuit when the speed of a particle in the circuit equals the speed of light. The black hole forms when the particle speed increases from slower than to equal to the light speed, while the white hole is established when the particle comes back down from faster than light. They are bound as the entangled pair of signal and idler, ⟨Signal│Idler⟩, with the virtual one for the white hole having an angular displacement of half of π radian. A two-level system is made from an electronic circuit to create black and white holes bound by entangled bits that are immune to cloning by thieves. The process starts by creating wave-particle behavior when the particle speed equals that of light; the black hole is in the middle of the entangled pair, which forms a two-bit gate. The required information can be input into the system and wrapped by the black hole carrier. A time tunnel occurs when the wave-particle speed is faster than light, at which point the entangled pair collapses and the transmitted information is safely inside the time tunnel. The required time and space can be modulated via the input for the downlink operation. The downlink is established when the particle speed is brought down in frequency (energy) and enters the entangled gap, where this time the white hole is established. The information, with the required destination, is wrapped by the white hole and retrieved by the clients at the destination. The black and white holes then disappear, and the information can be recovered and used.

Keywords: cloning free, time machine, teleportation, two-level system

Procedia PDF Downloads 61
16913 Computer Aided Diagnostic System for Detection and Classification of a Brain Tumor through MRI Using Level Set Based Segmentation Technique and ANN Classifier

Authors: Atanu K Samanta, Asim Ali Khan

Abstract:

Due to the acquisition of huge amounts of brain tumor magnetic resonance images (MRI) in clinics, it is very difficult for radiologists to manually interpret and segment these images within a reasonable span of time. Computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of radiologists and reduce the time required for accurate diagnosis. An intelligent computer-aided technique for automatic detection of a brain tumor through MRI is presented in this paper. The technique uses the following computational methods: the level set method for segmentation of the brain tumor from other brain parts, extraction of features from the segmented tumor portion using the gray-level co-occurrence matrix (GLCM), and an artificial neural network (ANN) to classify brain tumor images according to their respective types. The entire work is carried out on 50 images covering five types of brain tumors. The overall classification accuracy using this method is found to be 98%, which is significantly good.
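
A hedged sketch of the GLCM-plus-ANN stage (illustrative only, not the authors' pipeline), using `scikit-image` for the texture features and a small `scikit-learn` MLP as the classifier; the particular feature set and network size are assumptions.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPClassifier

def glcm_features(tumor_region: np.ndarray) -> np.ndarray:
    """Texture features from a segmented 8-bit grayscale tumor region:
    co-occurrence statistics at distance 1 over four angles."""
    glcm = graycomatrix(tumor_region, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# X = np.array([glcm_features(roi) for roi in segmented_tumors]); y = labels
# clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)
```

With only 50 images, cross-validation rather than a single train/test split would normally be used to support an accuracy figure like the 98% reported.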

Keywords: brain tumor, computer-aided diagnostic (CAD) system, gray-level co-occurrence matrix (GLCM), tumor segmentation, level set method

Procedia PDF Downloads 495
16912 Hidden Stones When Implementing Artificial Intelligence Solutions in the Engineering, Procurement, and Construction Industry

Authors: Rimma Dzhusupova, Jan Bosch, Helena Holmström Olsson

Abstract:

Artificial Intelligence (AI) in the Engineering, Procurement, and Construction (EPC) industry does not yet have a proven track record in large-scale projects. Since AI solutions for industrial applications became available only recently, deployment experience and lessons learned are still to be built up. Nevertheless, AI has become an attractive technology for organizations looking to automate repetitive tasks to reduce manual work, and the current AI market has started offering various solutions and services. The contribution of this research is that we explore in detail the challenges and obstacles faced in developing and deploying AI in large-scale projects in the EPC industry, based on real-life use cases performed in an EPC company. The identified challenges are not linked to a specific technology or a company's know-how and, therefore, are universal. The findings in this paper aim to provide feedback to academia to reduce the gap between research and practical experience, and they help reveal the hidden stones encountered when implementing AI solutions in the industry.

Keywords: artificial intelligence, machine learning, deep learning, innovation, engineering, procurement and construction industry, AI in the EPC industry

Procedia PDF Downloads 106
16911 Modeling and Optimization of Algae Oil Extraction Using Response Surface Methodology

Authors: I. F. Ejim, F. L. Kamen

Abstract:

Aims: In this experiment, algae oil extraction with a combination of n-hexane and ethanol was investigated. The effects of extraction solvent concentration, extraction time, and temperature on the yield and quality of the oil were studied using Response Surface Methodology (RSM). Experimental Design: A Box-Behnken design was used to generate 17 experimental runs in a three-factor, three-level design, where oil yield, specific gravity, acid value, and saponification value were evaluated as the responses. Results: A minimum oil yield of 17% and a maximum of 44% were realized. The optimum values for yield, specific gravity, acid value, and saponification value from the overlay plot were 40.79%, 0.8788, 0.5056 mg KOH/g, and 180.78 mg KOH/g, respectively, with a desirability of 0.801. The maximum point prediction was a yield of 40.79% at a solvent concentration of 66.68% n-hexane, a temperature of 40.0°C, and an extraction time of 4 hrs. Analysis of Variance (ANOVA) results showed that the linear and quadratic coefficients were all significant at p<0.05. The experiment was validated, and the results obtained agreed with the predicted values. Conclusion: Algae oil extraction was successfully optimized using RSM, and its quality indicates it is suitable for many industrial uses.
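
For illustration, fitting the second-order response surface behind such an analysis is straightforward with `statsmodels`; this hedged sketch (not the authors' code) assumes a data frame of the 17 Box-Behnken runs with hypothetical column names.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_response_surface(df: pd.DataFrame):
    """Second-order (linear + quadratic + interaction) response surface for
    oil yield. `df` holds the 17 Box-Behnken runs with assumed columns:
    conc (% n-hexane), time (h), temp (degC), yield_pct (response)."""
    formula = ("yield_pct ~ conc + time + temp"
               " + I(conc**2) + I(time**2) + I(temp**2)"
               " + conc:time + conc:temp + time:temp")
    model = smf.ols(formula, data=df).fit()
    return model   # model.summary() reports the term p-values cited above
```

The fitted quadratic surface is then what gets searched (or overlaid with the other responses) to find the optimum such as the 66.68% n-hexane, 40.0°C, 4 h point reported.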

Keywords: algae oil, response surface methodology, optimization, Box-Behnken, extraction

Procedia PDF Downloads 322
16910 A Two-Level Load Balancing Approach for Cloud Environment

Authors: Anurag Jain, Rajneesh Kumar

Abstract:

Cloud computing is an outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation, and task migration. Various parameters used to analyze the performance of a load balancing approach are response time, cost, data processing time, and throughput. This paper demonstrates a two-level load balancing approach that combines the join-idle-queue and join-shortest-queue approaches. The authors used the CloudAnalyst simulator to test the proposed two-level load balancer. The results are analyzed and compared with existing algorithms and, as observed, the proposed work is one step ahead of existing techniques.
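
A minimal, hedged sketch of the combined policy (an illustration of the idea, not the paper's implementation): the first level dispatches to a known idle server (join-idle-queue); if none exists, it falls back to join-shortest-queue over a small random sample. Class and parameter names are assumptions.

```python
import random

class TwoLevelBalancer:
    """Level 1: join-idle-queue -- dispatch to an idle server if one is known.
    Level 2: fall back to join-shortest-queue among a random sample."""

    def __init__(self, n_servers: int, sample_size: int = 2):
        self.queues = [0] * n_servers          # outstanding tasks per server
        self.idle = set(range(n_servers))      # idle-queue of server ids
        self.sample_size = sample_size

    def dispatch(self) -> int:
        if self.idle:                          # JIQ: an idle server is known
            server = self.idle.pop()
        else:                                  # JSQ over a random sample
            candidates = random.sample(range(len(self.queues)), self.sample_size)
            server = min(candidates, key=lambda s: self.queues[s])
        self.queues[server] += 1
        return server

    def complete(self, server: int) -> None:
        self.queues[server] -= 1
        if self.queues[server] == 0:
            self.idle.add(server)
```

The appeal of the combination is that the idle-queue path is O(1) and avoids queue polling entirely, while the sampled JSQ fallback keeps queues balanced when no server is idle.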

Keywords: cloud analyst, cloud computing, join idle queue, join shortest queue, load balancing, task scheduling

Procedia PDF Downloads 418
16909 Using Jumping Particle Swarm Optimization for Optimal Operation of Pump in Water Distribution Networks

Authors: R. Rajabpour, N. Talebbeydokhti, M. H. Ahmadi

Abstract:

Carefully scheduling the operation of pumps can result in significant energy savings. Schedules can be defined either implicitly, in terms of other elements of the network such as tank levels, or explicitly, by specifying the times during which each pump is on/off. In this study, two new explicit representations based on time-controlled triggers were analyzed, where the maximum number of pump switches is established beforehand and the schedule may contain fewer switches than the maximum. The optimal operation of the pumping stations was determined using a Jumping Particle Swarm Optimization (JPSO) algorithm to achieve the minimum energy cost. The model integrates the JPSO optimizer with the EPANET hydraulic network solver. The optimal pump operation schedule of the VanZyl water distribution system was determined using the proposed model and compared with those from genetic and ant colony algorithms. The results indicate that the proposed model utilizing the JPSO algorithm outperformed the others and is a versatile management model for the operation of real-world water distribution systems.
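
For a flavor of the approach, here is a deliberately simplified, hedged sketch of a jumping PSO over a binary hourly on/off pump schedule. It is not the authors' model: the tariff, pump power, switch limit, and jump probabilities are invented, and the hydraulic feasibility check through EPANET is omitted entirely.

```python
import random

HOURS = 24
TARIFF = [0.05] * 7 + [0.12] * 12 + [0.05] * 5   # assumed $/kWh by hour
PUMP_KW, MAX_SWITCHES = 75.0, 4                  # assumed power and switch cap

def cost(schedule):
    """Energy cost plus a penalty for exceeding the allowed pump switches."""
    energy = sum(PUMP_KW * on * TARIFF[h] for h, on in enumerate(schedule))
    switches = sum(schedule[h] != schedule[h - 1] for h in range(1, HOURS))
    return energy + 1e3 * max(0, switches - MAX_SWITCHES)

def jpso(n_particles=30, iters=200, c1=0.3, c2=0.3, c3=0.1):
    swarm = [[random.randint(0, 1) for _ in range(HOURS)] for _ in range(n_particles)]
    pbest = [s[:] for s in swarm]
    gbest = min(swarm, key=cost)[:]
    for _ in range(iters):
        for i, x in enumerate(swarm):
            for h in range(HOURS):          # each bit "jumps" toward an attractor
                r = random.random()
                if r < c1:
                    x[h] = pbest[i][h]      # toward the personal best
                elif r < c1 + c2:
                    x[h] = gbest[h]         # toward the global best
                elif r < c1 + c2 + c3:
                    x[h] = 1 - x[h]         # random exploratory jump
            if cost(x) < cost(pbest[i]):
                pbest[i] = x[:]
                if cost(x) < cost(gbest):
                    gbest = x[:]
    return gbest, cost(gbest)
```

In the real model, the cost function would call the hydraulic solver and penalize constraint violations such as tank-level limits, which is where the EPANET integration enters.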

Keywords: JPSO, operation, optimization, water distribution system

Procedia PDF Downloads 231
16908 Material Properties Evolution Affecting Demisability for Space Debris Mitigation

Authors: Chetan Mahawar, Sarath Chandran, Sridhar Panigrahi, V. P. Shaji

Abstract:

The ever-growing advancement in space exploration has led to an alarming concern about space debris, as it restricts further launch operations and ambitious space missions; hence, numerous studies have developed technologies for re-entry prediction and material selection processes for mitigating space debris. The selection of material and operating conditions is determined with the objective of a lightweight structure and the ability to demise faster, subject to spacecraft survivability during its mission. Since the demisability of spacecraft depends on evolving thermal material properties such as emissivity, specific heat capacity, thermal conductivity, and radiation intensity, this paper presents an analysis of the evolving thermal material properties of spacecraft, which affect the demisability process, and estimates the demise time using a demisability model that incorporates evolving thermal properties for sensible heating followed by the complete or partial break-up of the spacecraft. The demisability analysis thus concludes that the best suited spacecraft material is the one with the least estimated demise time that fulfills the criteria of both design-for-survivability and design-for-demisability.
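
To illustrate the sensible-heating part of such a model, here is a hedged, lumped-parameter sketch (illustrative numbers and a linear cp(T), not the paper's model): a fragment under a constant re-entry heat flux is heated, with temperature-dependent specific heat, until its melting point is reached.

```python
def demise_heating_time(mass, area, q_dot, T0, T_melt, cp_func, dt=0.1):
    """Seconds to heat a lumped fragment from T0 to T_melt under heat flux
    q_dot [W/m2]; cp_func(T) captures the evolving specific heat [J/(kg K)].
    Integrates m * cp(T) * dT/dt = q_dot * A explicitly."""
    T, t = T0, 0.0
    while T < T_melt:
        T += q_dot * area / (mass * cp_func(T)) * dt
        t += dt
    return t

# Aluminium-like fragment with assumed values:
cp_al = lambda T: 900.0 + 0.5 * (T - 300.0)    # cp rising with temperature
t_heat = demise_heating_time(mass=2.0, area=0.05, q_dot=5e5,
                             T0=300.0, T_melt=933.0, cp_func=cp_al)
print(f"time to reach melting point: {t_heat:.0f} s")
```

A full demisability model would add the latent heat of fusion, radiative losses scaled by the evolving emissivity, and a time-varying heat flux along the re-entry trajectory; this sketch only shows why the evolving cp(T) matters to the estimated demise time.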

Keywords: demisability, emissivity, lightweight, re-entry, survivability

Procedia PDF Downloads 102
16907 Short Review on Models to Estimate the Risk in the Financial Area

Authors: Tiberiu Socaciu, Tudor Colomeischi, Eugenia Iancu

Abstract:

Business failure affects, in varying proportions, shareholders, managers, lenders (banks), suppliers, customers, the financial community, government, and society as a whole. In an era of telecommunications networks and interdependent markets, the effect of a company's failure is felt almost instantly. Effectively managing risk exposure thus requires sophisticated support systems, backed by analytical tools to measure, monitor, manage, and control the operational risks that may arise. As we know, bankruptcy is a phenomenon that managers want to avoid no matter what stage of life their company is in. In our analysis, given the nature of the economic models reviewed (Altman, Conan-Holder, etc.), estimating the risk of bankruptcy of a company corresponds to some extent with tracing the company's own business cycle. Various models for predicting bankruptcy take into account direct and indirect aspects such as market position, company growth trend, competition structure, customer characteristics and retention, organization and distribution, location, etc. From the perspective of our research, we review the economic models known in theory and practice for estimating the risk of bankruptcy; such models are based on indicators drawn from major accounting firms.
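
As a concrete example of the class of models reviewed, below is a sketch of the classic Altman (1968) Z-score for publicly traded manufacturing firms; this is the textbook formula, not a model taken from this paper.

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_equity, sales, total_assets, total_liabilities):
    """Altman (1968) Z-score for public manufacturing firms.
    Conventional zones: Z > 2.99 safe, Z < 1.81 distress."""
    x1 = working_capital / total_assets     # liquidity
    x2 = retained_earnings / total_assets   # cumulative profitability
    x3 = ebit / total_assets                # operating efficiency
    x4 = market_equity / total_liabilities  # leverage
    x5 = sales / total_assets               # asset turnover
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5
```

Continental models such as Conan-Holder follow the same pattern, a weighted sum of accounting ratios, but with coefficients and thresholds calibrated on different national samples, which is precisely why the paper groups them into Anglo-Saxon, continental, and national families.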

Keywords: Anglo-Saxon models, continental models, national models, statistical models

Procedia PDF Downloads 393
16906 Evaluation of Engineered Cementitious Composites (ECC) with Different Percentages of Fibers

Authors: Bhaumik Merchant, Ajay Gelot

Abstract:

Concrete is strong in compression, but it begins to fail when tensile strain is applied, whereas steel is strong in tension and can sustain deflection up to its elastic limit. This project is based on the behavior of engineered cementitious composites (ECC) when different amounts of polyvinyl alcohol (PVA) fibers are added. In this research, PVA fiber is used in the cementitious matrix at up to 2% to evaluate the optimum fiber content at which the maximum compressive, tensile, and flexural strengths are obtained. PVA is basically an adhesive that is used to formulate glue. Generally, cracks develop due to excessive loading, which leads to successive damage to the structural component. In this research, a plasticizer is used to increase workability. With the optimum amount of PVA fibers, crack widths can be limited to about 60 µm to 100 µm, which can reduce the resources and funds needed for the rehabilitation of structures. Initially, this fiber concrete can cost double compared to conventional concrete, but as it extends the service life of the structure, it ends up less costly than conventional concrete.

Keywords: compressive strength, engineered cementitious composites, flexural strength, polyvinyl alcohol fibers, rehabilitation of structures

Procedia PDF Downloads 276
16905 Evaluating the Total Costs of a Ransomware-Resilient Architecture for Healthcare Systems

Authors: Sreejith Gopinath, Aspen Olmsted

Abstract:

This paper is based on our previous work that proposed a risk-transference-based architecture for healthcare systems to store sensitive data outside the system boundary, rendering the system unattractive to would-be bad actors. This architecture also allows a compromised system to be abandoned and a new system instance spun up in place to ensure business continuity without paying a ransom or engaging with a bad actor. This paper delves into the details of various attacks we simulated against the prototype system. In the paper, we discuss at length the time and computational costs associated with storing and retrieving data in the prototype system, abandoning a compromised system, and setting up a new instance with existing data. Lastly, we simulate some analytical workloads over the data stored in our specialized data storage system and discuss the time and computational costs associated with running analytics over data in a specialized storage system outside the system boundary. In summary, this paper discusses the total costs of data storage, access, and analytics incurred with the proposed architecture.

Keywords: cybersecurity, healthcare, ransomware, resilience, risk transference

Procedia PDF Downloads 126
16904 An Engineer-Oriented Life Cycle Assessment Tool for Building Carbon Footprint: The Building Carbon Footprint Evaluation System in Taiwan

Authors: Hsien-Te Lin

Abstract:

The purpose of this paper is to introduce the BCFES (building carbon footprint evaluation system), which is an LCA (life cycle assessment) tool developed by the Low Carbon Building Alliance (LCBA) in Taiwan. A qualified BCFES for the building industry should fulfill the function of evaluating the carbon footprint throughout all stages in the life cycle of a building project, including the production, transportation and manufacturing of materials, construction, daily energy usage, renovation, and demolition. However, many existing BCFESs are too complicated and not very designer-friendly, creating obstacles to the implementation of carbon reduction policies. One of the greatest obstacles is the misapplication of the carbon footprint inventory standards PAS 2050 or ISO 14067, which are designed for mass-produced goods rather than building projects. When these product-oriented rules are applied to building projects, one must compute a tremendous amount of data for raw materials and the transportation of construction equipment throughout the construction period, based on purchasing lists and construction logs. This verification method is cumbersome by nature and unhelpful to the promotion of low carbon design. With a view to providing an engineer-oriented BCFES with pre-diagnosis functions, a component input/output (I/O) database system and a scenario simulation method for building energy are proposed herein. Most existing BCFESs base their calculations on a product-oriented carbon database for raw materials like cement, steel, glass, and wood. However, data on raw materials are meaningless for the purpose of encouraging carbon reduction design without a feedback mechanism, because an engineering project is not designed in terms of raw materials but rather of building components, such as flooring, walls, roofs, ceilings, roads, or cabinets. The LCBA database has been compiled from existing carbon footprint databases for raw materials and architectural graphic standards. Project designers can now use the LCBA database to conduct low carbon design in a much simpler and more efficient way. Daily energy usage throughout a building's life cycle, including air conditioning, lighting, and electric equipment, is very difficult for the building designer to predict. A good BCFES should provide a simplified and designer-friendly method to overcome this obstacle in predicting energy consumption. In this paper, the author has developed a simplified tool, the dynamic Energy Use Intensity (EUI) method, to accurately predict energy usage with simple multiplications and additions, using EUI data and the designed efficiency levels for the building envelope, AC, lighting, and electrical equipment. Remarkably simple to use, it can help designers pre-diagnose hotspots in a building's carbon footprint and further enhance low carbon designs. The BCFES-LCBA offers the advantages of an engineer-friendly component I/O database, simplified energy prediction methods, pre-diagnosis of carbon hotspots, and sensitivity to good low carbon designs, making it an increasingly popular carbon management tool in Taiwan. To date, about thirty projects have been awarded BCFES-LCBA certification, and the assessment has become mandatory in some cities.
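
To illustrate the "multiplications and additions" character of a dynamic-EUI estimate, here is a hedged sketch; the baseline end-use intensities, efficiency factors, and grid emission factor are invented placeholders, not LCBA coefficients.

```python
# Assumed baseline end-use intensities, kWh/(m2*yr) -- illustrative only
BASE_EUI = {"ac": 60.0, "lighting": 40.0, "equipment": 50.0}
GRID_FACTOR = 0.5   # kgCO2e/kWh, assumed

def annual_operational_carbon(area_m2, envelope_f, ac_f, lighting_f, equip_f):
    """EUI-style estimate: each end-use baseline is scaled by its designed
    efficiency level (1.0 = baseline, < 1.0 = better than baseline),
    summed, then multiplied by floor area and a grid emission factor."""
    eui = (BASE_EUI["ac"] * envelope_f * ac_f   # envelope quality acts on AC
           + BASE_EUI["lighting"] * lighting_f
           + BASE_EUI["equipment"] * equip_f)   # kWh/(m2*yr)
    return eui * area_m2 * GRID_FACTOR          # kgCO2e/yr

# Example: a 10,000 m2 office with efficient envelope, AC, and lighting
print(f"{annual_operational_carbon(10_000, 0.9, 0.85, 0.8, 0.95):.0f} kgCO2e/yr")
```

The point of such a formulation is the feedback loop: a designer can immediately see which end-use term dominates the footprint and where an improved efficiency level pays off most.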

Keywords: building carbon footprint, life cycle assessment, energy use intensity, building energy

Procedia PDF Downloads 129
16903 Re-Evaluation of Field X Located in Northern Lake Albert Basin to Refine the Structural Interpretation

Authors: Calorine Twebaze, Jesca Balinga

Abstract:

Field X is located on the eastern shores of L. Albert, Uganda, on the rift flank, where the gross sedimentary fill is typically less than 2,000 m. The field was discovered in 2006 and encountered about 20.4 m of net pay across three (3) stratigraphic intervals within the discovery well. The field covers an area of 3 km2, with the structural configuration comprising a 3-way dip-closed hanging wall anticline that seals against the basement to the southeast along the bounding fault. Field X had been mapped on reprocessed 3D seismic data, originally acquired in 2007 and reprocessed in 2013. The seismic data quality is good across the field, and the reprocessing reduced the uncertainty in the location of the bounding fault and enhanced the lateral continuity of reservoir reflectors. The current study was a re-evaluation of Field X to refine the fault interpretation and understand the structural uncertainties associated with the field. The seismic data and three (3) well datasets were used during the study. The evaluation followed standard workflows using Petrel software and structural attribute analysis, spanning seismic-to-well ties, structural interpretation, and structural uncertainty analysis. Analysis of the well ties generated for the three wells provided a geophysical interpretation consistent with the geological picks. The generated time-depth curves showed a general increase in velocity with burial depth; however, the separation in curve trends observed below 1,100 m was mainly attributed to minimal lateral variation in velocity between the wells. In addition to attribute analysis, three velocity modeling approaches were evaluated: the time-depth curve, Vo + kZ, and average velocity methods. The generated models were calibrated at well locations using well tops to obtain the best velocity model for Field X. The time-depth method resulted in the most reliable depth surfaces, with good structural coherence between the TWT and depth maps and minimal errors of 2 to 5 m at well locations. Both the NNE-SSW rift border fault and the minor faults in the existing interpretation were re-evaluated; in addition, the new interpretation delineated an E-W trending fault in the northern part of the field that had not been interpreted before. The fault was interpreted at all stratigraphic levels and thus propagates from the basement to the surface and is active today. It was also noted that the field is only lightly faulted, with more faults in its deeper parts. The major structural uncertainties defined included: 1) the time horizons, due to reduced data quality, especially in the deeper parts of the structure, for which an error equal to one-third of the reflection time thickness was assumed; 2) checkshot analysis, which showed varying velocities within the wells and thus varying depth values for each well; and 3) the very few average velocity points available from the limited wells, which produced a pessimistic average velocity model.
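
For reference, the Vo + kZ method named above has a closed-form time-depth relation; the hedged sketch below (generic, not from the study, with assumed Vo and k values) converts two-way time to depth and contrasts it with the average-velocity method.

```python
import numpy as np

# Vo + kZ method: instantaneous velocity grows linearly with depth,
# v(z) = v0 + k*z, so integrating dz/dt = v0 + k*z over one-way time gives
# z(t) = (v0 / k) * (exp(k * t) - 1).  v0 and k are fitted from checkshots.

def depth_vo_kz(twt_s, v0=1800.0, k=0.6):   # assumed v0 (m/s) and k (1/s)
    owt = np.asarray(twt_s) / 2.0           # one-way time
    return v0 / k * (np.exp(k * owt) - 1.0)

def depth_avg_velocity(twt_s, v_avg=2000.0):  # assumed average velocity (m/s)
    return v_avg * np.asarray(twt_s) / 2.0
```

Calibration then amounts to comparing each method's predicted depth against the well tops and keeping the model with the smallest residuals, which is how the 2 to 5 m errors quoted above would be measured.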

Keywords: 3D seismic data interpretation, structural uncertainties, attribute analysis, velocity modelling approaches

Procedia PDF Downloads 40
16902 Surface Erosion and Slope Stability Assessment of Cut and Fill Slope

Authors: Kongrat Nokkaew

Abstract:

This article assesses the surface erosion and stability of cut-and-fill slopes in the excavation of a detention basin in Kalasin Province, Thailand. The large excavation project was built to enlarge the detention basin in order to relieve the repeated flooding and drought that usually occur in this area. However, at the end of the first rainstorm season, severe erosion and slope failures were observed to be widespread. After investigation, the severity of erosion and slope failure was classified into five levels, from sheet erosion (Level 1), rill erosion (Levels 2-3), and gully erosion (Level 4) to slope failure (Level 5), in order to propose slope remediation. The preliminary investigation showed that lack of runoff control was the major factor in the surface erosion, while insufficient compaction of the fill slopes led to the slope failures. The slope stability of four selected slope failures was back-calculated using the Simplified Bishop method with SEEP/W. The results show that the factor of safety of slopes located on non-plastic sand was less than one, indicating the instability of the embankment slopes. This analysis agreed well with the failures observed in the field.
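
For reference, the Simplified Bishop method solves an implicit equation for the factor of safety by iteration, since FS appears on both sides. The sketch below is a standard textbook implementation (not the authors' code), taking per-slice geometry and strength parameters as inputs.

```python
import numpy as np

def bishop_fs(W, alpha, b, c, phi, u, tol=1e-6):
    """Simplified Bishop factor of safety for a circular slip surface.
    Per-slice arrays: weight W (kN/m), base inclination alpha (rad),
    width b (m), cohesion c (kPa), friction angle phi (rad), pore
    pressure u (kPa)."""
    W, alpha, b, c, phi, u = map(np.asarray, (W, alpha, b, c, phi, u))
    fs = 1.0                                 # initial guess
    for _ in range(100):
        m_alpha = np.cos(alpha) * (1 + np.tan(alpha) * np.tan(phi) / fs)
        resisting = np.sum((c * b + (W - u * b) * np.tan(phi)) / m_alpha)
        fs_new = resisting / np.sum(W * np.sin(alpha))
        if abs(fs_new - fs) < tol:
            return fs_new
        fs = fs_new
    return fs
```

In a back-analysis like the one described, the pore pressures u come from the seepage solution (here, SEEP/W), and FS < 1 at the observed slip surfaces confirms the instability.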

Keywords: surface erosion, slope stability, detention basin, cut and fill

Procedia PDF Downloads 349
16901 Protein Derived Biodegradable Food Packaging Material from Poultry By-Product

Authors: Muhammad Zubair, Aman Ullah, Jianping Wu

Abstract:

Over the last decades, petroleum-derived synthetic polymers such as polyethylene terephthalate, polyvinyl chloride, polyethylene, polypropylene, and polystyrene have been used extensively in the field of food packaging, and most are non-degradable. Biopolymers are a good fit for single-use or short-lived products such as food packaging. Spent hens are a poultry by-product of little economic value, and their disposal is environmentally harmful. Through the current study, we have explored the possibility of transforming proteins from spent fowl into green food packaging material. Proteins from spent fowl were extracted within 1 hour using a pH-shift method, with a recovery of about 74%. Different plasticizers were tried, including glycerol, sorbitol, glutaraldehyde, 1,2-ethylene glycol, and 1,2-butanediol; glycerol was the best plasticizer among them. A naturally occurring and non-toxic cross-linking agent, chitosan, was used to form a chitosan/glycerol/protein blend by casting and compression molding techniques. The mechanical properties were characterized using a tensile strength analyzer. Nano-reinforcement with a homogeneous dispersion of nanoparticles led to improved physical properties, suggesting that these materials have great potential for food packaging applications.

Keywords: differential scanning calorimetry, dynamic mechanical analysis, scanning electron microscopy, spent hen

Procedia PDF Downloads 267
16900 Improvements in Transient Testing in The Transient REActor Test (TREAT) with a Choice of Filter

Authors: Harish Aryal

Abstract:

The safe and reliable operation of nuclear reactors has always been one of the topmost priorities in the nuclear industry. Transient testing allows us to understand the time-dependent behavior of the neutron population in response to either a planned change in the reactor conditions or unplanned circumstances. These unforeseen conditions might occur due to sudden reactivity insertions, feedback, power excursions, instabilities, and accidents. To study such behavior, we need transient testing, which is like car crash testing used to estimate the durability and strength of a car design. In nuclear design, such transient testing can simulate a wide range of accidents due to sudden reactivity insertions and helps to study the feasibility and integrity of the fuel to be used in certain reactor types. This testing involves a high neutron flux environment and real-time imaging technology, with advanced instrumentation of appropriate accuracy and resolution to study fuel slumping behavior. With the aid of transient testing and adequate imaging tools, it is possible to test the safety basis for reactor and fuel designs, which serves as a gateway to licensing advanced reactors in the future. To that end, it is crucial to fully understand advanced imaging techniques, both analytically and via simulations. This paper presents an innovative method of supporting real-time imaging of fuel pins and other structures during transient testing. The major fuel-motion detection device studied in this work is the hodoscope, which requires collimators. This paper provides 1) an MCNP model and simulation of a Transient Reactor Test (TREAT) core with the central fuel element replaced by a slotted fuel element that provides an open path between test samples and a hodoscope detector, and 2) a choice of filter that improves image resolution.

Keywords: hodoscope, transient testing, collimators, MCNP, TREAT, hodogram, filters

Procedia PDF Downloads 60
16899 Comparison of Methyl Orange and Malachite Green Dye Removal by GO, rGO, MWCNT, MWCNT-COOH, and MWCNT-SH as Adsorbents

Authors: Omid Moradi, Mostafa Rajabi

Abstract:

Graphene oxide (GO), reduced graphene oxide (rGO), multi-walled carbon nanotubes (MWCNT), carboxyl-functionalized multi-walled carbon nanotubes (MWCNT-COOH), and thiol-functionalized multi-walled carbon nanotubes (MWCNT-SH) were used as efficient adsorbents for the rapid removal of the two dyes methyl orange (MO) and malachite green (MG) from the aqueous phase. The impact of several influential parameters, such as initial dye concentration, contact time, temperature, and initial solution pH, was studied and optimized. The optimized contact times for the adsorption of methyl orange on GO, rGO, MWCNT, MWCNT-COOH, and MWCNT-SH surfaces were determined to be 100, 100, 60, 25, and 60 min, respectively, and for malachite green 100, 100, 60, 15, and 60 min, respectively. The maximum removal efficiency for methyl orange on GO, rGO, MWCNT, MWCNT-COOH, and MWCNT-SH surfaces occurred at optimized pH values of 3, 3, 6, 2, and 6, respectively, and for malachite green at optimized pH values of 3, 3, 6, 9, and 6, respectively. The effect of temperature showed that the adsorption of both dyes on the GO, rGO, MWCNT, and MWCNT-SH surfaces was endothermic, while the adsorption of both dyes on the MWCNT-COOH surface was exothermic. With increasing initial methyl orange concentration, the adsorption capacity decreased on the GO surface and increased on the rGO, MWCNT, MWCNT-COOH, and MWCNT-SH surfaces, while with increasing initial malachite green concentration, the adsorption capacity increased on all five surfaces.
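
The concentration dependence reported above is commonly quantified by fitting an equilibrium isotherm. The abstract does not name one, so the Langmuir fit below is purely an assumed illustration of how such data are usually reduced to a maximum capacity and an affinity constant.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, q_max, k_l):
    """Langmuir isotherm: qe = q_max * K_L * Ce / (1 + K_L * Ce),
    with q_max the monolayer capacity (mg/g) and K_L the affinity (L/mg)."""
    return q_max * k_l * ce / (1.0 + k_l * ce)

# ce: equilibrium dye concentrations (mg/L); qe: adsorption capacities (mg/g)
# (q_max, k_l), _ = curve_fit(langmuir, ce, qe, p0=[50.0, 0.1])
```

Fitting each adsorbent-dye pair this way would make the capacity trends with concentration directly comparable across the five surfaces.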

Keywords: adsorption, graphene oxide, reduced graphene oxide, multi-walled carbon nanotubes, methyl orange, malachite green, removal

Procedia PDF Downloads 366
16898 A Long-Standing Methodology Quest Regarding Commentary of the Qur’an: Modern Debates on the Function of Hermeneutics in Qur’an Scholarship in Turkey

Authors: Merve Palanci

Abstract:

This paper aims to reveal and analyze methodology debates on Qur’an commentary in Turkish scholarship and to draw sound conclusions about the current situation, with reference to the literature evolving around the credibility of hermeneutics for Qur’an commentary and the methodological connotations related to it, together with other modern approaches to the Qur’an. It is fair to say that Tafseer, constituting one of the main branches of the basic Islamic sciences, has drawn great attention from both Muslim and non-Muslim scholars for a long time. With the emplacement of an acute junction between the natural and social sciences in the post-Enlightenment period, this interest seems to have paved the way for methodology discussions conducted in theology circles, which occupy a noticeable slot in the Tafseer literature as well. A panoramic glance at the classical treatises on the methodology of Tafseer, namely Usul al-Tafseer, leads the reader to the conclusion that these classics are intrinsically aimed at introducing the Qur’an and the early history of its formation as a corpus, and at providing a better understanding of its content. To illustrate, the earliest extant methodology work for Qur’an commentary, al-Aql wa’l Fahm al-Qur’an by Harith al-Muhasibi, covers content that deals with the Qur’an’s rhetoric, its muhkam and mutashabih, abrogation, etc. Most of the themes in question evidently share a common ground: understanding the Scripture and producing an accurate commentary built on this preliminary phenomenon of understanding. The content of other renowned works with an overtone of Tafseer methodology, such as Funun al-Afnan and al-Iqsir fi Ilm al-Tafseer, and of the succeeding al-Itqan and al-Burhan, is also rich in hints related to the preliminary phenomenon of understanding. However, these works are not eligible to be classified as full-fledged methodology manuals assuring a true understanding of the Qur’an, and hermeneutics is believed to supply substantial data applicable to Qur’an commentary, as it deals with the nature of understanding itself. Referring to the latest tendencies in Tafseer methodology, this paper envisages centralizing the hermeneutical debates in modern scholarship of Qur’an commentary and the incentives that lead scholars to apply hermeneutics in the Tafseer literature. Following these incentives, the study involves three parts. In the introduction, the paper introduces the key features of the classical methodology works in general terms and traces the main methodological shifts of modern times in Qur’an commentary. To this end, the revisionist ecole, scientific Qur’an commentary ventures, and thematic Qur’an commentary are included and analysed briefly; historical-critical commentary on the Qur’an, as it bears a close relationship with hermeneutics, is handled predominantly. The second part addresses the hermeneutical nature of understanding the Scripture, traces back the beginning of the hermeneutics debates in Tafseer, and manifests Fazlur Rahman’s (d. 1988) influence to establish a theoretical bridge. In the following part, reactions against the application of hermeneutics in Tafseer activity, as well as pro-hermeneutics works, are revealed through cross-references to the prominent figures of both, and the literature in question in theology scholarship in Turkey is explored critically.

Keywords: hermeneutics, Tafseer, methodology, Ulum al- Qur’an, modernity

Procedia PDF Downloads 62
16897 Design Optimization of a Compact Quadrupole Electromagnet for CLS 2.0

Authors: Md. Armin Islam, Les Dallin, Mark Boland, W. J. Zhang

Abstract:

This paper reports a study on the optimal magnetic design of a compact quadrupole electromagnet for the Canadian Light Source (CLS 2.0). The aim of the design is to obtain a quadrupole with low relative higher-order harmonics and better field quality. The design problem was formulated as an optimization model in which the objective function is the higher-order harmonics (multipole errors) and the variable to be optimized is the material distribution on the pole. Higher-order harmonics arise in the quadrupole because the ideal hyperbolic pole profile is truncated at a certain point to make the pole. In this project, these harmonics have been optimized both transversely and longitudinally by adjusting material on the poles in a controlled way. Finite element analysis (FEA) was conducted for the optimization. Lower higher-order harmonic amplitudes and better field quality were achieved through the optimization. On the basis of the optimized magnetic design, electrical and cooling calculations were performed for the magnet.
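
For context, multipole errors are conventionally obtained as the harmonic content of the field sampled on a reference circle inside the bore; the hedged sketch below (a generic post-processing step, not the authors' FEA workflow) extracts them with an FFT.

```python
import numpy as np

def multipole_coefficients(b_theta, n_max=10):
    """Harmonic content of the field sampled at equal angles on a reference
    circle: b_theta[k] = B(theta_k), k = 0..N-1. Returns normal (sin) and
    skew (cos) coefficients for n = 1..n_max; index 0 is the dipole (n = 1),
    index 1 the main quadrupole (n = 2). For an ideal quadrupole, the
    allowed higher harmonics are n = 6, 10, ..."""
    N = len(b_theta)
    spectrum = np.fft.rfft(b_theta) / (N / 2.0)
    normal = -spectrum.imag[1:n_max + 1]
    skew = spectrum.real[1:n_max + 1]
    return normal, skew

# Relative errors in "units" of 1e-4 of the main quadrupole component:
# normal, skew = multipole_coefficients(B_samples)
# units = 1e4 * np.hypot(normal, skew) / np.hypot(normal[1], skew[1])
```

An objective function built from these relative amplitudes is what an FEA-driven optimizer of the pole material distribution would minimize.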

Keywords: drift, electrical, and cooling calculation, integrated field, magnetic field gradient, multipole errors, quadrupole

Procedia PDF Downloads 134
16896 The Effect of Connection Form on the Seismic Behavior of Portal Frames

Authors: Kiavash Heidarzadeh

Abstract:

The seismic behavior of portal frames is mainly governed by the shape of their joints. In these structures, vertical and inclined connections are the two general forms of connection, and the shape of the connections can make a difference in the seismic response of portal frames. Hence, in this paper, as a first step, the non-linear performance of portal frames with vertical and inclined connections is investigated by monotonic analysis, with the effect of section sizes also considered. For comparison, hysteresis curves were evaluated for two model frames with the two different forms of connections, each model with three different column and beam sizes; the other geometrical parameters were held constant. In the second step, for each model, an appropriate section size was selected from the previous step, and the seismic behavior of each model was analyzed by the time history method under three near-fault earthquake records. The finite element software ABAQUS was used for the simulation and analysis of the samples. The outputs show that connection form can affect the reaction forces of portal frames under earthquake loads, and that the load capacity of frames with vertical connections is greater than that of frames with inclined connections.

Keywords: inclined connections, monotonic, portal frames, seismic behavior, time history, vertical connections

Procedia PDF Downloads 216