Search results for: CHIC Analysis V 1.1 Software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30488

25658 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques

Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah

Abstract:

Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or under-estimation of construction cost. The development of a standard set of measurement rules that are understandable by all those involved in a construction project has not fully solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological and machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges encountered in this exploratory study are also reported and recommendations for future studies proposed.

Keywords: BIM, construction projects, cost estimation, NRM, ontology

Procedia PDF Downloads 551
25657 E-teaching Barriers: A Survey from Shanghai Primary School Teachers

Authors: Liu Dan

Abstract:

Until last year, it was considered either unnecessary or impossible for primary school students to be taught online. A large body of e-learning and e-teaching research has focused on adult learners, andragogy and technology; primary school education, however, faces many problems that remain to be solved. Therefore, this research aims to explore the barriers to and influential factors in online teaching for K-12 students from teachers’ perspectives, and to discuss an e-pedagogy that is suitable for primary school students and teachers. Eight hundred and ninety-six teachers from 10 primary schools in Shanghai were invited to participate in a questionnaire survey. Data were analysed by hierarchical regression, and the results highlight three significant barriers teachers face in online teaching: the existing systems are deficient in emotional interaction, teachers’ attitudes towards the technology are negative, and present teacher training lacks systematic e-pedagogy guidance. The barriers identified by this study will help software designers (E-lab) develop tools that allow for flexible and evolving pedagogical approaches whilst providing an easy entry point for cautious newcomers, so that teachers are free to engage in e-teaching at the pedagogical and disciplinary levels and to enhance their repertoire of teaching practices.
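As a rough illustration of the analysis step described above, the following Python sketch shows a blockwise (hierarchical) regression in which demographic predictors are entered first and barrier variables second; every variable name and data value here is a hypothetical placeholder, not the Shanghai survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey data: each row is a teacher; all values are placeholders.
# Outcome: perceived e-teaching difficulty.
rng = np.random.default_rng(42)
n = 200
df = pd.DataFrame({
    "age":                   rng.integers(25, 60, n),
    "teaching_years":        rng.integers(1, 35, n),
    "emotional_interaction": rng.normal(3.0, 0.8, n),  # perceived lack of interaction
    "attitude_to_tech":      rng.normal(3.2, 0.9, n),
    "training_guidance":     rng.normal(2.8, 0.7, n),
})
df["difficulty"] = (0.4 * df.emotional_interaction - 0.3 * df.attitude_to_tech
                    - 0.2 * df.training_guidance + rng.normal(0, 0.5, n))

# Hierarchical (blockwise) regression: demographics first, then barrier variables.
step1 = smf.ols("difficulty ~ age + teaching_years", data=df).fit()
step2 = smf.ols("difficulty ~ age + teaching_years + emotional_interaction"
                " + attitude_to_tech + training_guidance", data=df).fit()
print(f"R2 step 1 = {step1.rsquared:.3f}, R2 step 2 = {step2.rsquared:.3f}")
print(f"Delta R2  = {step2.rsquared - step1.rsquared:.3f}")  # contribution of the barrier block
```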

Keywords: online teaching barriers (OTB), e-teaching, primary school, teachers, technology

Procedia PDF Downloads 201
25656 Analyses and Optimization of Physical and Mechanical Properties of Direct Recycled Aluminium Alloy (AA6061) Wastes by ANOVA Approach

Authors: Mohammed H. Rady, Mohd Sukri Mustapa, S Shamsudin, M. A. Lajis, A. Wagiman

Abstract:

The present study is aimed at investigating the microhardness and density of aluminium alloy chips subjected to various settings of preheating temperature and preheating time. Three values of preheating temperature were taken: 450 °C, 500 °C, and 550 °C. Likewise, three values of preheating time were chosen: 1, 2, and 3 hours. The influences of the process parameters (preheating temperature and time) were analyzed using a Design of Experiments (DOE) approach, whereby a full factorial design with center point analysis was adopted. The design comprised 11 runs in total: a two-factor full factorial design with 3 center points. The responses were microhardness and density. The results showed that density and microhardness increased with decreasing preheating temperature. The results also showed that the preheating temperature is more important to control than the preheating time for the microhardness analysis, while both the preheating temperature and the preheating time are important for the density analysis. It can be concluded that setting the temperature at 450 °C for 1 hour produced the optimum responses.
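A minimal sketch of this kind of two-factor DOE analysis is shown below; the layout assumes replicated corner runs plus three center points to reach 11 runs, and the response values are illustrative placeholders rather than the measured data.

```python
import pandas as pd
import statsmodels.formula.api as smf
import statsmodels.api as sm

# Hypothetical 11-run design: replicated 2^2 factorial corners + 3 center points.
# Temperatures in deg C, times in hours; responses are placeholder values only.
runs = pd.DataFrame({
    "temp": [450, 550, 450, 550, 450, 550, 450, 550, 500, 500, 500],
    "time": [1,   1,   3,   3,   1,   1,   3,   3,   2,   2,   2],
    "microhardness": [62.1, 55.4, 60.8, 54.0, 61.7, 56.0, 60.2, 53.5, 57.9, 58.3, 58.1],
})

# Two-factor model with interaction, as in a standard DOE/ANOVA analysis.
model = smf.ols("microhardness ~ temp * time", data=runs).fit()
anova_table = sm.stats.anova_lm(model, typ=2)  # Type-II ANOVA table
print(anova_table)      # p-values indicate which factor dominates the response
print(model.params)     # signs show, e.g., hardness falling as temperature rises
```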

Keywords: AA6061, density, DOE, hot extrusion, microhardness

Procedia PDF Downloads 349
25655 Lateral Buckling of Nanoparticle Additive Composite Beams

Authors: Gürkan Şakar, Akgün Alsaran, Emrah E. Özbaldan

Abstract:

In this study, lateral buckling analysis of composite beams with particle additives was carried out experimentally and numerically. The effects of particle type and particle addition ratio on the buckling loads of composite beams were determined. The numerical studies were performed with the ANSYS package. In the analyses, a clamped-free boundary condition was assumed. The load-carrying capabilities of the composite beams were influenced by the different particle types and particle addition ratios.

Keywords: lateral buckling, nanoparticle, composite beam, numeric analysis

Procedia PDF Downloads 475
25654 Introducing Thermodynamic Variables through Scientific Inquiry for Engineering Students

Authors: Paola Utreras, Yazmina Olmos, Loreto Sanhueza

Abstract:

This work shows how the learning of physics is enriched by scientific inquiry practices, achieving learning that draws on higher-level cognitive skills. The activities, which were carried out with third-semester students of the Faculty of Engineering Sciences of the Austral University of Chile, focused on understanding the nature of thermodynamic variables and how they relate to each other, through the analysis of atmospheric data obtained at the Miraflores meteorological station located on campus. The proposed activities consisted of the elaboration of time series and linear analysis of variables, as well as the analysis of frequencies and periods. From their results, the students reached conclusions about the nature of the thermodynamic variables studied and the relationships between them, and finally made their results public in a report using scientific writing standards. It is observed that introducing topics that are familiar, interesting, and relevant to students' daily lives allows a better understanding of the subjects, which is reflected in higher levels of approval of and motivation for the subject.

Keywords: basic sciences, inquiry-based learning, scientific inquiry, thermodynamics

Procedia PDF Downloads 258
25653 Comparative Analysis of Medical Tourism Industry among Key Nations in Southeast Asia

Authors: Nur A. Azmi, Suseela D. Chandran, Fadilah Puteh, Azizan Zainuddin

Abstract:

Medical tourism has become a global phenomenon in developed and developing countries in the 21st century. Medical tourism is defined as an activity in which individuals travel from one country to another to seek or receive medical healthcare. Based on the global trend, the number of medical tourists is increasing annually, especially in the Southeast Asia (SEA) region. Since the establishment of the Association of Southeast Asian Nations (ASEAN) in 1967, the SEA nations have worked towards regional integration in medical tourism. Medical tourism in SEA has become the third-largest sector contributing to economic development. Previous research has demonstrated several factors that affect the development of medical tourism. However, despite the literature published on SEA's medical tourism in the last ten years, there continues to be a scarcity of research on the niche areas of each of the SEA countries. Hence, this paper is significant in enriching the literature in the field of medical tourism, particularly in showcasing the niche markets of medical tourism among the best SEA players, namely Singapore, Thailand, Malaysia and Indonesia. This paper also contributes by offering a comparative analysis of whether the said nations are complementing or competing with each other in the medical tourism sector. This, in turn, will increase the availability of information on medical tourism in the SEA region. The data were collected through in-depth interviews with various stakeholders and private hospitals. The data were then analyzed using two approaches, namely thematic analysis (interview data) and document analysis (secondary data). The paper concludes by arguing that the ASEAN countries have specific niche markets with which to promote their medical tourism industries. This paper also concludes that these key nations complement each other in the industry. In addition, the medical tourism sector in the SEA region offers greater prospects for market development and expansion, as witnessed by the emergence of new key players from other nations.

Keywords: healthcare services, medical tourism, medical tourists, SEA region, comparative analysis

Procedia PDF Downloads 144
25652 Determination of the Phytochemical Composition and Pharmacokinetics of Whole Coffee Fruit Caffeine Extract by Liquid Chromatography-Tandem Mass Spectrometry

Authors: Boris Nemzer, Nebiyu Abshiru, Z. B. Pietrzkowski

Abstract:

Coffee cherry is one of the most ubiquitous agricultural commodities and possesses nutritional and health-beneficial properties. Between the two most widely used coffee cherries, Coffea arabica (Arabica) and Coffea canephora (Robusta), Coffea arabica remains superior due to its sensory properties and, therefore, remains in great demand in the global coffee market. In this study, the phytochemical contents and pharmacokinetics of Coffeeberry® Energy (CBE), a commercially available Arabica whole coffee fruit caffeine extract, are investigated. For phytochemical screening, 20 mg of CBE was dissolved in an aqueous methanol solution for analysis by mass spectrometry (MS). Quantification of the caffeine and chlorogenic acid (CGA) contents of CBE was performed using HPLC. For the bioavailability study, serum samples were collected from human subjects before ingestion and at 1, 2 and 3 h post-ingestion of 150 mg of CBE extract. Protein precipitation and extraction were carried out using methanol. Identification of compounds was performed using an untargeted metabolomic approach on a Q-Exactive Orbitrap MS coupled to reversed-phase chromatography. Data processing was performed using Thermo Scientific Compound Discoverer 3.3 software. Phytochemical screening identified a total of 170 compounds, including organic acids, phenolic acids, CGAs, diterpenoids and hydroxytryptamine. Caffeine and CGAs make up more than 70% and 9% of the total CBE composition, respectively. For the serum samples, a total of 82 metabolites representing 32 caffeine- and 50 phenolic-derived metabolites were identified. Volcano plot analysis revealed 32 differential metabolites (24 caffeine- and 8 phenolic-derived) that showed an increase in serum level post-CBE dosing. Caffeine, uric acid, and trimethyluric acid isomers exhibited a 4- to 10-fold increase in serum abundance post-dosing. 7-Methyluric acid, 1,7-dimethyluric acid, paraxanthine and theophylline exhibited a minimum of a 1.5-fold increase in serum level. Among the phenolic-derived metabolites, iso-feruloyl quinic acid isomers (3-, 4- and 5-iFQA) showed the highest increase in serum level. These compounds were essentially absent in serum collected before dosing. More interestingly, the iFQA isomers were not originally present in the CBE extract, as our phytochemical screen did not identify these compounds. This suggests the potential formation of the isomers during the digestion and absorption processes. Pharmacokinetic parameters (Cmax, Tmax and AUC0-3h) of the caffeine- and phenolic-derived metabolites were also investigated. Caffeine was rapidly absorbed, reaching a maximum concentration (Cmax) of 10.95 µg/ml in just 1 hour. Thereafter, the caffeine level steadily dropped from the peak, although it did not return to baseline within the 3-hour dosing period. The disappearance of caffeine from circulation was mirrored by the rise in the concentration of its methylxanthine metabolites. Similarly, the serum concentrations of the iFQA isomers steadily increased, reaching their maxima (Cmax: 3-iFQA, 1.54 ng/ml; 4-iFQA, 2.47 ng/ml; 5-iFQA, 2.91 ng/ml) at a tmax of 1.5 hours. The isomers remained well above baseline during the 3-hour dosing period, keeping them in circulation long enough for absorption into the body. Overall, the current study provides evidence of the potential health benefits of a uniquely formulated whole coffee fruit product.
Consumption of this product resulted in a distinct serum profile of bioactive compounds, as demonstrated by the more than 32 metabolites that exhibited a significant change in systemic exposure.
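As a rough illustration of how the non-compartmental parameters above (Cmax, Tmax, AUC0-3h) are computed, the following Python sketch applies the linear trapezoidal rule to a serum concentration-time profile; apart from the ~10.95 µg/ml caffeine peak at 1 h quoted in the abstract, all values are hypothetical placeholders.

```python
import numpy as np

# Illustrative serum caffeine concentrations (ug/mL) at 0, 1, 2 and 3 h post-dose;
# only the 1-h peak of ~10.95 ug/mL is taken from the abstract, the rest are placeholders.
t = np.array([0.0, 1.0, 2.0, 3.0])   # hours
c = np.array([0.2, 10.95, 8.1, 5.6])  # ug/mL

cmax = c.max()                                        # maximum observed concentration
tmax = t[c.argmax()]                                  # time of maximum concentration
auc_0_3h = np.sum((c[1:] + c[:-1]) / 2 * np.diff(t))  # AUC(0-3h), linear trapezoidal rule

print(f"Cmax = {cmax:.2f} ug/mL at Tmax = {tmax:.1f} h, AUC(0-3h) = {auc_0_3h:.2f} ug*h/mL")
```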

Keywords: phytochemicals, mass spectrometry, pharmacokinetics, differential metabolites, chlorogenic acids

Procedia PDF Downloads 69
25651 Analysis of Microstructure around Opak River Pleret Area, Bantul Regency, Special Region of Yogyakarta Province, Indonesia, as a Result of Opak Fault Reactivation, Using Stereographic Method

Authors: Gayus Pratama Polunggu, Pamela Felita Adibrata, Hafidh Fathur Riza

Abstract:

Opak Fault is a large fault that extends from the northeast to the southwest of the Yogyakarta Special Region. Opak Fault allegedly became active again after the 2006 Yogyakarta earthquake, about eleven years ago. Because Opak Fault is a major fault, its reactivation gives rise to microstructures around the Opak River. These microstructures reveal a direction of force different from that of the Opak Fault itself, because the trigger for their emergence is the reactivation of the Opak Fault. In other words, these microstructures mark potentially severe weak areas during a tectonic disaster. This research was conducted to determine the impact of the reactivation of the Opak Fault, which triggered the emergence of microstructures around the Opak River; this is very useful as disaster mitigation information for the research area. The research combined a literature study, in the form of structural geology journals, with a field study. The method used is laboratory analysis in the form of stereographic analysis.

Keywords: Opak fault, reactivation, microstructure, stereographic

Procedia PDF Downloads 184
25650 Qualitative Evaluation of the Morris Collection Conservation Project at the Sainsbury Centre for Visual Arts in the Context of Agile, Lean and Hybrid Project Management Approaches

Authors: Maria Ledinskaya

Abstract:

This paper examines the Morris Collection Conservation Project at the Sainsbury Centre for Visual Arts in the context of Agile, Lean, and Hybrid project management. It is part case study and part literature review. To date, relatively little has been written about non-traditional project management approaches in heritage conservation. This paper seeks to introduce Agile, Lean, and Hybrid project management concepts from business, software development, and manufacturing fields to museum conservation, by referencing their practical application on a recent museum-based conservation project. The Morris Collection Conservation Project was carried out in 2019-2021 in Norwich, UK, and concerned the remedial conservation of around 150 Abstract Constructivist artworks bequeathed to the Sainsbury Centre for Visual Arts by private collectors Michael and Joyce Morris. The first part introduces the chronological timeline and key elements of the project. It describes a medium-size conservation project of moderate complexity, which was planned and delivered in an environment with multiple known unknowns – unresearched collection, unknown condition and materials, unconfirmed budget. The project was also impacted by the unknown unknowns of the COVID-19 pandemic, such as indeterminate lockdowns, and the need to accommodate social distancing and remote communications. The author, a staff conservator at the Sainsbury Centre who acted as project manager on the Morris Collection Conservation Project, presents an incremental, iterative, and value-based approach to managing a conservation project in an uncertain environment. Subsequent sections examine the project from the point of view of Traditional, Agile, Lean, and Hybrid project management. The author argues that most academic writing on project management in conservation has focussed on a Traditional plan-driven approach – also known as Waterfall project management – which has significant drawbacks in today’s museum environment, due to its over-reliance on prediction-based planning and its low tolerance to change. In the last 20 years, alternative Agile, Lean and Hybrid approaches to project management have been widely adopted in software development, manufacturing, and other industries, although their recognition in the museum sector has been slow. Using examples from the Morris Collection Conservation Project, the author introduces key principles and tools of Agile, Lean, and Hybrid project management and presents a series of arguments on the effectiveness of these alternative methodologies in museum conservation, as well as the ethical and practical challenges to their implementation. These project management approaches are discussed in the context of consequentialist, relativist, and utilitarian developments in contemporary conservation ethics, particularly with respect to change management, bespoke ethics, shared decision-making, and value-based cost-benefit conservation strategy. The author concludes that the Morris Collection Conservation Project had multiple Agile and Lean features which were instrumental to the successful delivery of the project. These key features are identified as distributed decision making, a co-located cross-disciplinary team, servant leadership, focus on value-added work, flexible planning done in shorter sprint cycles, light documentation, and emphasis on reducing procedural, financial, and logistical waste. 
Overall, the author’s findings point largely in favour of a Hybrid model which combines traditional and alternative project processes and tools to suit the specific needs of the project.

Keywords: project management, conservation, waterfall, agile, lean, hybrid

Procedia PDF Downloads 99
25649 Polymerase Chain Reaction Analysis and Random Amplified Polymorphic DNA of Agrobacterium Tumefaciens

Authors: Abeer M. Algeblawi

Abstract:

Fifteen isolates of Agrobacterium tumefaciens were obtained from crown gall samples collected from six locations: Tripoli, Alzahra, Ain-Zara, Alzawia and Alazezia in Libya, from Grape (Vitis vinifera L.), Pear (Pyrus communis L.) and Peach (Prunus persica L.) trees, and Alexandria in Egypt, from Guava (Psidium guajava L.) trees, Artichoke (Cynara cardunculus L.) and Sugar beet (Beta vulgaris L.). Total DNA was extracted from eight isolates, and six isolates were identified using Polymerase Chain Reaction (PCR) analysis and the Random Amplified Polymorphic DNA (RAPD) technique. High similarity (55.5%) was observed among the eight A. tumefaciens isolates (Agro1, Agro2, Agro3, Agro4, Agro5, Agro6, Agro7, and Agro8). The PCR amplification products resulted from the use of two specific primers (virD2A and virD2C) in an analysis of six A. tumefaciens isolates obtained from different hosts. Visible bands specific to A. tumefaciens of 220 bp, 224 bp and 338 bp were produced with total DNA extracted from the bacterial cells.

Keywords: Agrobacterium tumefaciens, crown gall, identification, molecular characterization, PCR, RAPD

Procedia PDF Downloads 144
25648 Design, Control and Autonomous Trajectory Tracking of an Octorotor Rotorcraft

Authors: Seyed Jamal Haddadi, M. Reza Mehranpour, Roya Sadat Mortazavi, Zahra Sadat Mortazavi

Abstract:

The principal aim of this research is trajectory tracking and an attitude and position control scheme in real flight mode for an octorotor helicopter. For greater stability, the number of motors in this Unmanned Aerial Vehicle (UAV) is increased to eight, with two coaxial counter-rotating motors installed at the end of each arm. The dynamic model of this octorotor includes the equations of motion for translation and rotation. The controller used is a proportional-integral-derivative (PID) control loop. The proposed controller is designed to attenuate the effect of external wind disturbance and guarantee stability under this condition. The trajectory is determined by a Global Positioning System (GPS), and an ARM Cortex-M4 is used as the microprocessor. The electronic board of this UAV is designed to record all sensor data to external memory, similar to an aircraft black box. Finally, after auto-landing of the octorotor, the flight data are displayed in MATLAB, and experimental results of the proposed controller show the effectiveness of our approach on the autonomous octorotor in real conditions.
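The sketch below illustrates the kind of single-axis PID loop described above, written in Python for readability; the gains, loop rate, and toy plant response are hypothetical placeholders and not the parameters used on the actual octorotor.

```python
class PID:
    """Simple PID loop of the kind used for one attitude axis (roll, pitch or yaw)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical gains and a 250 Hz control loop; values are illustrative, not from the paper.
roll_pid = PID(kp=4.5, ki=0.8, kd=1.2, dt=0.004)
roll_setpoint_deg = 0.0
roll_measured_deg = 3.0  # e.g. an initial offset caused by a wind gust
for _ in range(5):
    torque_cmd = roll_pid.update(roll_setpoint_deg, roll_measured_deg)
    roll_measured_deg += 0.05 * torque_cmd  # toy plant response, for illustration only
    print(f"roll = {roll_measured_deg:+.2f} deg, command = {torque_cmd:+.2f}")
```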

Keywords: octorotor, design, PID controller, autonomous, trajectory tracking

Procedia PDF Downloads 304
25647 Characterization of Electrospun Carbon Nanofiber Doped Polymer Composites

Authors: Atilla Evcin, Bahri Ersoy, Süleyman Akpınar, I. Sinan Atlı

Abstract:

Ceramic, polymer and composite nanofibers are now beginning to be utilized in many fields of nanotechnology. In terms of dimensions these fibers are at the nanoscale, but because of their large surface area and microstructural characteristics they provide unique mechanical, optical, magnetic, electronic and chemical properties. For nanofiber production, electrospinning has been the most widely used technique in recent years. In this study, carbon nanofibers have been synthesized from solutions of polyacrylonitrile (PAN) in N,N-dimethylformamide (DMF) by the electrospinning method. The carbon nanofibers were stabilized by oxidation at 250 °C for 2 h in air and carbonized at 750 °C for 1 h in H2/N2. Images of the carbon nanofibers were taken with scanning electron microscopy (SEM). The images were analyzed to study the fiber morphology and to determine the distribution of the fiber diameter using FibraQuant 1.3 software. Polymer composites were then produced from a mixture of carbon nanofibers and silicone polymer. The final polymer composites were characterized by the X-ray diffraction method and by scanning electron microscopy (SEM) with energy dispersive X-ray (EDX) measurements. These results are reported and discussed. As a result, homogeneous carbon nanofibers with diameters of 100-167 nm were obtained under optimized electrospinning conditions.

Keywords: electrospinning, characterization, composites, nanofiber

Procedia PDF Downloads 394
25646 A Robust System for Foot Arch Type Classification from Static Foot Pressure Distribution Data Using Linear Discriminant Analysis

Authors: R. Periyasamy, Deepak Joshi, Sneh Anand

Abstract:

Foot posture assessment is important for evaluating foot types that cause gait and postural defects in all age groups. Although different methods are used for the classification of foot arch type in clinical/research examination, there is no clear approach for selecting the most appropriate measurement system. Therefore, the aim of this study was to develop a system for evaluating foot type, as a clinical decision-making aid for diagnosing flat and normal arches, based on the Arch Index (AI) and a foot pressure distribution parameter, the Power Ratio (PR). The accuracy of the system was evaluated for 27 subjects with ages ranging from 24 to 65 years. Foot area measurements (hindfoot, midfoot, and forefoot) were acquired simultaneously from foot pressure intensity images using the portable PedoPowerGraph system, and the images were analysed in the frequency domain to obtain the foot pressure distribution parameter (PR). From our results, we obtain 100% classification accuracy of normal and flat feet by using the linear discriminant analysis method. We observe that there is no misclassification of foot types because foot pressure distribution data are incorporated instead of only the arch index (AI). We found that the mid-foot pressure distribution ratio data and the arch index (AI) value are well correlated to foot arch type based on visual analysis. Therefore, this paper suggests that the proposed system is accurate and easy to use for determining foot arch type from the arch index (AI) together with mid-foot pressure distribution ratio data, instead of the physical area of contact alone. Hence, such a computational-tool-based system can help clinicians assess foot structure and cross-check their diagnosis of flat foot from the mid-foot pressure distribution.
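A minimal sketch of the classification step is given below: a linear discriminant analysis classifier trained on two features per subject (Arch Index and mid-foot Power Ratio). The feature values and labels are illustrative placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Illustrative feature vectors [Arch Index, mid-foot Power Ratio]; values are
# placeholders, not the study's data. Labels: 0 = normal arch, 1 = flat foot.
X = np.array([
    [0.21, 0.08], [0.23, 0.10], [0.19, 0.07], [0.24, 0.11],  # normal arch
    [0.29, 0.22], [0.31, 0.25], [0.30, 0.21], [0.33, 0.27],  # flat foot
])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

clf = LinearDiscriminantAnalysis()
clf.fit(X, y)

new_subject = np.array([[0.28, 0.20]])  # AI and PR from a new pressure image
print("predicted class:", clf.predict(new_subject))  # 1 -> flat foot
print("training accuracy:", clf.score(X, y))
```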

Keywords: arch index, computational tool, static foot pressure intensity image, foot pressure distribution, linear discriminant analysis

Procedia PDF Downloads 499
25645 The Model Establishment and Analysis of TRACE/MELCOR for Kuosheng Nuclear Power Plant Spent Fuel Pool

Authors: W. S. Hsu, Y. Chiang, Y. S. Tseng, J. R. Wang, C. Shih, S. W. Chen

Abstract:

Kuosheng nuclear power plant (NPP) is a BWR/6 plant in Taiwan. Concern for the safety of NPPs in Taiwan has grown since the Fukushima NPP disaster occurred in Japan. Hence, in order to estimate the safety of the Kuosheng NPP spent fuel pool (SFP), a safety analysis of the Kuosheng NPP SFP was performed using the TRACE, MELCOR, and SNAP codes. There were two main steps in this research. First, the Kuosheng NPP SFP models were established. Second, the transient analysis of the Kuosheng SFP was performed with TRACE and MELCOR under a cooling system failure condition (Fukushima-like condition). The results showed that the MELCOR and TRACE calculations were very similar in this case, and that fuel uncovery happened roughly on the fourth day after the failure of the cooling system. The above results indicate that the Kuosheng NPP SFP may be unsafe in the case of a long-term station blackout (SBO) situation. In addition, future calculations need to be done with other codes, such as FRAPTRAN, for the cladding calculations.

Keywords: TRACE, MELCOR, SNAP, spent fuel pool

Procedia PDF Downloads 331
25644 Existential Feeling in Contemporary Chinese Novels: The Case of Yan Lianke

Authors: Thuy Hanh Nguyen Thi

Abstract:

Since 1940, existentialism has penetrated China and continued to profoundly influence contemporary Chinese literature. Using close reading and text analysis, this article analyzes the existential feeling in Yan Lianke’s novels through various aspects: the Sisyphean sense, narrative rationalization and the viewpoint of the dead. In addition to pointing out the characteristics of the existential sensation in the writer’s novels, the analysis also provides insight into the nature and depth of contemporary Chinese society.

Keywords: Yan Lianke, existentialism, the existential feeling, contemporary Chinese literature

Procedia PDF Downloads 141
25643 Analysis and Quantification of Historical Drought for Basin Wide Drought Preparedness

Authors: Joo-Heon Lee, Ho-Won Jang, Hyung-Won Cho, Tae-Woong Kim

Abstract:

Drought is a recurrent climatic feature that occurs in virtually every climatic zone around the world. Korea experiences drought at the regional scale almost every year, mainly during the winter and spring seasons. Moreover, extremely severe droughts at the national scale have occurred at a frequency of six to seven years. Various drought indices have been developed as tools to quantitatively monitor different types of drought and are utilized in the field of drought analysis. Since drought is closely related to the climatological and topographic characteristics of drought-prone areas, the basins where droughts frequently occur need separate drought preparedness and contingency plans. In this study, an analysis using statistical methods was carried out for the historical droughts that occurred in the five major river basins in Korea, so that drought characteristics could be quantitatively investigated. It was also intended to provide information with which differentiated and customized drought preparedness plans can be established based on the basin-level analysis results. Conventional methods that quantify drought perform an evaluation by applying various drought indices. However, the evaluation results for the same drought event differ according to the analysis technique; in particular, the evaluation of a drought event differs depending on how the severity or duration of the drought is viewed in the evaluation process. Therefore, it was intended to derive a drought history for the five major river basins of Korea, the most severely affected areas, by investigating a drought magnitude that simultaneously considers severity, duration, and the damaged areas, applying drought run theory with the SPI (Standardized Precipitation Index), which efficiently quantifies meteorological drought. Furthermore, a quantitative analysis of the historical extreme droughts from various viewpoints, such as average severity, duration, and magnitude of drought, was attempted. At the same time, it was intended to quantitatively analyze the historical drought events by estimating return periods from SDF (severity-duration-frequency) curves derived for the five major river basins through parametric regional drought frequency analysis. The analysis results showed that the extremely severe drought years were 1962, 1988, 1994, and 2014 in the Han River basin, while the extreme droughts occurred in 1982 and 1988 in the Nakdong River basin, 1994 in the Geum River basin, 1988 and 1994 in the Youngsan River basin, and 1988, 1994, 1995, and 2000 in the Seomjin River basin. The extremely severe drought years at the national level in the Korean Peninsula were 1988 and 1994. The most damaging droughts were in 1981–1982 and 1994–1995, which lasted for longer than two years. The return period of the most severe drought in each river basin turned out to be at a frequency of 50–100 years.
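As a minimal sketch of the SPI computation underlying this analysis, the Python snippet below fits a gamma distribution to an accumulated precipitation series and maps its cumulative probability to the standard normal; the sample series is synthetic, and the run-theory and SDF steps of the paper are not included.

```python
import numpy as np
from scipy import stats

def spi(precip_totals):
    """Standardized Precipitation Index: fit a gamma distribution to the
    accumulated precipitation series and map its CDF to the standard normal."""
    precip_totals = np.asarray(precip_totals, dtype=float)
    nonzero = precip_totals[precip_totals > 0]
    # Fit a two-parameter gamma distribution (location fixed at 0) to non-zero totals.
    shape, _, scale = stats.gamma.fit(nonzero, floc=0)
    prob_zero = 1.0 - len(nonzero) / len(precip_totals)
    cdf = prob_zero + (1 - prob_zero) * stats.gamma.cdf(precip_totals, shape, loc=0, scale=scale)
    return stats.norm.ppf(cdf)  # SPI values; values below about -2 are often read as extreme drought

# Illustrative 3-month precipitation totals (mm); placeholders, not the Korean basin data.
rng = np.random.default_rng(0)
sample_totals = rng.gamma(shape=2.0, scale=80.0, size=120)
print(spi(sample_totals)[:10])
```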

Keywords: drought magnitude, regional frequency analysis, SPI, SDF(severity-duration-frequency) curve

Procedia PDF Downloads 406
25642 Exploring the Difficulties of Acceleration Concept from the Perspective of Historical Textual Analysis

Authors: Yun-Ju Chiu, Feng-Yi Chen

Abstract:

Kinematics is the starting point for learning mechanics in a physics course. The concept of acceleration plays an important role in learning kinematics. Teachers usually teach the concept through the formulas and graphs of kinematics and the well-known law F = ma. However, over the past few decades, many researchers have revealed numerous student difficulties in learning acceleration. One of these difficulties is that students frequently confuse acceleration with velocity and force. Why is the concept of acceleration so difficult to learn? The aim of this study is to understand the conceptual evolution of acceleration through historical textual analysis. Text analysis and one-to-one interviews with high school students and teachers are used in this study. This study finds that the history of science constructed from textbooks is usually quite different from the real historical evolution. For example, most teachers and students believe that the best-known law F = ma was written down by Newton. Yet the expression of the second law is not F = ma in Newton’s best-known book, the Principia, of 1687. Even more than one hundred years later, a famous Cambridge textbook titled An Elementary Treatise on Mechanics by Whewell of Trinity College did not express this law as F = ma. In Whewell’s time, early to mid-nineteenth-century Britain, the concept of acceleration was not only ambiguous but also confused with the concept of force. The process of learning the concept of acceleration is analogous to its conceptual development in history. This study, from the perspective of historical textual analysis, will promote the understanding of concept learning difficulties, the development of professional physics teaching, and the improvement of the content of physics textbooks.

Keywords: acceleration, textbooks, mechanics, misconception, history of science

Procedia PDF Downloads 252
25641 Reservoir Characterization using Comparative Petrophysical Testing Approach Acquired with Facies Architecture Properties Analysis

Authors: Axel Priambodo, Dwiharso Nugroho

Abstract:

Studies were conducted to map the reservoir properties based on facies architecture in order to determine the distribution of the petrophysical properties and calculate the hydrocarbon reserves in the study interval. The facies architecture analysis begins with a stratigraphic correlation indicating that the area is divided into different system tracts. Analysis of the distribution patterns, together with core analysis compiled with the facies architecture model, shows that three estuarine facies appear. Formation evaluation begins with the shale volume calculation using the Asquith-Krygowski and Volan Triangle methods, and proceeds to the calculation of total and effective porosity using the Bateman-Konen and Volan Triangle methods. After obtaining the porosity values, the calculation continues with the determination of effective and non-effective water saturation by including the water resistivity and clay resistivity parameters. The results of the research show that the facies architecture of the field is divided into three main facies, which are Estuarine Channel, Estuarine Sand Bar, and Tidal Flat. The petrophysical analysis, carried out by comparing different methods, also shows that the Volan Triangle method does not give a better shale volume result than the Gamma Ray method, but, on the other hand, the Volan Triangle method is better at calculating porosity than the Bateman-Konen method. The effective porosity distributions are affected by the distribution of the facies: the Estuarine Sand Bar facies has low porosity, and the Estuarine Channel facies has higher porosity. The effective water saturation is controlled by structure, with water saturation in the closure zone lower than in the area beneath it, caused by hydrocarbon accumulation in the closure zone.

Keywords: petrophysics, geology, petroleum, reservoir

Procedia PDF Downloads 330
25640 Executive Function in Youth With ADHD and ASD: A Systematic Review and Meta-analysis

Authors: Parker Townes, Prabdeep Panesar, Chunlin Liu, Soo Youn Lee, Dan Devoe, Paul D. Arnold, Jennifer Crosbie, Russell Schachar

Abstract:

Attention-deficit hyperactivity disorder (ADHD) and autism spectrum disorder (ASD) are impairing childhood neurodevelopmental disorders with problems in executive functions. Executive functions are higher-level mental processes essential for daily functioning and goal attainment. There is genetic and neural overlap between ADHD and ASD. The aim of this meta-analysis was to evaluate if pediatric ASD and ADHD have distinct executive function profiles. This review was completed following Cochrane guidelines. Fifty-eight articles were identified through database searching, followed by a blinded screening in duplicate. A meta-analysis was performed for all task performance metrics evaluated by at least two articles. Forty-five metrics from 24 individual tasks underwent analysis. No differences were found between youth with ASD and ADHD in any domain under direct comparison. However, individuals with ASD and ADHD exhibited deficient attention, flexibility, visuospatial abilities, working memory, processing speed, and response inhibition compared to controls. No deficits in planning were noted in either disorder. Only 11 studies included a group with comorbid ASD+ADHD, making it difficult to determine whether common executive function deficits are a function of comorbidity. Further research is needed to determine if comorbidity accounts for the apparent commonality in executive function between ASD and ADHD.

Keywords: autism spectrum disorder, ADHD, neurocognition, executive function, youth

Procedia PDF Downloads 76
25639 Determination of Full Energy Peak Efficiency and Resolution of a NaI(Tl) Detector Using Gamma-Ray Spectroscopy

Authors: Jibon Sharma, Alakjyoti Patowary, Moirangthem Nara Singh

Abstract:

In experimental research, it is essential to establish quality control of the system used for the experiment. The NaI(Tl) scintillation detector is the detector most commonly used in radiation and medical physics for measuring the gamma-ray activity of various samples. In addition, the scintillation detector has many applications in the elemental analysis of various compounds and alloys using activation analysis. In each application involving quantitative analysis, it is essential to know the detection efficiency and resolution for different gamma energies. In this work, the energy dependence of the efficiency and resolution of a NaI(Tl) detector is investigated using gamma-ray spectroscopy. Photon energies of 356.01 keV, 511 keV, 661.60 keV, 1170 keV, 1274.53 keV and 1330 keV are obtained from the four radioactive sources (133Ba, 22Na, 137Cs and 60Co) used in these studies. The full energy peak efficiencies for these gamma energies are found to be 58.46%, 10.15%, 14.39%, 1.4%, 3.27% and 1.31%, respectively. The percent resolutions for the above gamma-ray energies are found to be 11.27%, 7.27%, 6.38%, 5.17%, 4.86% and 4.74%, respectively. It was found that the efficiency of the detector decreases exponentially with energy and that the percent resolution of the detector decreases with increasing gamma-ray energy.
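The two quantities reported above can be expressed compactly as: full energy peak efficiency = net peak counts / (source activity × emission probability × live time), and percent resolution = 100 × FWHM / peak energy. The Python sketch below evaluates both for the 137Cs 661.6 keV line; the peak counts, activity and FWHM are hypothetical values chosen only to reproduce numbers of the same order as those reported (≈14.4% efficiency, ≈6.38% resolution).

```python
# A minimal sketch of the two quantities reported in the abstract, assuming
# hypothetical peak data for the 137Cs 661.6 keV line (all numbers illustrative).
def full_energy_peak_efficiency(net_peak_counts, activity_bq, branching_ratio, live_time_s):
    """Fraction of emitted gamma rays recorded in the full-energy peak."""
    emitted = activity_bq * branching_ratio * live_time_s
    return net_peak_counts / emitted

def percent_resolution(fwhm_kev, peak_energy_kev):
    """Detector energy resolution expressed as FWHM / E x 100."""
    return 100.0 * fwhm_kev / peak_energy_kev

eff = full_energy_peak_efficiency(net_peak_counts=2.72e5,
                                  activity_bq=3.7e3,
                                  branching_ratio=0.851,  # 137Cs 661.6 keV emission probability
                                  live_time_s=600)
res = percent_resolution(fwhm_kev=42.2, peak_energy_kev=661.6)
print(f"efficiency = {eff*100:.2f} %, resolution = {res:.2f} %")
```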

Keywords: NaI(Tl) gamma-ray spectrometer, resolution, full energy peak efficiency, radioactive sources

Procedia PDF Downloads 104
25638 The Prediction of Sound Absorbing Coefficient for Multi-Layer Non-Woven

Authors: Un-Hwan Park, Jun-Hyeok Heo, In-Sung Lee, Tae-Hyeon Oh, Dae-Gyu Park

Abstract:

Automotive interior materials consisting of several material layers have a sound-absorbing function. It is difficult to predict the sound absorption coefficient because of the multiple material layers, so much experimental tuning is required to achieve the sound absorption target. Therefore, considerable time and money are spent while car interior materials are developed. In this study, we present a method for predicting the sound-absorbing performance of a multi-layer material using the physical properties of each material. The properties are predicted by Foam-X software using sound absorption coefficient data measured with an impedance tube. We then compare and analyze the predicted sound absorption coefficient with the data measured for a prototype in a scaled reverberation chamber and with impedance tubes. If this method is used instead of experimental tuning in the development of car interior materials, time and money can be saved and the development effort reduced, because the material can be optimized by simulation.

Keywords: multi-layer nonwoven, sound absorption coefficient, scaled reverberation chamber, impedance tubes

Procedia PDF Downloads 376
25637 Modelling Suspended Solids Transport in Dammam (Saudi Arabia) Coastal Areas

Authors: Hussam Alrabaiah

Abstract:

Some new projects (a proposed harbor and recreational projects) are being considered on the eastern coast of Dammam city, Saudi Arabia. Dredging operations would significantly alter coastal hydrological and sediment transport processes. It is important that the project areas keep flushing fresh seawater in and out with good water quality parameters; these areas currently face increased pressure from urbanization and navigation requirements in conjunction with industrial developments. Suspended solids or sediments are expected to affect the flora and fauna in that area. Governing advection-diffusion equations are considered to understand the consequences of such projects. A numerical modeling study was developed to study the effect of dredging and, in particular, the changes in suspended sediment concentrations (mg/L) in the region. The results were obtained using the finite element method with in-house or commercial software. The results show some consistency with data observed in that region. Recommendations based on the results could be formulated for decision makers to protect the environment in the long term.

Keywords: finite element method, suspended solids transport, advection-diffusion

Procedia PDF Downloads 284
25636 A Study of Using Different Printed Circuit Board Design Methods on Ethernet Signals

Authors: Bahattin Kanal, Nursel Akçam

Abstract:

Data transmission size and frequency requirements are increasing rapidly in electronic communication protocols. Increasing data transmission speeds have made the design of printed circuit boards much more important, and it is important to carefully examine the requirements and perform analyses before and after the design of a digital electronic circuit board. This paper delves into impedance matching techniques, signal trace routing considerations, and the impact of layer stacking on signal performance. It extensively explores techniques for minimizing crosstalk and interference, presenting a holistic perspective on design strategies to optimize the quality of high-speed signals. Through a comprehensive review of these design methodologies, this study aims to provide insights into achieving reliable and high-performance printed circuit board layouts for these signals. In this study, the effect of different design methods on Ethernet signals was examined in terms of S-parameters. The Siemens HyperLynx software tool was used for the analyses.

Keywords: HyperLynx, printed circuit board, s parameters, ethernet

Procedia PDF Downloads 34
25635 Analysis of Flux-Linkage Performance of DFIG by Using Simulink under Different Types of Faults and Locations

Authors: Mohamed Moustafa Mahmoud Sedky

Abstract:

The double-fed induction generator wind turbine has recently received great attention. The steady-state performance and response of the double-fed induction generator (DFIG) based wind turbine are now well understood. This paper presents an analysis of the operation of the stator and rotor flux-linkage dq models of the DFIG under different fault types and at different fault locations.

Keywords: double fed induction generator, wind energy, flux linkage, short circuit

Procedia PDF Downloads 588
25634 Nonlinear Multivariable Analysis of CO2 Emissions in China

Authors: Hsiao-Tien Pao, Yi-Ying Li, Hsin-Chia Fu

Abstract:

This paper addresses the impacts of energy consumption, economic growth, financial development, and population size on environmental degradation using grey relational analysis (GRA) for China, where foreign direct investment (FDI) inflow is the proxy variable for financial development. The more recent historical data for the period 2004–2011 are used, because the use of very old data may not be suitable for rapidly developing countries. The results of the GRA indicate that the linkage effects of energy consumption–emissions and GDP–emissions are ranked first and second, respectively. These results reveal that energy consumption and economic growth are strongly correlated with emissions. Higher economic growth requires more energy consumption and increases environmental pollution. Likewise, more efficient energy use needs a higher level of economic development. Therefore, policies to improve energy efficiency and create a low-carbon economy can reduce emissions without hurting economic growth. The FDI–emissions linkage is ranked third. This indicates that China does not apply weak environmental regulations to attract inward FDI. Furthermore, China’s government should strengthen environmental policy when attracting inward FDI. The population–emissions linkage effect is ranked fourth, implying that population size does not directly affect CO2 emissions, even though China has the world’s largest population and Chinese people use energy-related products very economically. Overall, energy conservation, efficiency improvement, demand management, and financial development, which aim at curtailing energy waste and reducing both energy consumption and emissions without loss of the country’s competitiveness, can be adopted by developing economies. GRA is one of the best ways to build a dynamic analysis model from a small amount of data.
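For readers unfamiliar with the method, the sketch below shows the core GRA computation: normalise each factor series, take its absolute difference from the reference (emissions) series, convert to grey relational coefficients with the conventional distinguishing coefficient ρ = 0.5, and average to a grade. All series values are illustrative placeholders, not the paper's 2004–2011 data.

```python
import numpy as np

def grey_relational_grades(reference, comparatives, rho=0.5):
    """Grey relational analysis: rank how closely each comparative series
    tracks the reference series (here, CO2 emissions)."""
    ref = np.asarray(reference, dtype=float)
    comp = np.asarray(comparatives, dtype=float)
    def norm(x):  # normalise each series to [0, 1]
        return (x - x.min()) / (x.max() - x.min())
    ref_n = norm(ref)
    comp_n = np.apply_along_axis(norm, 1, comp)
    delta = np.abs(comp_n - ref_n)  # absolute difference sequences
    coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coef.mean(axis=1)        # grey relational grade per series

# Illustrative annual series for 2004-2011 (placeholders, not the paper's data):
co2        = [48, 52, 57, 61, 64, 69, 74, 79]  # reference series
energy     = [20, 22, 24, 26, 27, 29, 31, 33]
gdp        = [16, 18, 20, 23, 25, 27, 30, 33]
fdi        = [6.1, 6.5, 7.0, 7.8, 9.0, 9.5, 10.6, 11.6]
population = [13.0, 13.1, 13.1, 13.2, 13.3, 13.3, 13.4, 13.5]

grades = grey_relational_grades(co2, [energy, gdp, fdi, population])
for name, g in zip(["energy", "GDP", "FDI", "population"], grades):
    print(f"{name:10s} grade = {g:.3f}")
```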

Keywords: China, CO₂ emissions, foreign direct investment, grey relational analysis

Procedia PDF Downloads 403
25631 The Story of a Spoiled Identity: Blogging on Disability and Femininity

Authors: Anna Ślebioda

Abstract:

The paper discusses intersections between disability and femininity. Their imbrication may impede the negotiation of identity. The analysis of a blog written by a woman with disability aims to prove this hypothesis. It involves 724 entries written over a span of six years. The conceptual framework for the considerations comprises the concepts of stigma and spoiled identity and the overlapping elements of femininity and disability. The empirical part consists of content analysis, which makes it possible to locate the narrative on femininity and disability within the dimensions of the imbricated categories described in the theoretical part. The results demonstrate aspects to consider in further research on identity in women with disabilities.

Keywords: disability, femininity, spoiled identity, stigma

Procedia PDF Downloads 665
25632 Functional Instruction Set Simulator (ISS) of a Neural Network (NN) IP with Native BF-16 Generator

Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula

Abstract:

A functional model that mimics the functional correctness of a Neural Network Compute Accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we were facing was the incompatibility of gcc compilers with the BF-16 data type, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with big GEMM (General Matrix Multiplication) or SpMM (Sparse Matrix Multiplication) workloads (dense or sparse) and debugging failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex Neural Network Accelerator design by proposing a functional-model-based scoreboard, or software model, using SystemC. The proposed functional model executes the assembly code based on the ISA of the processor IP, decodes all instructions, and executes them as the DUT is expected to do. This model gives significant visibility and debug capability into the DUT, bringing up the micro-steps of execution.
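For context, BF-16 is simply the upper 16 bits of an IEEE-754 float32 (1 sign, 8 exponent, 7 mantissa bits), so a native generator can be built by rounding off the lower half of the float32 encoding. The Python sketch below illustrates this conversion; it is a generic illustration of the format, not the authors' SystemC implementation.

```python
import struct

def float32_to_bf16_bits(x: float) -> int:
    """Return the 16-bit BF-16 pattern for a float, using round-to-nearest-even
    on the lower 16 bits of the IEEE-754 float32 encoding."""
    (f32_bits,) = struct.unpack("<I", struct.pack("<f", x))
    lower = f32_bits & 0xFFFF
    upper = f32_bits >> 16
    # Round to nearest, ties to even, on the truncated half.
    round_up = (lower > 0x8000) or (lower == 0x8000 and (upper & 1))
    return (upper + 1) & 0xFFFF if round_up else upper

def bf16_bits_to_float(bits: int) -> float:
    """Expand a BF-16 pattern back to float32 by zero-padding the mantissa."""
    (f,) = struct.unpack("<f", struct.pack("<I", (bits & 0xFFFF) << 16))
    return f

x = 3.1415926
bits = float32_to_bf16_bits(x)
print(f"0x{bits:04X} -> {bf16_bits_to_float(bits)}")  # BF-16 keeps roughly 2-3 decimal digits
```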

Keywords: ISA (instruction set architecture), NN (neural network), TLM (transaction-level modeling), GEMM (general matrix multiplication)

Procedia PDF Downloads 86
25631 A Statistical Energy Analysis Model of an Automobile for the Prediction of the Internal Sound Pressure Level

Authors: El Korchi Ayoub, Cherif Raef

Abstract:

Interior noise in vehicles is an essential factor affecting occupant comfort. Over recent decades, much work has been done to develop simulation tools for vehicle NVH. In the medium-to-high frequency range, the statistical energy analysis (SEA) method shows significant effectiveness in predicting the noise and vibration responses of mechanical systems. In this paper, the evaluation of the sound pressure level (SPL) inside an automobile cabin was performed numerically using the statistical energy analysis (SEA) method. A test was performed on a car cabin using a monopole source as the sound source. The decay rate method was employed to obtain the damping loss factor (DLF) of each subsystem of the developed SEA model. These parameters were then used to predict the sound pressure level in the interior cabin. The results show satisfactory agreement with the directly measured SPL. The developed SEA vehicle model can be used in early design phases and allows the engineer to identify the sources contributing to the total noise and the transmission paths.
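A minimal sketch of the decay-rate method mentioned above is given below, using the standard relation η = DR / (27.3·f) between the measured energy decay rate DR (dB/s) and the damping loss factor in a band centred at f; the band centres and decay rates are hypothetical placeholders, not the measured cabin data.

```python
# Minimal sketch of the decay-rate method for the damping loss factor (DLF),
# using the standard relation eta = DR / (27.3 * f); the decay-rate values
# below are illustrative placeholders, not the measured cabin data.
def damping_loss_factor(decay_rate_db_per_s: float, band_centre_hz: float) -> float:
    """DLF of a subsystem from its measured energy decay rate in a frequency band."""
    return decay_rate_db_per_s / (27.3 * band_centre_hz)

bands_hz   = [400, 500, 630, 800, 1000]          # one-third-octave band centres
decay_db_s = [95.0, 110.0, 120.0, 150.0, 180.0]  # hypothetical decay rates for a panel

for f, dr in zip(bands_hz, decay_db_s):
    print(f"{f:5d} Hz  DLF = {damping_loss_factor(dr, f):.4f}")
```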

Keywords: SEA, SPL, DLF, NVH

Procedia PDF Downloads 91
25630 The Effects of Weather Events and Land Use Change on Urban Ecosystems: From Risk to Resilience

Authors: Szu-Hua Wang

Abstract:

Urban ecosystems, as complex coupled human-environment systems, contain abundant natural resources for breeding natural assets and, at the same time, attract urban assets and consume natural resources, driven by urban development. Land use change factually illustrates the interaction between human activities and the environment. However, the IPCC (2014) states that land use change and urbanization due to human activities are major causes of climate change, leading to serious impacts on urban ecosystem resilience and risk. For this reason, risk assessment and resilience analysis are the keys to responding to climate change in urban ecosystems. Urban spatial planning can guide urban development through land use planning, transportation planning, and environmental planning, and it affects land use allocation and human activities by building major constructions and protecting important national land resources at the same time. Urban spatial planning can aggravate climate change and, on the other hand, mitigate and adapt to climate change. Research on the effects of spatial planning on land use change and climate change is currently one of the most intensely studied issues. Therefore, this research focuses on developing frameworks for risk assessment and resilience analysis from the ecosystem perspective, based on typhoon precipitation in the Taipei area. An integrated method of risk assessment and resilience analysis will also be addressed for application in spatial planning practice and sustainable development.

Keywords: ecosystem, land use change, risk analysis, resilience

Procedia PDF Downloads 417
25629 Design of a Graphical User Interface for Data Preprocessing and Image Segmentation Process in 2D MRI Images

Authors: Enver Kucukkulahli, Pakize Erdogmus, Kemal Polat

Abstract:

2D image segmentation is a significant process for finding a suitable region in medical images such as MRI, PET, CT, etc. In this study, we have focused on 2D MRI images for the image segmentation process. We have designed a GUI (graphical user interface) written in MATLAB™ for 2D MRI images. In this program, there are two different interfaces, covering data pre-processing and image clustering or segmentation. In the data pre-processing section, there are a median filter, an average filter, an unsharp mask filter, a Wiener filter, and a custom filter (a filter designed by the user in MATLAB). As for image clustering, there are seven different image segmentation algorithms for 2D MR images. These image segmentation algorithms are as follows: PSO (particle swarm optimization), GA (genetic algorithm), Lloyd's algorithm, k-means, the combination of Lloyd's algorithm and k-means, mean shift clustering, and finally BBO (biogeography-based optimization). To find a suitable cluster number for 2D MRI, we have designed a histogram-based cluster estimation method and then applied the estimated numbers to the image segmentation algorithms to cluster an image automatically. We have also selected the best hybrid method for each 2D MR image using this GUI software.
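The following Python sketch illustrates the general idea of histogram-based cluster estimation followed by intensity clustering (here with k-means); it is a language-agnostic illustration of the approach, not the authors' MATLAB implementation, and the synthetic slice, bin count and prominence threshold are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks
from sklearn.cluster import KMeans

def estimate_cluster_count(image, bins=64):
    """Histogram-based cluster estimation: count prominent peaks in the
    grey-level histogram and use that count as k (a sketch of the idea)."""
    hist, _ = np.histogram(image.ravel(), bins=bins)
    peaks, _ = find_peaks(hist, prominence=0.05 * hist.max())
    return max(2, len(peaks))

def segment(image):
    """Cluster pixel intensities with k-means and return the label image."""
    k = estimate_cluster_count(image)
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(
        image.reshape(-1, 1).astype(float))
    return labels.reshape(image.shape), k

# Synthetic stand-in for a 2D MR slice (three intensity classes plus noise).
rng = np.random.default_rng(1)
slice2d = np.concatenate([rng.normal(m, 8, (64, 64)) for m in (40, 120, 200)], axis=1)
seg, k = segment(slice2d)
print("estimated clusters:", k, "label image shape:", seg.shape)
```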

Keywords: image segmentation, clustering, GUI, 2D MRI

Procedia PDF Downloads 377