Search results for: standard procedures process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20156

18746 Difficulties Encountered in the Process of Supporting Reading Skills of a Student with Hearing Loss Whose Inclusion Was Ongoing and Solution Proposals

Authors: Ezgi Tozak, H. Pelin Karasu, Umit Girgin

Abstract:

In this study, the difficulties encountered in the process of supporting the reading skills of a student with hearing loss whose inclusion was ongoing, and the solutions developed during the practice process, were examined. The study was designed as action research. The participants of this study, which was conducted between 29 September 2016 and 22 February 2017, consisted of a student with hearing loss, a classroom teacher, a teacher in the rehabilitation center, a researcher/teacher, and validity committee members. The data were obtained through observations, validity committee meetings, interviews, documents, and the researcher's diary. The research findings show that in the process of supporting the reading skills of the student with hearing loss, the student's knowledge of concepts was limited, and the student had difficulties in perceiving and identifying sounds, reading and understanding words and sentences, and retelling what he/she listened to. To overcome these difficulties during the implementation process, activities were prepared on concepts, sound education, reading and understanding words and sentences, and retelling listened content; these activities were supported with visual materials and real objects and repeated with variations.

Keywords: inclusion, reading process, supportive education, student with hearing loss

Procedia PDF Downloads 129
18745 Online Monitoring Rheological Property of Polymer Melt during Injection Molding

Authors: Chung-Chih Lin, Chien-Liang Wu

Abstract:

Detecting the state of the polymer melt during the manufacturing process is regarded as an efficient way to control the quality of the molded part in advance. Online monitoring of the rheological properties of the polymer melt during processing provides an approach to understand the melt state immediately. Rheological properties reflect the state of the polymer melt at different processing parameters and are especially important in the injection molding process. An approach that demonstrates how to calculate the rheological properties of a polymer melt through in-process measurement, using injection molding as an example, is proposed in this study. The system consists of two sensors and a data acquisition module that processes the measured data, which are used to calculate the rheological properties of the polymer melt. The rheological properties discussed in this study include shear rate and viscosity, which are investigated with respect to injection speed and melt temperature. The results show that the effect of injection speed on the rheological properties is apparent, especially at high melt temperatures, and should be considered in precision molding processes.
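The abstract does not give the working equations, but for melt flow through a circular channel the in-process calculation is commonly based on the standard capillary relations: apparent wall shear rate gamma_dot = 4Q/(pi R^3) and wall shear stress tau = dP * R / (2L), giving the apparent viscosity eta = tau / gamma_dot. A minimal sketch, with all numerical values hypothetical:

```python
import math

def apparent_shear_rate(Q_m3_s: float, R_m: float) -> float:
    """Apparent wall shear rate in a circular channel: gamma_dot = 4Q / (pi * R^3), in 1/s."""
    return 4.0 * Q_m3_s / (math.pi * R_m ** 3)

def wall_shear_stress(dP_Pa: float, R_m: float, L_m: float) -> float:
    """Wall shear stress from the pressure drop over a channel section: tau = dP * R / (2L), in Pa."""
    return dP_Pa * R_m / (2.0 * L_m)

def melt_viscosity(dP_Pa: float, Q_m3_s: float, R_m: float, L_m: float) -> float:
    """Apparent melt viscosity eta = tau / gamma_dot, in Pa*s."""
    return wall_shear_stress(dP_Pa, R_m, L_m) / apparent_shear_rate(Q_m3_s, R_m)

# Example: hypothetical readings from two pressure sensors along a runner
dP = 5.0e6           # pressure drop between the two sensors, Pa
Q = 2.0e-5           # volumetric flow rate implied by the injection speed, m^3/s
R, L = 2.0e-3, 0.05  # channel radius and sensor spacing, m

print(melt_viscosity(dP, Q, R, L))
```

Repeating this at several injection speeds and melt temperatures gives the viscosity-versus-shear-rate behaviour the study investigates.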

Keywords: injection molding, melt viscosity, shear rate, monitoring

Procedia PDF Downloads 367
18744 Dual-Rail Logic Unit in Double Pass Transistor Logic

Authors: Hamdi Belgacem, Fradi Aymen

Abstract:

In this paper, we present a low-power, low-cost differential logic unit (LU). The proposed LU receives dual-rail inputs and generates dual-rail outputs. The proposed circuit can be used in the Arithmetic and Logic Unit (ALU) of a processor. It can also be dedicated to self-checking applications based on the dual duplication code. Four logic functions, as well as their inverses, are implemented within a single logic unit. The hardware overhead for the implementation of the proposed LU is lower than that required for a standard LU implemented in a standard CMOS logic style. This new implementation is attractive, as fewer transistors are required to implement important logic functions. The proposed differential logic unit can perform 8 Boolean logical operations using only 16 transistors. Spice simulations using a 32 nm technology were used to evaluate the performance of the proposed circuit and to demonstrate its acceptable electrical behaviour.

Keywords: differential logic unit, double pass transistor logic, low power CMOS design, low cost CMOS design

Procedia PDF Downloads 435
18743 Surface Water Quality in Orchard Area, Amphawa District, Samut Songkram Province, Thailand

Authors: Sisuwan Kaseamsawat, Sivapan Choo-In

Abstract:

This study aimed to evaluate the surface water quality for agriculture and consumption in the district. The surface water quality parameters in this study included water temperature, turbidity, conductivity, salinity, pH, dissolved oxygen, BOD, nitrate, suspended solids, phosphorus, total dissolved solids, iron, copper, zinc, manganese, lead, and cadmium. Water samples were collected from a small excavation and from lychee, pomelo, and coconut orchards across three seasons from January to December 2011. The surface water quality from the small excavation and the lychee, pomelo, and coconut orchards met the Class III surface water quality standard issued under the National Environmental Quality Act (1992), except for the heavy metal concentrations, and did not differ significantly at the 0.05 level, except for dissolved oxygen. The water is suitable for consumption after the usual disinfection and water quality improvement processes, and it is suitable for agriculture.

Keywords: water quality, surface water quality, Thailand, water

Procedia PDF Downloads 338
18742 Comparison of Different Artificial Intelligence-Based Protein Secondary Structure Prediction Methods

Authors: Jamerson Felipe Pereira Lima, Jeane Cecília Bezerra de Melo

Abstract:

The difficulty and cost of obtaining protein tertiary structure information through experimental methods, such as X-ray crystallography or NMR spectroscopy, helped drive the development of computational methods for this purpose. One such approach is the prediction of the three-dimensional structure from the residue chain; however, this has been proven to be an NP-hard problem, due to the complexity of the process, as explained by the Levinthal paradox. An alternative is the prediction of intermediary structures, such as the secondary structure of the protein. Artificial intelligence methods, such as Bayesian statistics, artificial neural networks (ANN), and support vector machines (SVM), among others, have been used to predict protein secondary structure. Due to their good results, artificial neural networks have become a standard method for predicting protein secondary structure. Recently published methods that use this technique generally achieve a Q3 accuracy between 75% and 83%, whereas the theoretical accuracy limit for secondary structure prediction is 88%. Alternatively, to achieve better results, support vector machine prediction methods have been developed. The statistical evaluation of methods that use different AI techniques, such as ANNs and SVMs, is not a trivial problem, since different training sets, validation techniques, and other variables can influence the behavior of a prediction method. In this study, we propose a prediction method based on artificial neural networks, which is then compared with a selected SVM method. The chosen SVM protein secondary structure prediction method is the one proposed by Huang in his work Extracting Physicochemical Features to Predict Protein Secondary Structure (2013).
The developed ANN method uses the same training and testing process that Huang used to validate his method, which comprises the CB513 protein data set and three-fold cross-validation, so that the comparative analysis can be made by directly comparing the statistical results of each method.
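The Q3 score used to compare the methods is simply the per-residue accuracy over the three secondary-structure states. A minimal sketch of how it is computed, with made-up sequences:

```python
def q3_accuracy(predicted: str, observed: str) -> float:
    """Q3: fraction of residues assigned the correct secondary-structure state
    (H = helix, E = strand, C = coil)."""
    if len(predicted) != len(observed):
        raise ValueError("sequences must have equal length")
    matches = sum(p == o for p, o in zip(predicted, observed))
    return matches / len(observed)

# Hypothetical prediction vs. observed assignment for a 10-residue fragment
print(q3_accuracy("HHHEECCCHH", "HHHEECCCCC"))  # 0.8
```

In k-fold cross-validation, this score is averaged over the residues of each held-out fold.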

Keywords: artificial neural networks, protein secondary structure, protein structure prediction, support vector machines

Procedia PDF Downloads 600
18741 CMMI Key Process Areas and FDD Practices

Authors: Rituraj Deka, Nomi Baruah

Abstract:

The development of information technology during the past few years has resulted in the design of more and more complex software. The outsourcing of software development places higher requirements on the management of software development projects. Various software enterprises follow various paths in their pursuit of excellence, applying various principles, methods, and techniques along the way. New research is showing that CMMI and Agile methodologies can benefit from each other, with the potential to dramatically improve business performance when both are used within an organization. The paper describes a mapping between CMMI key process areas (KPAs) and the Feature-Driven Development (FDD) communication perspective, so as to increase the understanding of how improvements can be made in the software development process.

Keywords: Agile, CMMI, FDD, KPAs

Procedia PDF Downloads 441
18740 The Willingness to Pay of People in Taiwan for Flood Protection Standard of Regions

Authors: Takahiro Katayama, Hsueh-Sheng Chang

Abstract:

Global climate change has increased extreme rainfall, which has led to serious floods around the world. In recent years, urbanization and population growth have also tended to increase the number of impervious surfaces, resulting in significant loss of life and property during floods, especially in the urban areas of Taiwan. In the past, the primary governmental response to floods was structural flood control, and the only flood protection standards in use were design standards. However, these design standards for flood control facilities are generally calculated based on current hydrological conditions. In the face of future extreme events, there is a high possibility of surpassing existing design standards and causing direct and indirect damage to the public. To cope with the frequent occurrence of floods in recent years, it has been pointed out that a different standard, called the FPSR (Flood Protection Standard of Regions), is needed in Taiwan. The FPSR is mainly used for disaster reduction and to ensure that hydraulic facilities immediately drain regional floods under a specific return period. The FPSR can convey a level of flood risk that is useful for land use planning and can reflect the disaster conditions that a region can bear. However, little has been reported on the FPSR and its impact on the public in Taiwan. Hence, this study proposes a quantitative procedure to evaluate the FPSR. This study aimed to examine the FPSR of the region and the public's perceptions of and knowledge about the FPSR, as well as the public's WTP (willingness to pay) for the FPSR. The research was conducted via a literature review and a questionnaire survey. Firstly, this study reviews the domestic and international research on the FPSR and provides the theoretical framework of the FPSR.
Secondly, the CVM (Contingent Valuation Method) was employed to conduct this survey, using a double-bounded dichotomous-choice, close-ended format to elicit households' WTP for raising the protection level and thereby understand the social costs. The sample of this study consists of citizens living in Taichung City, Taiwan; 700 respondents were chosen. In the end, this research will continue working on the surveys, identifying the factors that determine WTP, and providing recommendations for flood adaptation policies in the future.
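In the double-bounded format, each respondent answers an initial bid and a follow-up bid (raised after a "yes", lowered after a "no"), so the four possible answer patterns bound the respondent's WTP in an interval. Assuming, for illustration only, that WTP is normally distributed, the outcome probabilities entering the CVM likelihood can be sketched as:

```python
from scipy.stats import norm

def outcome_prob(mu, sigma, bid1, bid2, yes1, yes2):
    """Probability of one respondent's answer pattern under WTP ~ N(mu, sigma).
    bid2 is the follow-up bid: higher than bid1 after a 'yes', lower after a 'no'."""
    F = lambda b: norm.cdf((b - mu) / sigma)
    if yes1 and yes2:
        return 1.0 - F(bid2)          # WTP above the raised bid
    if yes1 and not yes2:
        return F(bid2) - F(bid1)      # WTP between bid1 and the raised bid
    if not yes1 and yes2:
        return F(bid1) - F(bid2)      # WTP between the lowered bid and bid1
    return F(bid2)                    # WTP below the lowered bid

# Hypothetical bid design: initial 500, raised to 1000 or lowered to 250
mu, sigma = 600.0, 300.0
p = [outcome_prob(mu, sigma, 500, 1000, True, True),
     outcome_prob(mu, sigma, 500, 1000, True, False),
     outcome_prob(mu, sigma, 500, 250, False, True),
     outcome_prob(mu, sigma, 500, 250, False, False)]
print(p, sum(p))  # the four patterns exhaust the sample space, so sum(p) == 1
```

Maximum-likelihood estimates of mu and sigma are then obtained by maximizing the product of these probabilities over all respondents; the distributional form and the bid values above are assumptions, not details from the study.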

Keywords: climate change, CVM (Contingent Valuation Method), FPSR (Flood Protection Standard of Regions), urban flooding

Procedia PDF Downloads 231
18739 Interior Design Pedagogy in the 21st Century: Personalised Design Process

Authors: Roba Zakariah Shaheen

Abstract:

In the 21st century, interior design pedagogy has developed rapidly due to social and economic factors. Socially, this paper presents research findings that show a significant relationship between educators and students in interior design education. It shows that students' personal traits, design process, and thinking process are significantly interrelated. Constructively, this paper presents how personal traits can guide educators in the interior design education domain to develop students' thinking processes. At the same time, it demonstrates how students should use their own personal traits to create their own design process. Constructivism was the theory underlying this research, as it supports grounded theory, which is the methodological approach of this research. Moreover, the Myers-Briggs Type Indicator was used to investigate personality traits scientifically, as a psychological instrument related to cognitive ability. The conclusions from this research strongly recommend that educators and students utilize their personal traits to foster interior design education.

Keywords: interior design, pedagogy, constructivism, grounded theory, personality traits, creativity

Procedia PDF Downloads 190
18738 Transfer Knowledge From Multiple Source Problems to a Target Problem in Genetic Algorithm

Authors: Terence Soule, Tami Al Ghamdi

Abstract:

To study how to transfer knowledge from multiple source problems to a target problem, we modeled the Transfer Learning (TL) process using Genetic Algorithms (GA) as the model solver. TL is the process of transferring learned data from one problem to another, with the aim of helping Machine Learning (ML) algorithms find a solution. Genetic Algorithms give researchers access to the information we have about how the old problem was solved. In this paper, we have five different source problems, and we transfer the knowledge to the target problem. We studied different scenarios of the target problem. The results showed that combining knowledge from multiple source problems improves GA performance. Also, the process of combining knowledge from several problems promotes diversity in the transferred population.
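As a concrete illustration of the setup (the paper's actual problems and GA parameters are not given in the abstract), knowledge transfer can be sketched by seeding part of the target problem's initial population with the best individuals evolved on source problems:

```python
import random

def evolve(fitness, pop, generations=50, mut_rate=0.05):
    """Minimal generational GA: tournament selection, one-point crossover, bit-flip mutation."""
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < len(pop):
            p1 = max(random.sample(pop, 3), key=fitness)
            p2 = max(random.sample(pop, 3), key=fitness)
            cut = random.randrange(1, len(p1))
            child = p1[:cut] + p2[cut:]
            child = [b ^ (random.random() < mut_rate) for b in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# Hypothetical target: OneMax (count of 1-bits); sources: easier truncated variants.
target = lambda ind: sum(ind)
n, pop_size = 40, 30
random.seed(1)

# Knowledge transfer: seed the target population with the best individual
# evolved on each source problem; fill the rest with random individuals.
sources = [lambda ind, k=k: sum(ind[:k]) for k in (10, 20, 30)]
seeds = [evolve(f, [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)])
         for f in sources]
init = seeds + [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size - len(seeds))]
best = evolve(target, init)
print(target(best))
```

The baseline (no transfer) uses an all-random `init`; comparing the two runs illustrates the performance and diversity effects the study reports.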

Keywords: transfer learning, genetic algorithm, evolutionary computation, source and target

Procedia PDF Downloads 125
18737 Physicochemical Analysis of Ground Water of Selected Areas of Oji River in Enugu State, Nigeria

Authors: C. Akpagu Francis, V. Nnamani Emmanuel

Abstract:

Drinking and using polluted water from ponds, rivers, lakes, etc. for domestic activities, especially by the large population in rural areas, has been a major source of health problems for humans. A study was carried out on two different ponds in Oji River, Enugu State, Nigeria, to determine the levels of total dissolved solids (TDS), metals (lead, cadmium, iron, zinc, manganese, calcium), and biochemical oxygen demand (BOD). Water samples were collected from the two ponds at distances of 5, 10, and 15 metres from the point of entry into the ponds used to fetch water. From the results obtained, TDS (751.6 mg/L), turbidity (24 FTU), conductivity (1193 µS/cm), cadmium (0.008 mg/L), and lead (0.03 mg/L) in pond A (PA) were found to exceed the WHO standard. In pond B (PB), the results also showed that TDS (760.30 mg/L), turbidity (26 FTU), conductivity (1195 µS/cm), cadmium (0.008 mg/L), and lead (0.03 mg/L) exceeded the WHO standard, which makes the water of the two ponds very unsafe for drinking and use in other domestic activities.

Keywords: physicochemical, groundwater, Oji River, Nigeria

Procedia PDF Downloads 440
18736 Residual Life Estimation Based on Multi-Phase Nonlinear Wiener Process

Authors: Hao Chen, Bo Guo, Ping Jiang

Abstract:

Residual life (RL) estimation based on a multi-phase nonlinear Wiener process was studied in this paper, which is significant for complicated products with small samples. Firstly, a nonlinear Wiener model with random parameters was introduced, and a multi-phase nonlinear Wiener model was proposed to model the degradation process of products that are nonlinear and separated into different phases. Then, the multi-phase RL probability density function based on the presented model was derived approximately in closed form, and parameter estimation was achieved with the method of maximum likelihood estimation (MLE). Finally, the method was applied to estimate the RL of high-voltage pulse capacitors. Compared with three other models by log-likelihood function (Log-LF) and Akaike information criterion (AIC), the results show that the proposed degradation model captures the degradation process of high-voltage pulse capacitors in a better way and provides a more reliable result.
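For the single-phase building block, the linear-drift Wiener degradation model X(t) = mu*t + sigma*B(t) has independent Gaussian increments, so mu and sigma have closed-form MLEs from one observed path; the multi-phase model applies this idea per phase between change points. A sketch on simulated data (all values hypothetical):

```python
import numpy as np

def wiener_mle(t, x):
    """MLE of drift mu and diffusion sigma for a single-phase linear Wiener
    degradation path X(t) = mu*t + sigma*B(t), from one observed path.
    Increments dx_i ~ N(mu*dt_i, sigma^2*dt_i)."""
    dt, dx = np.diff(t), np.diff(x)
    mu = dx.sum() / dt.sum()
    sigma2 = np.mean((dx - mu * dt) ** 2 / dt)
    return mu, np.sqrt(sigma2)

# Simulate one degradation path with known parameters
rng = np.random.default_rng(42)
t = np.linspace(0.0, 100.0, 201)
true_mu, true_sigma = 0.5, 0.8
dt = np.diff(t)
x = np.concatenate([[0.0],
                    np.cumsum(true_mu * dt + true_sigma * np.sqrt(dt) * rng.standard_normal(dt.size))])

mu_hat, sigma_hat = wiener_mle(t, x)
# Mean residual life to a hypothetical failure threshold w, from the current level:
w = 80.0
mean_rl = (w - x[-1]) / mu_hat if x[-1] < w else 0.0
print(mu_hat, sigma_hat, mean_rl)
```

The full RL distribution is the inverse-Gaussian first-passage-time law; only its mean is sketched here.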

Keywords: multi-phase nonlinear Wiener process, residual life estimation, maximum likelihood estimation, high voltage pulse capacitor

Procedia PDF Downloads 439
18735 Labour Standards and Bilateral Migration Flows in ASEAN

Authors: Rusmawati Said, N. Kar Yee, Asmaddy Haris

Abstract:

This study employs a panel data set of ASEAN member states, 17 European Union (EU) countries, 7 American countries, and 11 other Asia-Pacific countries (Mainland China and Hong Kong SAR are treated as two separate economies) to investigate the role of labour standards in explaining the pattern of bilateral migration flows in ASEAN. Using pooled Ordinary Least Squares (OLS), this study found mixed results, which vary with the indicators used to measure the level of labour standards in the empirical analysis. On one hand, better labour standards (represented by the number of strikes and weekly average working hours) promote bilateral migration among the selected countries. On the other hand, an increase in cases of occupational injuries leads to an increase in bilateral migration, suggesting that worsening working conditions do not deter workers from moving. The findings of this study are important to policy makers, as the large flows of low-skilled workers have a significant impact on the role of labour standards in shaping migration flows.

Keywords: labour standard, migration, ASEAN, economics and financial engineering

Procedia PDF Downloads 384
18734 Assessment of Hargreaves Equation for Estimating Monthly Reference Evapotranspiration in the South of Iran

Authors: Ali Dehgan Moroozeh, B. Farhadi Bansouleh

Abstract:

Evapotranspiration is one of the most important components of the hydrological cycle. Reference evapotranspiration (ETo) is an important variable in water and energy balances on the earth's surface, and knowledge of the distribution of ET is a key factor in hydrology, climatology, agronomy, and ecology studies. Many researchers have proposed relationships, expressed as functions of climatic factors, to estimate potential evapotranspiration and thus prevent plant water stress or water loss. The FAO-Penman method (PM) has been recommended as a standard method. This method requires many data that are not available in every area of the world, so other methods should be evaluated for such conditions. When sufficient or reliable data to solve the PM equation are not available, the Hargreaves equation can be used. The Hargreaves equation (HG) requires only the daily mean, maximum, and minimum air temperature and extraterrestrial radiation. In this study, the Hargreaves method (HG) and a modified version (M.HG) were evaluated at 12 stations in the northwest region of Iran. The results of the HG and M.HG methods were compared with the results of the PM method. Statistical analysis of this comparison showed that the calibration process had a significant effect on the efficiency of the Hargreaves method.
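For reference, the Hargreaves equation takes the form ETo = 0.0023 * Ra * (Tmean + 17.8) * (Tmax - Tmin)^0.5, with Ra the extraterrestrial radiation expressed as equivalent evaporation in mm/day. A minimal sketch (the input values below are illustrative, not data from the study's stations):

```python
import math

def hargreaves_eto(t_mean: float, t_max: float, t_min: float, ra_mm_day: float) -> float:
    """Hargreaves reference evapotranspiration in mm/day.
    Temperatures in degrees C; ra_mm_day is extraterrestrial radiation
    expressed as equivalent evaporation (mm/day)."""
    return 0.0023 * ra_mm_day * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Hypothetical warm-month values
print(round(hargreaves_eto(t_mean=28.0, t_max=36.0, t_min=20.0, ra_mm_day=16.0), 2))  # 6.74
```

Calibration, as the abstract notes, typically adjusts the 0.0023 coefficient (or the exponent) to local PM-derived ETo.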

Keywords: evapotranspiration, Hargreaves equation, FAO-Penman method

Procedia PDF Downloads 383
18733 Pharmacokinetic Monitoring of Glimepiride and Ilaprazole in Rat Plasma by High Performance Liquid Chromatography with Diode Array Detection

Authors: Anil P. Dewani, Alok S. Tripathi, Anil V. Chandewar

Abstract:

The present manuscript reports the development and validation of a quantitative high performance liquid chromatography method for the pharmacokinetic evaluation of glimepiride (GLM) and ilaprazole (ILA) in rat plasma. The plasma samples were processed by solid-phase extraction (SPE). The analytes were resolved on a Phenomenex C18 column (4.6 mm × 250 mm; 5 µm particle size) using an isocratic elution mode comprising methanol:water (80:20 % v/v), with the pH of the water adjusted to 3 using formic acid. The total run time was 10 min, with 225 nm as the common detection wavelength and a flow rate of 1 mL/min throughout. The method was validated over the concentration range from 10 to 600 ng/mL for GLM and ILA in rat plasma. Metformin (MET) was used as the internal standard. Validation data demonstrated the method to be selective, sensitive, accurate, and precise. The limits of detection were 1.54 and 4.08, and the limits of quantification were 5.15 and 13.62, for GLM and ILA respectively; the method demonstrated excellent linearity with correlation coefficients (r²) of 0.999. The intra- and inter-day precision (RSD%) values were < 2.0% for both ILA and GLM. The method was successfully applied in pharmacokinetic studies following oral administration in rats.
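Detection and quantification limits in such validations are commonly derived from the calibration curve (ICH Q2 style) as LOD = k_lod * sd/slope and LOQ = k_loq * sd/slope, where sd is the residual standard deviation of the fit and k_lod is 3 or 3.3 depending on the convention (the LOQ/LOD ratios reported above are consistent with k_lod = 3). A sketch on hypothetical calibration data:

```python
import numpy as np

def lod_loq(conc, response, k_lod=3.3, k_loq=10.0):
    """LOD and LOQ from a linear calibration curve:
    LOD = k_lod * sd_res / slope, LOQ = k_loq * sd_res / slope,
    where sd_res is the residual standard deviation of the fit."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sd_res = residuals.std(ddof=2)  # two fitted parameters
    return k_lod * sd_res / slope, k_loq * sd_res / slope

# Hypothetical peak-area calibration over the validated range (ng/mL);
# these are NOT the study's raw data.
conc = np.array([10.0, 50.0, 100.0, 200.0, 400.0, 600.0])
resp = np.array([105.0, 512.0, 1010.0, 2035.0, 4050.0, 6020.0])
lod, loq = lod_loq(conc, resp)
print(lod, loq)
```

By construction the LOQ/LOD ratio equals k_loq/k_lod, which is how the convention used in a paper can be inferred from the reported values.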

Keywords: pharmacokinetics, glimepiride, ilaprazole, HPLC, SPE

Procedia PDF Downloads 348
18732 Techno-Economic Assessment of Aluminum Waste Management

Authors: Hamad Almohamadi, Abdulrahman AlKassem, Majed Alamoudi

Abstract:

Dumping aluminum (Al) waste into landfills causes several health and environmental problems. The pyrolysis process could treat Al waste to produce AlCl₃ and H₂. Using the Aspen Plus software, a techno-economic feasibility assessment was performed for Al waste pyrolysis. The Aspen Plus simulation was employed to estimate the mass and energy balance of a plant assumed to process 100 dry metric tons of Al waste per day. This study examined two cases of Al waste treatment. The first case produces 355 tons of AlCl₃ and 9 tons of H₂ per day without recycling. In case 1, the conversion rate must be greater than 50% to make a profit, and the minimum selling price (MSP) for AlCl₃ is $768/ton. The plant would generate $25 million annually if the AlCl₃ were sold at $1000 per ton. In case 2, with recycling, the conversion has less impact on the plant's profitability than in case 1. Moreover, compared to case 1, the MSP of AlCl₃ has no significant influence on process profitability. In this scenario, if AlCl₃ were sold at $1000/ton, the process profit would be $58 million annually. Case 2 is better than case 1 because recycling Al generates a higher yield than converting it to AlCl₃ and H₂.

Keywords: aluminum waste, aspen plus, process modelling, fast pyrolysis, techno-economic assessment

Procedia PDF Downloads 71
18731 The Effect of Tacit Knowledge for Intelligence Cycle

Authors: Bahadir Aydin

Abstract:

It is difficult to access accurate knowledge because of the mass of data available. This huge amount of data makes the environment more and more chaotic. Data are the main pillar of intelligence. The relationship between intelligence and knowledge is quite significant for understanding underlying truths. The data gathered from different sources can be modified, interpreted, and classified through the intelligence cycle process. This process is applied in order to progress towards wisdom as well as intelligence. Within this process, the effect of tacit knowledge is crucial. Knowledge, which is classified as explicit and tacit knowledge, is the key element for any purpose. Tacit knowledge can be seen as "the tip of the iceberg": it accounts for much more than we guess throughout the intelligence cycle. If the concept of the intelligence cycle is scrutinized, it can be seen that it involves risks and threats as well as success. The main purpose of all organizations is to be successful by eliminating risks and threats. Therefore, there is a need to connect or fuse existing information and the processes that can be used to develop it. Thanks to this process, decision-makers can be presented with a clear, holistic understanding as early as possible in the decision-making process. Shifting from the current traditional reactive approach to a proactive intelligence cycle approach would reduce extensive duplication of work in the organization. By applying a new result-oriented cycle and tacit knowledge, intelligence can be procured and utilized more effectively and in a more timely manner.

Keywords: information, intelligence cycle, knowledge, tacit knowledge

Procedia PDF Downloads 504
18730 Taguchi-Based Surface Roughness Optimization for Slotted and Tapered Cylindrical Products in Milling and Turning Operations

Authors: Vineeth G. Kuriakose, Joseph C. Chen, Ye Li

Abstract:

The research follows a systematic approach to optimize the parameters for parts machined by turning and milling processes. The quality characteristic chosen is surface roughness, since the surface finish plays an important role for parts that require surface contact. A tapered cylindrical surface is designed as a test specimen for the research. The material chosen for machining is aluminum alloy 6061 due to its wide variety of industrial and engineering applications. A HAAS VF-2 TR computer numerical control (CNC) vertical machining center is used for milling, and a HAAS ST-20 CNC machine is used for turning in this research. Taguchi analysis is used to optimize the surface roughness of the machined parts. The L9 orthogonal array is designed for four controllable factors with three different levels each, resulting in 18 experimental runs. The signal-to-noise (S/N) ratio is calculated for achieving the specific target value of 75 ± 15 µin. The controllable parameters chosen for the turning process are feed rate, depth of cut, coolant flow, and finish cut, and for the milling process they are feed rate, spindle speed, step over, and coolant flow. The uncontrollable factors are tool geometry for the turning process and tool material for the milling process. Hypothesis testing is conducted to study the significance of the different uncontrollable factors on the surface roughnesses. The optimal parameter settings were identified from the Taguchi analysis, and the process capability Cp and the process capability index Cpk were improved from 1.76 and 0.02 to 3.70 and 2.10 respectively for the turning process, and from 0.87 and 0.19 to 3.85 and 2.70 respectively for the milling process. The surface roughnesses were improved from 60.17 µin to 68.50 µin, reducing the defect rate from 52.39% to 0% for the turning process, and from 93.18 µin to 79.49 µin, reducing the defect rate from 71.23% to 0% for the milling process.
The purpose of this study is to efficiently utilize the Taguchi design analysis to improve the surface roughness.
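The two statistics driving this analysis are the nominal-the-best S/N ratio, S/N = 10*log10(mean^2 / variance), used because the target is a nominal value (75 µin) rather than smaller-the-better, and the capability indices Cp = (USL - LSL)/(6s) and Cpk = min(USL - mean, mean - LSL)/(3s) against the 60-90 µin specification. A sketch with hypothetical replicate measurements:

```python
import math

def sn_nominal_the_best(y):
    """Taguchi nominal-the-best signal-to-noise ratio: 10*log10(mean^2 / variance), in dB."""
    n = len(y)
    mean = sum(y) / n
    var = sum((v - mean) ** 2 for v in y) / (n - 1)
    return 10.0 * math.log10(mean ** 2 / var)

def cp_cpk(y, lsl, usl):
    """Process capability Cp and capability index Cpk against spec limits [lsl, usl]."""
    n = len(y)
    mean = sum(y) / n
    s = math.sqrt(sum((v - mean) ** 2 for v in y) / (n - 1))
    cp = (usl - lsl) / (6.0 * s)
    cpk = min(usl - mean, mean - lsl) / (3.0 * s)
    return cp, cpk

# Hypothetical roughness replicates (µin) against the 75 ± 15 µin specification
y = [72.0, 74.5, 76.0, 73.5, 75.5]
print(sn_nominal_the_best(y), cp_cpk(y, lsl=60.0, usl=90.0))
```

Maximizing the S/N ratio over the orthogonal-array runs selects the parameter levels; Cp/Cpk then quantify how well the optimized process holds the specification.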

Keywords: surface roughness, Taguchi parameter design, CNC turning, CNC milling

Procedia PDF Downloads 139
18729 Exploring Open Process Innovation: Insights from a Systematic Review and Framework Development

Authors: Saeed Nayeri

Abstract:

This paper explores the feasibility of openness within firms' boundaries during process innovation and identifies the key determinants of open process innovation (OPI). Through a systematic review of 78 research studies published between 2001 and 2024, the author synthesized diverse findings into a comprehensive framework detailing OPI attributes and pillars. The identified OPI attributes encompass themes such as technology intensity, significance, magnitude, and locus of exploitation, while the OPI pillars include mechanisms, partners, achievements, and antecedents. Additionally, the author critically analysed gaps in the literature, proposing future research directions that advocate for a broader methodological approach, increased emphasis on theory development and testing, and more cross-national and cross-sectoral studies to advance understanding in this field.

Keywords: open innovation, process innovation, OPI attributes, systematic literature review, organizational openness

Procedia PDF Downloads 43
18728 Analyzing the Quality of Cloud-Based E-Learning Systems on the Perception of the Learners and the Teachers

Authors: R. W. C. Devindi, S. M. Buddika Harshanath

Abstract:

E-learning is a widely used technology for learning in the modern world, and its popularity has increased considerably with the pandemic situation. E-learning systems usually require both software and hardware resources, but most educational institutions find it hard to afford them. Massive user loads also require e-learning systems to expand their server-side resources. Therefore, cloud computing has been adopted to make e-learning systems more efficient. The researcher has analyzed the quality of e-learning systems as perceived by learners and teachers with the aid of hypothesis testing, and presents the analyzed results and discussion in this report. Future research can therefore take steps to further increase the quality of online learning systems. In the case of e-learning, quality assurance and cost effectiveness are essential, and a complex quality assurance system is used in the stated project. There are no well-defined standard evaluation measures in this field; as a result, accurately assessing an e-learning system's overall quality is challenging. The researcher has carried out the analysis with the aid of standard methods and software.

Keywords: LMS (learning management system), SPSS (Statistical Package for the Social Sciences), eigenvalue, hypothesis

Procedia PDF Downloads 92
18727 D6tions: A Serious Game to Learn Software Engineering Process and Design

Authors: Hector G. Perez-Gonzalez, Miriam Vazquez-Escalante, Sandra E. Nava-Muñoz, Francisco E. Martinez-Perez, Alberto S. Nunez-Varela

Abstract:

The software engineering teaching process has been the subject of many studies. To improve this process, researchers have proposed, on the one hand, merely illustrative techniques in the classroom, such as topic presentations and dynamics between students, and on the other hand, attempts to involve students in real projects with companies and institutions to expose them to real software development problems. Simulators and serious games have been used as auxiliary tools to introduce students to topics that are too abstract when presented in the traditional way. Most of these tools cover a limited area of the huge scope of software engineering. To address this problem, we have developed D6tions, an educational serious game that simulates the software engineering process and is designed to let students experience the different stages a software engineer (playing roles such as project leader, developer, or designer) goes through while participating in a software project. We describe previous approaches to this problem, how D6tions was designed, its rules and directions, and the results we obtained from undergraduate students playing the game.

Keywords: serious games, software engineering, software engineering education, software engineering teaching process

Procedia PDF Downloads 475
18726 In-Process Integration of Resistance-Based, Fiber Sensors during the Braiding Process for Strain Monitoring of Carbon Fiber Reinforced Composite Materials

Authors: Oscar Bareiro, Johannes Sackmann, Thomas Gries

Abstract:

Carbon fiber reinforced polymer composites (CFRP) are used in a wide variety of applications due to their advantageous properties and design versatility. The braiding process enables the manufacture of components with good toughness and fatigue strength. However, the failure mechanisms of CFRPs are complex and still present challenges associated with their maintenance and repair. Within the broad scope of structural health monitoring (SHM), strain monitoring can be applied to composite materials to improve reliability, reduce maintenance costs, and safely exhaust service life. Traditional SHM systems employ, e.g., fiber optics or piezoelectrics as sensors, which are often expensive, time consuming, and complicated to implement. A cost-efficient alternative is the exploitation of the conductive properties of fiber-based sensors such as carbon, copper, or constantan (a copper-nickel alloy), which can be utilized as sensors within composite structures to achieve strain monitoring. This allows the structure to provide feedback to a user via electrical signals, which are essential for evaluating its structural condition. This work presents a strategy for the in-process integration of resistance-based sensors (Elektrisola Feindraht AG, CuNi23Mn, Ø = 0.05 mm) into textile preforms during their manufacture via the braiding process (Herzog RF-64/120) to achieve strain monitoring of braided composites. For this, flat samples of instrumented composite laminates of carbon fibers (Toho Tenax HTS40 F13 24K, 1600 tex) and epoxy resin (Epikote RIMR 426) were manufactured via vacuum-assisted resin infusion. These flat samples were later cut into test specimens, and the integrated sensors were wired to the measurement equipment (National Instruments, VB-8012) for data acquisition during the mechanical tests.
Quasi-static tensile and 3-point bending tests were performed following standard protocols (DIN EN ISO 527-1 & 4, DIN EN ISO 14132); additionally, dynamic tensile tests were executed. These tests assessed the sensor response under different loading conditions and the influence of the sensor presence on the mechanical properties of the material. Several orientations of the sensor with regard to the applied load, as well as several sensor placements inside the laminate, were tested. Strain measurements from the integrated sensors were made with a data acquisition code (LabVIEW) written for the measurement equipment and were then correlated to the strain/stress state of the tested samples. From the assessment of the integration approach, it can be concluded that it allows a seamless sensor integration into the textile preform: no damage to the sensor or negative effect on its electrical properties was detected during inspection after integration. From the mechanical tests of instrumented samples, it can be concluded that the presence of the sensors does not significantly alter the mechanical properties of the material. A good correlation was found between the resistance measurements from the integrated sensors and the applied strain, of sufficient accuracy to determine the strain state of a composite laminate based solely on the resistance measurements from the integrated sensors.
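The resistance-to-strain conversion underlying such measurements can be sketched with the standard gauge-factor relation; the gauge factor and unstrained resistance below are illustrative assumptions, not values reported in the abstract:

```python
# Minimal sketch: converting resistance readings from an embedded wire
# sensor to strain via the gauge-factor relation dR/R0 = GF * epsilon.
# GF = 2.0 and R0 = 100 ohm are illustrative values, not from the study.

def strain_from_resistance(r_measured, r_unstrained, gauge_factor=2.0):
    """Estimate strain: eps = (R - R0) / (R0 * GF)."""
    return (r_measured - r_unstrained) / (r_unstrained * gauge_factor)

# A 100.0-ohm sensor reading 100.4 ohm under load:
eps = strain_from_resistance(100.4, 100.0)
print(f"strain = {eps:.4f}")  # strain = 0.0020
```

In practice the unstrained resistance would be taken from a reference measurement on the finished laminate before loading.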

Keywords: braiding process, in-process sensor integration, instrumented composite material, resistance-based sensor, strain monitoring

Procedia PDF Downloads 91
18725 Colonialism and Modernism in Architecture, the Case of a Blank Page Opportunity in Casablanca

Authors: Nezha Alaoui

Abstract:

The early 1950s French colonial context in Morocco provided an opportunity for architects to question the modernist established order by building dwellings for the local population. The dwellings were originally designed to encourage Muslims to adopt an urban lifestyle based on local customs. However, the inhabitants transformed their dwellings into hybrid habitations. This paper aims to demonstrate the relevance of the design process to the local colonial context by analyzing the dwellers' appropriation process and the modifications of their habitat.

Keywords: colonial heritage, appropriation process, Islamic spatial habit, housing experiment, modernist mass housing

Procedia PDF Downloads 117
18724 Science Process Skill and Interest Preschooler in Learning Early Science through Mobile Application

Authors: Seah Siok Peh, Hashimah Mohd Yunus, Nor Hashimah Hashim, Mariam Mohamad

Abstract:

A country needs a workforce of knowledgeable, skilled labourers to generate innovation and productivity and to solve problems creatively via technology. Science education experts believe that mastery of science skills helps preschoolers build knowledge of scientific concepts through constructive experiences. Science process skills are the skills used by scientists to study or investigate an issue, problem or phenomenon of science. The purpose of this study is to investigate basic science process skills and interest in learning early science through a mobile application. Specifically, the study explores six basic science process skills supported by a mobile application as a learning tool, and the descriptive design also discusses the extent to which the mobile application improves basic science process skills in young children. The participants were six preschoolers and two preschool teachers from two different classes located in Perak, Malaysia. Data collection techniques include observations, interviews and document analysis. This study will be useful in providing information and real-world evidence to policy makers, especially the Ministry of Education in Malaysia.

Keywords: science education, basic science process skill, interest, early science, mobile application

Procedia PDF Downloads 230
18723 Real-Time Radiological Monitoring of the Atmosphere Using an Autonomous Aerosol Sampler

Authors: Miroslav Hyza, Petr Rulik, Vojtech Bednar, Jan Sury

Abstract:

An early and reliable detection of an increased radioactivity level in the atmosphere is one of the key aspects of atmospheric radiological monitoring. Although the standard laboratory procedures provide detection limits as low as a few µBq/m³, their major drawback is the delayed reporting of results: typically a few days. This issue is the main objective of the HAMRAD project, which gave rise to a prototype of an autonomous monitoring device. It is based on the idea of sequential aerosol sampling using a carousel sample changer combined with a gamma-ray spectrometer. In our hardware configuration, the air is drawn through a filter positioned on the carousel so that it can be rotated into the measuring position after a preset sampling interval. Filter analysis is performed with a 50% relative efficiency HPGe detector inside 8.5 cm lead shielding. The spectrometer output signal is then analyzed using DSP electronics and Gamwin software with preset nuclide libraries and other analysis parameters. After the counting, the filter is placed into a storage bin with a capacity of 250 filters, so that the device can run autonomously for several months depending on the preset sampling frequency. The device is connected to a central server via GPRS/GSM, where the user can view monitoring data, including raw spectra and technological data describing the state of the device. All operating parameters can be remotely adjusted through a simple GUI. The flow rate is continuously adjustable up to 10 m³/h. The main challenge in the spectrum analysis is the natural background subtraction. As the detection limits are heavily influenced by the deposited activity of radon decay products and the measurement time is fixed, there must exist an optimal sample decay time (delayed spectrum acquisition). To solve this problem, we adopted a simple procedure based on sequential spectrum acquisition and an optimal partial spectral sum with respect to the detection limits for a particular radionuclide.
The prototype device proved able to detect atmospheric contamination at the level of mBq/m³ with an 8 h sampling period.
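The optimal partial spectral sum can be sketched as a search over delayed start times: later starts let the radon progeny decay out (lower background) but shorten the accumulated live time. The Currie expression and all numbers below are illustrative assumptions, not the HAMRAD implementation:

```python
import math

# Sketch: choose the partial sum of sequentially acquired spectra that
# minimises the minimum detectable activity (MDA) for a nuclide.
# Currie formula and all figures are illustrative, not from the project.

def currie_ld(background_counts):
    """Currie detection limit (counts) for a well-known background."""
    return 2.71 + 4.65 * math.sqrt(background_counts)

def best_partial_sum(bkg_per_interval, interval_s, efficiency, emission_prob):
    """Return (start_index, mda): the delayed start with the lowest MDA."""
    best = None
    n = len(bkg_per_interval)
    for start in range(n):
        background = sum(bkg_per_interval[start:])
        live_time = (n - start) * interval_s
        mda = currie_ld(background) / (efficiency * emission_prob * live_time)
        if best is None or mda < best[1]:
            best = (start, mda)
    return best

# Hourly background counts, dominated early on by radon decay products:
bkg = [5000, 2500, 1200, 600, 400, 350, 330, 320]
start, mda = best_partial_sum(bkg, 3600, 0.02, 0.85)
print(start, mda)  # skipping the first few intervals gives the lowest MDA
```

With these numbers, dropping the first three hourly intervals minimizes the MDA, which reflects the trade-off the abstract describes.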

Keywords: aerosols, atmosphere, atmospheric radioactivity monitoring, autonomous sampler

Procedia PDF Downloads 128
18722 CFD Study on the Effect of Primary Air on Combustion of Simulated MSW Process in the Fixed Bed

Authors: Rui Sun, Tamer M. Ismail, Xiaohan Ren, M. Abd El-Salam

Abstract:

Incineration of municipal solid waste (MSW) is one of the key scopes in the global clean energy strategy. A computational fluid dynamics (CFD) model was established in order to reveal the features of the combustion process in a fixed porous bed of MSW. Transport equations and process rate equations of the waste bed were set up to describe the incineration process according to the local thermal conditions and waste property characteristics. Gas-phase turbulence was modeled using the k-ε turbulence model, and the particle phase was modeled using the kinetic theory of granular flow. The heterogeneous reaction rates were determined using Arrhenius eddy dissipation and Arrhenius-diffusion reaction rates. The effects of primary air flow rate and temperature on the burning process of simulated MSW were investigated experimentally and numerically. The simulation results for the bed agree well with the experimental data. The model provides detailed information on the burning processes in the fixed bed, which is otherwise very difficult to obtain by conventional experimental techniques.
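The temperature sensitivity of the heterogeneous reactions follows the standard Arrhenius form; a minimal sketch, with an illustrative pre-exponential factor and activation energy rather than the values used in the model:

```python
import math

# Sketch: Arrhenius rate expression of the form used for heterogeneous
# bed reactions, k = A * exp(-E / (R * T)).
# A = 1e7 1/s and E = 120 kJ/mol are illustrative, not the paper's values.

R_GAS = 8.314  # universal gas constant, J/(mol*K)

def arrhenius_rate(pre_exp, activation_energy, temperature):
    """Rate constant k = A * exp(-E / (R * T))."""
    return pre_exp * math.exp(-activation_energy / (R_GAS * temperature))

# Hotter primary air raises the bed temperature and speeds up reactions:
k_cold = arrhenius_rate(1.0e7, 1.2e5, 600.0)  # 600 K bed
k_hot = arrhenius_rate(1.0e7, 1.2e5, 900.0)   # 900 K bed
print(k_hot / k_cold)  # roughly three orders of magnitude faster
```

This exponential dependence is why the primary-air temperature has such a pronounced effect on the burning rate in the bed.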

Keywords: computational fluid dynamics (CFD) model, waste incineration, municipal solid waste (MSW), fixed bed, primary air

Procedia PDF Downloads 387
18721 Depyritization of US Coal Using Iron-Oxidizing Bacteria: Batch Stirred Reactor Study

Authors: Ashish Pathak, Dong-Jin Kim, Haragobinda Srichandan, Byoung-Gon Kim

Abstract:

Microbial depyritization of coal using chemoautotrophic bacteria is gaining acceptance as an efficient and eco-friendly technique. The process uses the metabolic activity of chemoautotrophic bacteria to remove sulfur and pyrite from the coal. The aim of the present study was to investigate the potential of Acidithiobacillus ferrooxidans in removing pyritic sulfur and iron from US coal with high iron and sulfur contents. The experiment was undertaken in an 8 L bench-scale stirred tank reactor with a 1% (w/v) pulp density of coal. The reactor was operated at 35ºC, and aerobic conditions were maintained by sparging air into the reactor. It was found that by the end of the bio-depyritization process, about 90% of the pyrite and 67% of the pyritic sulfur had been removed from the coal. The results indicate that the bio-depyritization process is efficient in treating coal with high pyrite and sulfur contents.
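Removal percentages of this kind are computed from assays before and after leaching; a minimal sketch, using illustrative assay values rather than the study's raw data:

```python
# Sketch: removal efficiency as reported for bio-depyritization, computed
# from constituent assays before and after leaching.
# The assay values below are illustrative, not the study's raw data.

def removal_percent(initial, final):
    """Percentage of a constituent removed relative to the initial assay."""
    return 100.0 * (initial - final) / initial

# e.g. pyritic sulfur dropping from 3.0 wt% to 1.0 wt%:
print(round(removal_percent(3.0, 1.0)))  # 67
```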

Keywords: At. ferrooxidans, batch reactor, coal desulfurization, pyrite

Procedia PDF Downloads 257
18720 A Time-Varying and Non-Stationary Convolution Spectral Mixture Kernel for Gaussian Process

Authors: Kai Chen, Shuguang Cui, Feng Yin

Abstract:

A Gaussian process (GP) with a spectral mixture (SM) kernel demonstrates flexible non-parametric Bayesian learning ability in modeling unknown functions. In this work, a novel time-varying and non-stationary convolution spectral mixture (TN-CSM) kernel is introduced, with significantly enhanced interpretability obtained through process convolution. The SM component is decomposed into an auto-convolution of base SM components, which are parameterized to be input dependent. Performing a convolution between two base SM components then yields a non-stationary SM component with a more general expression and a clearer interpretation. The TN-CSM remains fully compatible with the stationary SM kernel in terms of kernel form and spectral basis, a property ignored or confused by previous non-stationary kernels. On synthetic and real-world datasets, experiments show the time-varying characteristics of the hyper-parameters in TN-CSM and compare the learning performance of TN-CSM with popular and representative non-stationary GP kernels.
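For reference, a minimal sketch of the stationary spectral mixture (SM) kernel that the TN-CSM construction starts from; the hyper-parameter values below are arbitrary, and the non-stationary convolution extension itself is not reproduced here:

```python
import numpy as np

# Sketch: stationary spectral mixture (SM) kernel,
# k(tau) = sum_q w_q * exp(-2 pi^2 tau^2 v_q) * cos(2 pi tau mu_q),
# where tau = x1 - x2. Hyper-parameters below are arbitrary examples.

def sm_kernel(x1, x2, weights, means, variances):
    """Evaluate the SM kernel on all pairs of 1-D inputs."""
    tau = x1[:, None] - x2[None, :]
    k = np.zeros_like(tau, dtype=float)
    for w, mu, v in zip(weights, means, variances):
        k += w * np.exp(-2 * np.pi**2 * tau**2 * v) * np.cos(2 * np.pi * tau * mu)
    return k

x = np.linspace(0, 1, 5)
K = sm_kernel(x, x, weights=[1.0, 0.5], means=[0.0, 2.0], variances=[0.1, 0.2])
print(K.shape)  # (5, 5)
```

Because the kernel depends only on tau = x1 - x2, it is stationary; the TN-CSM makes the component parameters input dependent to relax exactly this restriction.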

Keywords: Gaussian process, spectral mixture, non-stationary, convolution

Procedia PDF Downloads 177
18719 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using a SEM, to an increased reliance on offline processing to analyze and report the data. In response to this trend, TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or segment, respectively. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data such as a secondary electron or cathodoluminescence images can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage and all information can be recovered on-demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with the support of remote control for the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models. 
The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.
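As one illustration of consuming such a tabular export outside the acquisition software, the sketch below aggregates a particle-by-particle CSV; the column names and values are hypothetical, not the actual TIMA export schema:

```python
import csv
import io

# Sketch: post-processing a hypothetical particle-by-particle CSV export.
# Column names (particle_id, mineral, area_um2) are invented for this
# example; real export fields depend on the chosen report template.

EXPORT = """particle_id,mineral,area_um2
1,pyrite,120.5
2,quartz,340.0
3,pyrite,80.2
"""

def area_by_mineral(csv_text):
    """Sum the measured area per mineral phase across all particles."""
    totals = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        mineral = row["mineral"]
        totals[mineral] = totals.get(mineral, 0.0) + float(row["area_um2"])
    return totals

print(area_by_mineral(EXPORT))
```

The same pattern scales to feeding plant-performance dashboards or geometallurgical databases, which is the use case the open format is meant to enable.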

Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data

Procedia PDF Downloads 93
18718 Pakistan’s Taxation System: A Critical Appraisal

Authors: Khalid Javed, Rashid Mahmood

Abstract:

The constitution empowers the Federal Government to collect taxes on income other than agricultural income, taxes on capital value, customs, excise duties and sales taxes. The Central Board of Revenue (CBR) and its subordinate departments administer the tax system. Each of the three principal taxes has a different history and a different set of issues. For a large number of income taxpayers, the core of the business process is pre-audit and assessment by a tax official. This process gives considerable discretion to tax officials, with potential for abuse. Moreover, this process is also not tenable as the number of taxpayers increases. The report is focused on a total overhaul of the process and organization of income tax. Sales tax is recent, and its process and organization are adjusted to the needs of an expanding tax base. These are based on self-assessment and selective audit. Similarly, in customs the accent is on accelerating and broadening the changes begun in recent years. Before long, central excise will be subsumed into sales tax. During the nineties, despite many changes in the tax regime and the introduction of withholding and presumptive taxes, the Federal Government's tax-to-GDP ratio has varied narrowly around eleven percent. The tax base has grown but still remains narrow and skewed. The number of income tax filers is around one million.

Keywords: central board of revenue, GDP, sale tax, income tax

Procedia PDF Downloads 419
18717 Experimental Assessment of the Effectiveness of Judicial Instructions and of Expert Testimony in Improving Jurors’ Evaluation of Eyewitness Evidence

Authors: Alena Skalon, Jennifer L. Beaudry

Abstract:

Eyewitness misidentifications can sometimes lead to wrongful convictions of innocent people. This occurs in part because jurors tend to believe confident eyewitnesses even when the identification took place under suggestive conditions. Empirical research has demonstrated that jurors are often unaware of the factors that can influence the reliability of eyewitness identification. The most common legal safeguards designed to educate jurors about eyewitness evidence are judicial instructions and expert testimony. To date, very few studies have assessed the effectiveness of judicial instructions, and most of them found that judicial instructions make jurors more skeptical of eyewitness evidence or have no effect on jurors' judgments. Similar results were obtained for expert testimony. However, none of the previous studies focused on the ability of legal safeguards to improve jurors' assessment of evidence obtained from suggestive identification procedures; this is one of the gaps addressed by this paper. Furthermore, only three studies investigated whether legal safeguards improve the ultimate accuracy of jurors' judgments, that is, whether after listening to judicial instructions or expert testimony jurors can differentiate between accurate and inaccurate eyewitnesses. This presentation includes two studies. Both studies used genuine eyewitnesses (i.e., eyewitnesses who watched the crime) and manipulated the suggestiveness of the identification procedures. The first study manipulated the presence of judicial instructions; the second manipulated the presence of one of two types of expert testimony: traditional verbal expert testimony or expert testimony accompanied by visual aids. All participants watched a video recording of an identification procedure and of an eyewitness testimony. The results indicated that neither judicial instructions nor expert testimony affected jurors' judgments.
However, consistent with the previous findings, when the identification procedure was non-suggestive, jurors believed accurate eyewitnesses more often than inaccurate eyewitnesses. When the procedure was suggestive, jurors believed accurate and inaccurate eyewitnesses at the same rate. The paper will discuss the implications of these studies and directions for future research.

Keywords: expert testimony, eyewitness evidence, judicial instructions, jurors’ decision making, legal safeguards

Procedia PDF Downloads 164