Search results for: specific nitrification rate
3172 Estimation of Bio-Kinetic Coefficients for Treatment of Brewery Wastewater
Authors: Abimbola M. Enitan, Josiah Adeyemo
Abstract:
Anaerobic modeling is a useful tool to describe and simulate the condition and behaviour of anaerobic treatment units for better effluent quality and biogas generation. The present investigation deals with the anaerobic treatment of brewery wastewater with varying organic loads. The chemical oxygen demand (COD) and total suspended solids (TSS) of the influent and effluent of the bioreactor were determined at various retention times to generate data for kinetic coefficients. The bio-kinetic coefficients in the modified Stover–Kincannon kinetic and methane generation models were determined to study the performance of the anaerobic digestion process. At steady state, the kinetic coefficient (K), the endogenous decay coefficient (Kd), the maximum growth rate of microorganisms (μmax), the growth yield coefficient (Y), the ultimate methane yield (Bo), the maximum utilization rate constant (Umax) and the saturation constant (KB) in the models were calculated to be 0.046 g/g COD, 0.083 d⁻¹, 0.117 d⁻¹, 0.357 g/g, 0.516 L CH4/g CODadded, 18.51 g/L/day and 13.64 g/L/day, respectively. The outcome of this study will help in the simulation of the anaerobic model to predict usable methane and good effluent quality during the treatment of industrial wastewater. Thus, it will protect the environment, conserve natural resources, save time and reduce the costs incurred by industries for the discharge of untreated or partially treated wastewater. It will also contribute to a sustainable long-term clean development mechanism for the optimization of the methane produced from anaerobic degradation of waste in a closed system.
Keywords: Brewery wastewater, methane generation model, environment, anaerobic modeling.
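For reference, the modified Stover–Kincannon model commonly used to obtain Umax and KB from loading-rate data can be sketched (a standard textbook form, assumed here rather than quoted from this abstract) as:

    \frac{dS}{dt} = \frac{U_{max}\,(Q\,S_i/V)}{K_B + (Q\,S_i/V)}
    \quad\Longrightarrow\quad
    \frac{V}{Q\,(S_i - S_e)} = \frac{K_B}{U_{max}}\cdot\frac{V}{Q\,S_i} + \frac{1}{U_{max}}

where Q is the influent flow rate, V the reactor volume, and S_i and S_e the influent and effluent COD; the linearized form on the right is the one typically fitted to steady-state data to estimate Umax and KB.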
3171 Performance of BLDC Motor under Kalman Filter Sensorless Drive
Authors: Yuri Boiko, Ci Lin, Iluju Kiringa, Tet Yeap
Abstract:
The performance of a permanent magnet brushless direct current (BLDC) motor controlled by a Kalman filter based position-sensorless drive is studied in terms of its dependence on variations of the system’s parameters. The effects of the system’s parameter changes on the dynamic behavior of the state variables are verified. The closed-loop control scheme with a Kalman filter in the feedback line is simulated. Two separate data sampling modes are distinguished in analyzing the feedback output from the BLDC motor: (1) equal angular separation and (2) equal time intervals. In case (1), the data are collected at equal intervals of the rotor’s angular position θi, i.e. keeping Δθ = const. In case (2), the data collection time points ti are separated by equal sampling time intervals, Δt = const. The effects of the parameter changes on the sensorless control flow are demonstrated, in particular the reduction of instability torque ripples, switching spikes, and torque load balancing. It is specifically shown that an efficient suppression of commutation-induced instability torque ripples is achievable by selecting the sampling rate in the Kalman filter settings above a certain critical value. The computational cost of such suppression is shown to be higher for motors with lower inductance values of the windings.
Keywords: BLDC motor, Kalman filter, sensorless drive, state variables, instability torque ripples reduction, sampling rate.
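To make the estimation loop behind such a sensorless drive concrete, a minimal discrete Kalman filter predict–update sketch in Python (NumPy) is given below; the state vector, matrices and sampling interval are illustrative assumptions, not the authors' actual motor model:

    import numpy as np

    def kalman_step(x, P, z, F, H, Q, R):
        """One predict-update cycle of a linear Kalman filter."""
        # Predict: propagate state estimate and covariance
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # Update: correct with the measurement z
        S = H @ P_pred @ H.T + R             # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

    # Illustrative 2-state model (e.g., rotor angle and speed), equal-time sampling dt
    dt = 1e-4                               # sampling interval, assumption
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-speed kinematic model
    H = np.array([[1.0, 0.0]])              # only the angle-related quantity is measured
    Q = 1e-6 * np.eye(2)                    # process noise (tuning parameter)
    R = np.array([[1e-3]])                  # measurement noise (tuning parameter)
    x, P = np.zeros(2), np.eye(2)
    x, P = kalman_step(x, P, np.array([0.01]), F, H, Q, R)

Raising the sampling rate (smaller dt) gives the filter more corrections per commutation interval, which is the mechanism the abstract links to ripple suppression.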
3170 Exploring SL Writing and SL Sensitivity during Writing Tasks: Poor and Advanced Writing in a Context of Second Language Other than English
Authors: S. Figueiredo, M. Alves Martins, C. Silva, C. Simões
Abstract:
This study is part of a larger empirical research project that examines second language (SL) learners’ profiles and valid procedures to perform complete and diagnostic assessment in schools. 102 learners of Portuguese as a SL, aged 7 to 17 years and speakers of distinct home languages, were assessed in several linguistic tasks. In this article, we focus on writing performance in the specific task of narrative essay composition. The written outputs were measured using the score in six components adapted from an English SL assessment context (Alberta Education): linguistic vocabulary, grammar, syntax, strategy, socio-linguistic, and discourse. The writing processes and strategies in the Portuguese language used by different immigrant students were analysed to determine features and diversity of deficits in authentic texts produced by SL writers. Differentiated performance was based on the diversity of the following variables: grades, previous schooling, home language, instruction in the first language, and exposure to Portuguese as a Second Language. Speakers of Indo-Aryan languages showed low writing scores compared to their peers, and the type of language and respective cognitive mapping (such as Mandarin and Arabic) was the predictor, not linguistic distance. Home language instruction should also be prominently considered in further research to understand specificities of the cognitive academic profile in a Romance languages learning context. Additionally, this study also examined the teachers’ representations, which will be addressed here to understand the educational implications of second language teaching for the psychological distress of different minorities in schools of specific host countries.
Keywords: Second language, writing assessment, home language, immigrant students, Portuguese language.
3169 Analysis of Climatic Strategies in Designing the Residential Buildings in Cold Dry Climate of Tabriz Metropolis to Reduce Air Pollution in Urban Environment
Authors: Shahryar Shaghaghi G., Paria Violette Shakiba, Gholamreza Irani
Abstract:
Nowadays, the earth is confronted with the serious problem of air pollution. This problem started with the industrial revolution and has accelerated in recent years, leading the earth toward ecological and environmental disaster. One of its results is the global warming problem and the related increase in global temperature. The most important factors in air pollution, especially in urban environments, are automobiles and residential buildings, which are the biggest consumers of fossil energies; if residential buildings, as a large part of the consumers of such energies, reduce their consumption rate, air pollution will decrease. Since metropolises are the main centers of air pollution in the world, assessment and analysis of efficient strategies for decreasing air pollution in such cities can lead to desirable and suitable results and can solve the problem at least at a critical level. Tabriz is one of the most important metropolises in the northwest of Iran, where about two million people live. Because of its situation in a cold dry climate, it has a high rate of fossil energy consumption that causes air pollution in its urban environment. These two factors, being both a metropolis and in a cold dry climate, lead this article to analyze the strategies of climatic design in old districts of the city and to apply them in new districts in the future. These strategies can be used in this city and other similar cities and pave the way to reduce energy consumption and the related air pollution to save the whole world.
Keywords: Air pollution, Urban Environment, Metropolis, Residential building, Fossil energies.
3168 Transformer Life Enhancement Using Dynamic Switching of Second Harmonic Feature in IEDs
Authors: K. N. Dinesh Babu, P. K. Gargava
Abstract:
Energization of a transformer results in a sudden flow of current, which is an effect of core magnetization. This current is dominated by the presence of the second harmonic, which in turn is used to segregate fault and inrush currents, thus guaranteeing proper operation of the relay. This additional security in the relay sometimes obstructs or delays differential protection in a specific scenario, when second harmonic content is present during a genuine fault. This kind of scenario can result in isolation of the transformer by Buchholz and pressure release valve (PRV) protection, which acts only once the fault has created more damage in the transformer. Such delays have a huge impact on insulation failure, and the chances of repairing or rectifying the fault at site become very dismal. Sometimes this delay can cause fire in the transformer, and this situation wreaks havoc on a sub-station. Such occurrences have also been observed in the field, where differential relay operation was delayed by 10-15 ms by second harmonic blocking under some specific conditions. These incidences have led to the need for an alternative solution to eradicate such unwarranted delays in operation in future. The modern numerical relay, called an intelligent electronic device (IED), is embedded with advanced protection features which permit higher flexibility and better provisions for tuning of protection logic and settings. Such flexibility in transformer protection IEDs enables incorporation of alternative methods such as dynamic switching of the second harmonic feature for blocking the differential protection with additional security. The analysis and precautionary measures carried out in this case have been simulated and discussed in this paper to ensure that similar solutions can be adopted to inhibit analogous issues in future.
Keywords: Differential protection, intelligent electronic device (IED), 2nd harmonic, inrush inhibit.
3167 CO2 Emission and Cost Optimization of Reinforced Concrete Frame Designed by Performance Based Design Approach
Authors: Jin Woo Hwang, Byung Kwan Oh, Yousok Kim, Hyo Seon Park
Abstract:
As the greenhouse effect has been recognized as a serious environmental problem of the world, interest in carbon dioxide (CO2) emissions, which comprise a major part of greenhouse gas (GHG) emissions, has increased recently. Since the construction industry accounts for a relatively large portion of total CO2 emissions of the world, extensive studies on reducing CO2 emissions in the construction and operation of buildings have been carried out since the 2000s. Also, the performance based design (PBD) methodology based on nonlinear analysis has been robustly developed since the Northridge Earthquake in 1994 to assure and assess the seismic performance of buildings more exactly, because structural engineers recognized that the prescriptive code based design approach cannot address inelastic earthquake responses directly or assure the performance of a building exactly. Although CO2 emissions and the PBD approach are recent rising issues in the construction industry and structural engineering, there have been few or no studies considering these two issues simultaneously. Thus, the objective of this study is to minimize the CO2 emissions and cost of a building designed by the PBD approach in the structural design stage, considering structural materials. A 4-story and 4-span reinforced concrete building was optimally designed to minimize the CO2 emissions and cost of the building and to satisfy a specific seismic performance level (collapse prevention in the maximum considered earthquake) while satisfying prescriptive code regulations, using the non-dominated sorting genetic algorithm-II (NSGA-II). The optimized design result showed that minimized CO2 emissions and cost of the building were achieved while satisfying the specific seismic performance. Therefore, the methodology proposed in this paper can be used to reduce both the CO2 emissions and the cost of buildings designed by the PBD approach.
Keywords: CO2 emissions, performance based design, optimization, sustainable design.
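To illustrate the core of the NSGA-II machinery mentioned above, a minimal Python sketch of Pareto dominance and fast non-dominated sorting for a two-objective (CO2, cost) minimization is given; the objective values are hypothetical and the sketch is not the authors' implementation:

    def dominates(a, b):
        """True if solution a dominates b (minimization of all objectives)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def non_dominated_sort(objs):
        """Return the list of Pareto fronts (lists of indices), as in NSGA-II."""
        n = len(objs)
        S = [[] for _ in range(n)]   # solutions dominated by i
        counts = [0] * n             # number of solutions dominating i
        fronts = [[]]
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                if dominates(objs[i], objs[j]):
                    S[i].append(j)
                elif dominates(objs[j], objs[i]):
                    counts[i] += 1
            if counts[i] == 0:
                fronts[0].append(i)
        k = 0
        while fronts[k]:
            nxt = []
            for i in fronts[k]:
                for j in S[i]:
                    counts[j] -= 1
                    if counts[j] == 0:
                        nxt.append(j)
            fronts.append(nxt)
            k += 1
        return fronts[:-1]

    # Hypothetical (CO2 [t], cost [million]) values for five candidate frame designs
    designs = [(120, 3.1), (110, 3.4), (130, 2.9), (115, 3.2), (140, 3.5)]
    print(non_dominated_sort(designs))   # first front = Pareto-optimal designs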
3166 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to be able to read out data at the average maximum rate of 1.5 GB/s of the experiment. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating the source of the problem, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information necessary for a deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In this paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
Keywords: DAQ debugger, data acquisition system, FPGA, system signals, Qt framework.
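As a rough illustration of signal-based report generation (the DAQ Debugger itself is a C++/Qt tool; this is only a minimal Python analogue under assumed behaviour, with a hypothetical report directory):

    import os
    import signal
    import traceback
    from datetime import datetime

    REPORT_DIR = "/tmp/daq_reports"   # hypothetical output location

    def write_report(signum, frame):
        """On receiving a signal, dump a report with the current call stack."""
        os.makedirs(REPORT_DIR, exist_ok=True)
        name = datetime.now().strftime("report_%Y%m%d_%H%M%S.txt")
        with open(os.path.join(REPORT_DIR, name), "w") as f:
            f.write(f"Signal {signum} received at {datetime.now()}\n")
            f.write("".join(traceback.format_stack(frame)))

    # Register the handler; the monitored process keeps running normally.
    signal.signal(signal.SIGUSR1, write_report)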
3165 Response Surface Methodology Approach to Defining Ultrafiltration of Steepwater from Corn Starch Industry
Authors: Zita I. Šereš, Ljubica P. Dokić, Dragana M. Šoronja Simović, Cecilia Hodur, Zsuzsanna Laszlo, Ivana Nikolić, Nikola Maravić
Abstract:
In this work, the concentration of steepwater from the corn starch industry is monitored using an ultrafiltration membrane. The aim was to examine the conditions of ultrafiltration of steepwater by applying a membrane of 2.5 nm. The parameters varied during the course of ultrafiltration were the transmembrane pressure and flow rate, while the permeate flux and the dry matter content of permeate and retentate were the dependent parameters constantly monitored during the process. The ultrafiltration experiments were conducted on samples of steepwater obtained from the starch wet milling plant "Jabuka", Pancevo. The ultrafiltration procedure was carried out on a single-channel membrane 250 mm in length, with an inner diameter of 6.8 mm and an outer diameter of 10 mm. The membrane is made of α-Al2O3 with a TiO2 layer, obtained from GEA (Germany). The experiments were carried out at a flow rate ranging from 100 to 200 L/h and a transmembrane pressure of 1-3 bar. During the experiments of steepwater ultrafiltration, the change of permeate flux, the dry matter content of permeate and retentate, as well as the absorbance changes of the permeate and retentate, were monitored. The experimental results showed that the maximum flux reaches about 40 L/m2h. For the responses obtained after the experiments, a polynomial model of the second degree was established to evaluate and quantify the influence of the variables. The quadratic equation fits the experimental values well, with a coefficient of determination for flux of 0.96. The dry matter content of the retentate increased by about 6%, while the dry matter content of the permeate was reduced by about 35-40%. During steepwater ultrafiltration, 40% less dry matter stays in the permeate compared to the feed.
Keywords: Ultrafiltration, steepwater, starch industry, ceramic membrane.
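A minimal sketch of fitting a second-degree (quadratic) response surface for permeate flux as a function of transmembrane pressure and flow rate, using NumPy least squares; the data points below are hypothetical placeholders, not the authors' measurements:

    import numpy as np

    # Hypothetical design points: (transmembrane pressure [bar], flow rate [L/h], flux [L/m2h])
    data = np.array([
        [1.0, 100, 22.0], [1.0, 200, 27.0], [2.0, 150, 33.0],
        [3.0, 100, 35.0], [3.0, 200, 40.0], [2.0, 100, 30.0],
        [2.0, 200, 36.0], [1.0, 150, 25.0], [3.0, 150, 38.0],
    ])
    p, q, y = data[:, 0], data[:, 1], data[:, 2]

    # Full quadratic model: y = b0 + b1*p + b2*q + b3*p^2 + b4*q^2 + b5*p*q
    X = np.column_stack([np.ones_like(p), p, q, p**2, q**2, p * q])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    y_hat = X @ coef
    r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
    print("coefficients:", np.round(coef, 4), "R^2:", round(r2, 3))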
3164 Accelerating GLA with an M-Tree
Authors: Olli Luoma, Johannes Tuikkala, Olli Nevalainen
Abstract:
In this paper, we propose a novel improvement for the generalized Lloyd algorithm (GLA). Our algorithm makes use of an M-tree index built on the codebook, which makes it possible to reduce the number of distance computations when the nearest code words are searched. Our method does not impose the use of any specific distance function, but works with any metric distance, making it more general than many other fast GLA variants. Finally, we present the positive results of our performance experiments.
Keywords: Clustering, GLA, M-Tree, Vector Quantization.
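For context, a minimal NumPy sketch of one plain GLA (generalized Lloyd) iteration with brute-force nearest-code-word search is given below; the M-tree acceleration proposed in the paper would replace the exhaustive search step, and this sketch is only an assumed baseline:

    import numpy as np

    def gla_iteration(data, codebook):
        """One generalized Lloyd iteration: assign vectors, then recompute code words."""
        # Brute-force nearest code word (this is the step an M-tree index would speed up)
        dists = np.linalg.norm(data[:, None, :] - codebook[None, :, :], axis=2)
        assign = np.argmin(dists, axis=1)
        new_codebook = codebook.copy()
        for k in range(len(codebook)):
            members = data[assign == k]
            if len(members) > 0:
                new_codebook[k] = members.mean(axis=0)   # centroid update
        distortion = np.mean(np.min(dists, axis=1) ** 2)
        return new_codebook, assign, distortion

    rng = np.random.default_rng(0)
    data = rng.normal(size=(1000, 2))
    codebook = data[rng.choice(len(data), 8, replace=False)]
    for _ in range(10):
        codebook, assign, d = gla_iteration(data, codebook)
    print("final distortion:", round(float(d), 4))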
3163 Development of Affordable and Reliable Diagnostic Tools to Record Vital Parameters for Improving Health Care in Low Resources Settings
Authors: Mannan Mridha, Usama Gazay, Kosovare V. Aslani, Hugo Linder, Alice Ravizza, Carmelo de Maria
Abstract:
In most developing countries, although the vast majority of the people live in rural areas, qualified medical doctors are not available there. Health care workers and paramedics, called village doctors or informal healthcare providers, are largely responsible for rural medical care. Mishaps due to wrong diagnosis and inappropriate medication have been causing serious, preventable suffering. While innovators have created many devices, the vast majority of these technologies do not find applications that address the needs and conditions in low-resource settings. The primary motive is to address the acute lack of affordable medical technologies for poor people in low-resource settings. A low-cost smart medical device that is portable, battery operated and can be used at any point of care has been developed to detect breathing rate, electrocardiogram (ECG) and arterial pulse rate to improve the diagnosis and monitoring of patients and thus improve care and safety. This simple and easy-to-use smart medical device can be used, managed and maintained effectively and safely by any health worker with some training. In order to empower the health workers and village doctors, our device is being further developed to integrate with ICT tools like smart phones and to connect to medical experts wherever available, to manage serious health problems.
Keywords: Healthcare for low resources settings, health awareness education, improve patient care and safety, smart and affordable medical device.
3162 The Meaning and Structure of Ecological Education of Biology Specialists in Kazakhstan
Authors: E. Tazabekova, B. Amirasheva, L. Amirasheva
Abstract:
This article examines the nature and structure of the ecological education of biology specialists in Kazakhstan. It also characterizes ecological education in high school and the specific features of the training of biology specialists.
Keywords: Ecological education, environment.
3161 The Relationship between Motivation for Physical Activity and Level of Physical Activity over Time
Authors: Keyvan Molanorouzi, Selina Khoo, Tony Morris
Abstract:
In recent years, there has been a decline in physical activity (PA) among adults. Motivation has been shown to be a crucial factor in maintaining physical activity. The purpose of this study was to examine whether PA motives measured by the Physical Activity and Leisure Motivation Scale (PALMS) predicted the actual amount of PA at a later time, to provide evidence for the construct validity of the PALMS. A quantitative, cross-sectional descriptive research design was employed. The Demographic Form, PALMS, and International Physical Activity Questionnaire Short Form (IPAQ-S) questionnaires were used to assess motives for and amount of physical activity in adults on two occasions. A sample of 489 male undergraduate students aged 18 to 25 years (mean ± SD; 22.30 ± 8.13 years) took part in the study. Participants were divided into three types of activities, namely exercise, racquet sport, and team sports, and female participants only took part in one type of activity, namely team sports. After 14 weeks, all 489 undergraduate students who had filled in the initial questionnaire (Occasion 1) received the questionnaire via email (Occasion 2). Of the 489 students, 378 males emailed back the completed questionnaire. The results showed that not only were pertinent sub-scales of the PALMS positively related to the amount of physical activity, but separate regression analyses also showed the positive predictive effect of PALMS motives on the amount of physical activity for each type of physical activity among participants. This study supported the construct validity of the PALMS by showing that the motives measured by the PALMS did predict the amount of PA. This information can be used to match people with a specific sport or activity, which in turn could potentially promote longer adherence to that activity.
Keywords: Physical activity, motivation, the level of physical activity, types of physical activities.
3160 On Chvátal’s Conjecture for the Hamiltonicity of 1-Tough Graphs and Their Complements
Authors: Shin-Shin Kao, Yuan-Kang Shih, Hsun Su
Abstract:
In this paper, we show that the conjecture of Chvátal, which states that any 1-tough graph is either a Hamiltonian graph or its complement contains a specific graph denoted by F, does not hold in general. More precisely, it is true only for graphs with six or seven vertices, and is false for graphs with eight or more vertices. A theorem is derived as a correction for the conjecture.
Keywords: Complement, degree sum, Hamiltonian, tough.
3159 Integrating Geographic Information into Diabetes Disease Management
Authors: Tsu-Yun Chiu, Tsung-Hsueh Lu, Tain-Junn Cheng
Abstract:
Background: Traditional chronic disease management did not pay attention to the effects of geographic factors on compliance with the treatment regime, which resulted in geographic inequality in the outcomes of chronic disease management. This study aims to examine the geographic distribution and clustering of quality indicators of diabetes care. Method: We first extracted the address, demographic information and quality of care indicators (number of visits, complications, prescription and laboratory records) of patients with diabetes for 2014 from the medical information system of a medical center in Tainan City, Taiwan, and the patients’ addresses were transformed into district- and village-level data. We then compared the differences in geographic distribution and clustering of the quality of care indicators between districts and villages. In addition to the descriptive results, rate ratios and 95% confidence intervals (CI) were estimated for the indices of care in order to compare the quality of diabetes care among different areas. Results: A total of 23,588 patients with diabetes were extracted from the hospital data system, of whom 12,716 patients’ information and medical records were included in the following analysis. More than half of the subjects in this study were male and between 60-79 years old. Furthermore, the quality of diabetes care did indeed vary by geographical level. Through the smaller level, we could point out clustered areas more specifically. Fuguo Village (of Yongkang District) and Zhiyi Village (of Sinhua District) were found to be “hotspots” for nephropathy and cerebrovascular disease, while Wangliau Village and Erwang Village (of Yongkang District) were “coldspots” with the lowest proportion of ≥80% compliance with blood lipids examination. On the other hand, Yuping Village (in Anping District) was the area with the lowest proportion of ≥80% compliance with all laboratory examinations. Conclusion: In addition to examining the geographic distribution, calculating rate ratios and their 95% CI could also be a useful and consistent method to test the association. This information is useful for health planners, diabetes case managers and other affiliated practitioners to organize care resources for the areas most in need.
Keywords: Geocoding, chronic disease management, quality of diabetes care, rate ratio.
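As a worked illustration of the rate ratio with a 95% CI mentioned above (the standard log-scale interval; the event counts and person-time below are hypothetical, not from the study):

    import math

    def rate_ratio_ci(events_a, persontime_a, events_b, persontime_b, z=1.96):
        """Rate ratio of area A vs area B with a 95% CI computed on the log scale."""
        rr = (events_a / persontime_a) / (events_b / persontime_b)
        se_log = math.sqrt(1 / events_a + 1 / events_b)   # SE of ln(RR)
        lower = rr * math.exp(-z * se_log)
        upper = rr * math.exp(z * se_log)
        return rr, lower, upper

    # Hypothetical: 30 nephropathy cases per 500 patient-years vs 12 per 400 patient-years
    print(rate_ratio_ci(30, 500, 12, 400))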
3158 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation
Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez
Abstract:
With the growing demand for a new blend of applications, users’ dependency on the internet is increasing day by day. Mobile internet users are giving more attention to their own experience, especially in terms of communication reliability, high data rates and service stability on the move. This increase in demand is causing saturation of the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches; Carrier Aggregation (CA) is one of the newest innovations, which seems to fulfill the demands of the future spectrum, and CA is also one of the most important features of Long Term Evolution - Advanced (LTE-Advanced). For this purpose, to meet the upcoming International Mobile Telecommunication Advanced (IMT-Advanced) mobile requirements (1 Gb/s peak data rate), the CA scheme is presented by 3GPP, which would sustain a high data rate using widespread frequency bandwidth of up to 100 MHz. Technical issues such as the aggregation structure, its implementations, deployment scenarios, control signal techniques, and challenges for the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. Also, a performance evaluation in macro-cellular scenarios through a simulation approach is presented, which shows the benefits of applying CA and low-complexity multi-band schedulers in service quality and system capacity enhancement, and concludes that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for a cell radius longer than 1800 m (and a PLR threshold of 2%).
Keywords: Component carrier, carrier aggregation, LTE-Advanced, scheduling, spectrum management.
3157 Comparing Field Displacement History with Numerical Results to Estimate Geotechnical Parameters: Case Study of Arash-Esfandiar-Niayesh under Passing Tunnel, 2.5 Traffic Lane Tunnel, Tehran, Iran
Authors: A. Golshani, M. Gharizade Varnusefaderani, S. Majidian
Abstract:
Underground structures are among those structures whose design procedures involve uncertainty, due to the complexity of the surrounding soil conditions. Under-passing tunnels are also such affected structures. Despite geotechnical site investigations, many uncertainties exist in the soil properties due to unknown events. As a result, numerical analyses may yield settlements that conflict with the values recorded in the project. This paper aims to report a case study on a specific under-passing tunnel constructed by the New Austrian Tunnelling Method in Iran. The intended tunnel has an overburden of about 11.3 m, a height of 12.2 m and a width of 14.4 m, with 2.5 traffic lanes. The numerical model was developed with a 2D finite element program (PLAXIS Version 8). Comparing the displacement histories at the ground surface during the entire installation of the initial lining, the estimated surface settlement was about four times the recorded field value, which indicates that some local unknown events affected that value. Also, the displacement ratios differed greatly between the numerical and field data. Consequently, by running several numerical back analyses using laboratory and field test data, the geotechnical parameters were accurately revised to match the obtained monitoring data. Finally, it was found that the values of the soil parameters are usually conservatively underestimated by up to 40 percent by typical engineering judgment. Additionally, the discrepancy could be attributed to inappropriate constitutive models applied for the specific soil condition.
Keywords: NATM, surface displacement history, soil tests, monitoring data, numerical back-analysis, geotechnical parameters.
3156 A Neuroscience-Based Learning Technique: Framework and Application to STEM
Authors: Dante J. Dorantes-González, Aldrin Balsa-Yepes
Abstract:
Existing learning techniques such as problem-based learning, project-based learning, or case study learning focus mainly on technical details, but give no specific guidelines on the learner’s experience and emotional learning aspects such as arousal salience and valence, although emotional states are important factors affecting engagement and retention. Some approaches involving emotion in educational settings, such as social and emotional learning, lack neuroscientific rigorousness and the use of specific neurobiological mechanisms. On the other hand, neurobiology approaches lack educational applicability. And educational approaches mainly focus on cognitive aspects and disregard conditioning learning. First, the authors explain the reasons why it is hard to learn thoughtfully; then they use the method of neurobiological mapping to track the main limbic system functions, such as the reward circuit, and its relations with perception, memories, motivations, sympathetic and parasympathetic reactions, and sensations, as well as the brain cortex. The authors conclude by explaining the major finding: the mechanisms of nonconscious learning and the triggers that guarantee long-term memory potentiation. Afterward, the educational framework for practical application and the instructors’ guidelines are established. An implementation example in engineering education is given, namely, the study of tuned-mass dampers for the attenuation of earthquake oscillations in skyscrapers. This work represents an original learning technique based on nonconscious learning mechanisms to enhance long-term memories that complements existing cognitive learning methods.
Keywords: Emotion, emotion-enhanced memory, learning technique, STEM.
3155 Open Innovation Laboratory for Rapid Realization of Sensing, Smart and Sustainable Products (S3 Products) for Higher Education
Authors: J. Miranda, D. Chavarría-Barrientos, M. Ramírez-Cadena, M. E. Macías, P. Ponce, J. Noguez, R. Pérez-Rodríguez, P. K. Wright, A. Molina
Abstract:
Higher education methods need to evolve because the new generations of students learn in different ways. One way is by adopting emergent technologies and new learning methods and by promoting the maker movement. As a result, Tecnologico de Monterrey is developing Open Innovation Laboratories as an immediate response to the educational challenges of the world. This paper presents an Open Innovation Laboratory for Rapid Realization of Sensing, Smart and Sustainable Products (S3 Products). The Open Innovation Laboratory is composed of a set of specific resources that students and teachers use to provide solutions to current problems of priority sectors through the development of a new generation of products. This new generation of products considers the concepts Sensing, Smart, and Sustainable. The Open Innovation Laboratory has been implemented in different courses in the context of New Product Development (NPD) and Integrated Manufacturing Systems (IMS) at Tecnologico de Monterrey. The implementation consists of adapting this Open Innovation Laboratory within the course’s syllabus, in combination with the implementation of specific methodologies for product development, learning methods (Active Learning and Blended Learning using Massive Open Online Courses, MOOCs) and rapid product realization platforms. Using the concepts proposed, it is possible to demonstrate that students can propose innovative and sustainable products, and to demonstrate how the learning process can be improved using technological resources applied in the higher education sector. Finally, examples of innovative S3 products developed at Tecnologico de Monterrey are presented.
Keywords: Active learning, blended learning, maker movement, new product development, open innovation laboratory.
3154 Satellite Interferometric Investigations of Subsidence Events Associated with Groundwater Extraction in Sao Paulo, Brazil
Authors: B. Mendonça, D. Sandwell
Abstract:
The Metropolitan Region of Sao Paulo (MRSP) has suffered from serious water scarcity. Consequently, the most convenient solution has been building wells to extract groundwater from local aquifers. However, this requires constant vigilance to prevent over-extraction and future events that can pose a serious threat to the population, such as subsidence. Radar imaging techniques (InSAR) allow continuous investigation of such phenomena. The analysis of data in the present study consists of 23 SAR images dated from October 2007 to March 2011, obtained by the ALOS-1 spacecraft. Data processing was performed with the software GMTSAR, using the InSAR technique to create pairs of interferograms with ground displacement over different time spans. First results show a correlation between the location of 102 wells registered in 2009 and signals of ground displacement equal to or lower than -90 millimeters (mm) in the region. The longest time span interferogram obtained dates from October 2007 to March 2010. From that interferogram, it was possible to detect the average velocity of displacement in millimeters per year (mm/y) and to identify the areas in which strong signals have persisted in the MRSP. Four specific areas with subsidence signals of 28 mm/y to 40 mm/y were chosen to investigate the phenomenon: Guarulhos (Sao Paulo International Airport), the Greater Sao Paulo, Itaquera and Sao Caetano do Sul. The coverage area of the signals was between 0.6 km and 1.65 km in length. All areas are located above a sedimentary type of aquifer. Itaquera and Sao Caetano do Sul showed signals varying from 28 mm/y to 32 mm/y. On the other hand, the places most likely to be suffering from stronger subsidence are the ones in the Greater Sao Paulo and Guarulhos, right beside the International Airport of Sao Paulo. The rate of displacement observed in both regions goes from 35 mm/y to 40 mm/y. Previous investigations of water use at the International Airport highlight the risks of the excessive water extraction that was being done through 9 deep wells. Therefore, it is affirmed that subsidence events are likely to occur and to cause serious damage in the area. This study reveals a situation that has not been explored with proper importance in the city, given its social and economic consequences. Since the data were only available until 2011, the question that remains is whether the situation still persists. It can be reaffirmed, however, that there is a scenario of risk at the International Airport of Sao Paulo that needs further investigation.
Keywords: Ground subsidence, interferometric satellite aperture radar (InSAR), metropolitan region of Sao Paulo, water extraction.
3153 Optimal Image Representation for Linear Canonical Transform Multiplexing
Authors: Navdeep Goel, Salvador Gabarda
Abstract:
Digital images are widely used in computer applications. Storing or transmitting uncompressed images requires considerable storage capacity and transmission bandwidth. Image compression is a means to perform transmission or storage of visual data in the most economical way. This paper explains how images can be encoded to be transmitted in a multiplexing time-frequency domain channel. Multiplexing involves packing signals together whose representations are compact in the working domain. In order to optimize transmission resources, each 4 × 4 pixel block of the image is transformed by a suitable polynomial approximation into a minimal number of coefficients. Using fewer than 4 × 4 coefficients per block spares a significant amount of transmitted information, but some information is lost. Different approximations for image transformation have been evaluated: polynomial representation (Vandermonde matrix), least squares + gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev polynomials and singular value decomposition (SVD). Results have been compared in terms of nominal compression rate (NCR), compression ratio (CR) and peak signal-to-noise ratio (PSNR) in order to minimize the error function defined as the difference between the original pixel gray levels and the approximated polynomial output. Polynomial coefficients have later been encoded and handled to generate chirps at a target rate of about two chirps per 4 × 4 pixel block and then submitted to a transmission multiplexing operation in the time-frequency domain.
Keywords: Chirp signals, Image multiplexing, Image transformation, Linear canonical transform, Polynomial approximation.
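A minimal NumPy sketch of one of the evaluated options, a rank-limited SVD approximation of a 4 × 4 block with the resulting PSNR; the block values are hypothetical and this is not the authors' exact pipeline:

    import numpy as np

    def svd_approx(block, rank):
        """Approximate a block by keeping only the leading `rank` singular values."""
        U, s, Vt = np.linalg.svd(block, full_matrices=False)
        return (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

    def psnr(original, approx, peak=255.0):
        mse = np.mean((original - approx) ** 2)
        return float("inf") if mse == 0 else 10 * np.log10(peak**2 / mse)

    rng = np.random.default_rng(1)
    block = rng.integers(0, 256, size=(4, 4)).astype(float)  # hypothetical gray levels
    approx = svd_approx(block, rank=2)                        # keep 2 of 4 singular values
    print("PSNR [dB]:", round(psnr(block, approx), 2))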
3152 Perceptual Framework for a Modern Left-Turn Collision Warning System
Authors: E. Dabbour, S. M. Easa
Abstract:
Most of the collision warning systems currently available in the automotive market are mainly designed to warn against imminent rear-end and lane-changing collisions. No collision warning system is commercially available to warn against imminent turning collisions at intersections, especially left-turn collisions, when a driver attempts to make a left turn at either a signalized or non-signalized intersection and conflicts with the path of other approaching vehicles traveling in the opposite-direction traffic stream. One of the major factors that lead to left-turn collisions is the human error and misjudgment of the driver of the turning vehicle when perceiving the speed and acceleration of other vehicles traveling in the opposite-direction traffic stream; therefore, using a properly designed collision warning system will likely reduce, or even eliminate, this type of collision by reducing human error. This paper introduces a perceptual framework for a proposed collision warning system that can detect imminent left-turn collisions at intersections. The system utilizes a commercially available detection sensor (either a radar sensor or a laser detector) to detect approaching vehicles traveling in the opposite-direction traffic stream and calculates their speeds and acceleration rates to estimate the time-to-collision, comparing that time to the time required for the turning vehicle to clear the intersection. When calculating the time required for the turning vehicle to clear the intersection, consideration is given to the perception-reaction time of the driver of the turning vehicle, which is the time required by the driver to perceive the message given by the warning system and react to it by engaging the throttle. A regression model was developed to estimate the perception-reaction time based on the age and gender of the driver of the host vehicle. The desired acceleration rate selected by the driver of the turning vehicle when making the left-turn movement is another human factor considered by the system. Another regression model was developed to estimate the acceleration rate selected by the driver of the turning vehicle based on the driver's age and gender, as well as on the location and speed of the nearest approaching vehicle, along with the maximum acceleration rate provided by the mechanical characteristics of the turning vehicle. By comparing the time-to-collision with the time required for the turning vehicle to clear the intersection, the system displays a message to the driver of the turning vehicle when departure is safe. An application example is provided to illustrate the logic algorithm of the proposed system.
Keywords: Collision warning systems, intelligent transportation systems, vehicle safety.
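A minimal sketch of the comparison described above, time-to-collision of the nearest opposing vehicle versus the time the turning vehicle needs to clear the intersection, using constant-acceleration kinematics; all parameter values (perception-reaction time, clearance distance, acceleration) are illustrative assumptions, not the paper's regression models:

    import math

    def time_to_collision(gap_m, opposing_speed_mps, opposing_accel_mps2=0.0):
        """Time for the opposing vehicle to cover the gap (constant acceleration)."""
        if abs(opposing_accel_mps2) < 1e-9:
            return gap_m / opposing_speed_mps
        a, v, d = opposing_accel_mps2, opposing_speed_mps, gap_m
        return (-v + math.sqrt(v * v + 2 * a * d)) / a   # positive root of d = v*t + a*t^2/2

    def clearance_time(distance_m, accel_mps2, perception_reaction_s):
        """Time to clear the intersection from rest: PRT plus the acceleration phase."""
        return perception_reaction_s + math.sqrt(2 * distance_m / accel_mps2)

    ttc = time_to_collision(gap_m=60.0, opposing_speed_mps=14.0)     # ~50 km/h, 60 m away
    t_clear = clearance_time(distance_m=20.0, accel_mps2=2.0, perception_reaction_s=1.5)
    print("TTC:", round(ttc, 2), "s | clearance:", round(t_clear, 2), "s |",
          "safe to depart" if t_clear < ttc else "wait")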
3151 The Application of Specialized Memory Manager in Interactive CAD Systems
Authors: Wei Song, Lian-he Yang
Abstract:
Interactive CAD systems have to allocate and deallocate memory frequently. Frequent memory allocation and deallocation can play a significant role in degrading application performance. An application may use memory in a very specific way and pay a performance penalty for functionality it does not need. We can counter that by developing specialized memory managers.
Keywords: Interactive CAD systems, Specialized Memory Manager.
3150 Effect of Fire Retardant Painting Product on Smoke Optical Density of Burning Natural Wood Samples
Authors: Abdullah N. Olimat, Ahmad S. Awad, Faisal M. AL-Ghathian
Abstract:
Natural wood is used in many applications in Jordan such as furniture, partition construction, and cupboards. Experimental work on the smoke produced by the combustion of certain wood samples was conducted. Smoke generated from the burning of natural wood is considered a major cause of death in furniture fires. The critical parameter for life safety in fires is the available time for escape, so the visual obscuration due to smoke released during fire is taken into consideration. The effect of the smoke produced by the burning of wood depends on the amount of smoke released in case of fire. The amount of smoke production, apparently, affects the time available for the occupants to escape. To achieve the protection of the lives of building occupants during fire growth, fire retardant painting products are tested. The tested samples of natural wood include beech, ash, beech pine, and white beech pine. A smoke density chamber manufactured by Fire Testing Technology was used to measure the smoke properties. The test procedure was carried out according to ISO 5659. The wood samples, in a horizontal orientation, are exposed to a vertical radiant heat flux of 25 kW/m2 under non-flaming conditions. The main objective of the current study is to carry out experimental tests on samples of natural woods to evaluate the capability to escape in case of fire and the fire safety requirements. Specific optical density, transmittance, thermal conductivity, and mass loss are the main measured parameters. Also, comparisons between samples with paint and with no paint are carried out for the selected wood samples.
Keywords: Optical density, specific optical density, transmittance, visibility.
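For reference, the specific optical density in an ISO 5659-type smoke chamber is commonly computed from the measured light transmittance as (a standard form, assumed here rather than quoted from the paper):

    D_s = \frac{V}{A\,L}\,\log_{10}\!\left(\frac{100}{T}\right)

where V is the chamber volume, A the exposed sample area, L the optical path length of the light beam, and T the percentage light transmittance.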
3149 Universal Metadata Definition
Authors: Mohib ur Rehman, Mohammad Haseeb Anwer, Nadeem Iftikhar
Abstract:
The need for standards has always been a priority in all disciplines. Today, standards such as XML and USB are trying to create a universal interface for their respective areas. The information regarding every family in the discipline addressed must have a lot in common, known as Metadata. A lot of work has been done in specific domains such as IEEE LOM and MPEG-7, but these efforts do not appeal to the universality of creating Metadata for all entities, where we take an entity (object) as not restricted to software terms. This paper tries to address this problem of universal Metadata definition, which may lead to an increase in the precision of search.
Keywords: Metadata, Standard, Universal, XML.
3148 Cubic Splines and Fourier Series Approach to Study Temperature Variation in Dermal Layers of Elliptical Shaped Human Limbs
Authors: Mamta Agrawal, Neeru Adlakha, K.R. Pardasani
Abstract:
An attempt has been made to develop a semi-numerical model to study temperature variations in the dermal layers of human limbs. The model has been developed for the two-dimensional steady state case. The human limb has been assumed to have an elliptical cross section. The dermal region has been divided into three natural layers, namely epidermis, dermis and subdermal tissues. The model incorporates the effect of important physiological parameters like blood mass flow rate, metabolic heat generation, and thermal conductivity of the tissues. The outer surface of the limb is exposed to the environment, and it is assumed that heat loss takes place at the outer surface by conduction, convection, radiation, and evaporation. The temperature of the inner core of the limb also varies with the lower atmospheric temperature. Appropriate boundary conditions have been framed based on the physical conditions of the problem. A cubic splines approach has been employed along the radial direction and Fourier series along the angular direction to obtain the solution. The numerical results have been computed for different values of eccentricity resembling the elliptic cross section of human limbs. The numerical results have been used to obtain the temperature profile and to study the relationships among the various physiological parameters.
Keywords: Blood Mass Flow Rate, Metabolic Heat Generation, Fourier Series, Cubic splines and Thermal Conductivity.
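The governing relation behind models of this kind is usually the steady-state Pennes bioheat equation; a hedged sketch of that standard form (the exact formulation used by the authors is not quoted here) is:

    \nabla \cdot \left( K \nabla T \right) + m_b c_b \left( T_A - T \right) + S = 0

where K is the tissue thermal conductivity, m_b the blood mass flow rate, c_b the specific heat of blood, T_A the arterial (body core) temperature, T the tissue temperature, and S the metabolic heat generation.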
3147 Analysis of Supply Side Factors Affecting Bank Financing of Non-Oil Exports in Nigeria
Authors: Sama’ila Idi Ningi, Abubakar Yusuf Dutse
Abstract:
The banking sector poses a lot of problems in Nigeria in general and for the non-oil export sector in particular. The banks' lack of effectiveness in handling small, medium or long-term credit risk (lack of training of loan officers, lack of information on borrowers and absence of a reliable credit registry) results in non-oil exporters being burdened with high requirements, such as up to three years of financial statements, enough collateral to cover both the loan principal and interest (including a cash deposit that may be up to 30% of the loan's net present value), and the need to provide every detail of the international trade transaction in question. The stated problems triggered this research. Consequently, information on bank financing of non-oil exports was collected from 100 respondents from the 20 Deposit Money Banks (DMBs) in Nigeria. The data were analysed by the use of descriptive statistics, correlation and regression. It is found that Nigerian banks are participants in the financing of non-oil exports. Despite their participation, the rate of interest for credit extended to non-oil exports is usually high, ranging between 15-20%. Small and medium sized non-oil export businesses lack the credit history for banks to judge them as reputable. Banks also consider the non-oil export sector very risky for investment. The banks actually grant less credit than the exporters may require, and exporters are therefore not properly funded by banks. Banks grant a very low volume of foreign currency loans, in addition to the unfavorable exchange rate at which the Naira is exchanged for the Dollar and other currencies in the country. This makes the importation of inputs costly and negatively impacts non-oil export performance in Nigeria.
Keywords: Supply Side Factors, Bank Financing, Non-Oil Exports.
3146 Islamic Finance: What Is the Outlook for Italy?
Authors: Paolo Pietro Biancone
Abstract:
The spread of Islamic financial instruments is an opportunity to offer integration for the immigrant population and to attract, through the specific products, the richness of sovereign funds from the "Arab" countries. However, it is important to consider the possibility of comparing a traditional finance model, which in recent times has given rise to many doubts, with an "alternative" finance model, where the ethical aspect arising from religious principles is very important.
Keywords: Banks, Europe, Islamic Finance, Italy.
3145 Towards a Systematic Evaluation of Web Design
Authors: Ivayla Trifonova, Naoum Jamous, Holger Schrödl
Abstract:
A good web design is a prerequisite for a successful business nowadays, especially since the internet is the most common way for people to inform themselves. Web design includes the optical composition, the structure, and the user guidance of websites. The importance of each website leads to the question if there is a way to measure its usefulness. The aim of this paper is to suggest a methodology for the evaluation of web design. The desired outcome is to have an evaluation that is concentrated on a specific website and its target group.
Keywords: Evaluation methodology, factor analysis, target group, web design.
3144 Effect of Segregation on the Reaction Rate of Sewage Sludge Pyrolysis in a Bubbling Fluidized Bed
Authors: A. Soria-Verdugo, A. Morato-Godino, L. M. García-Gutiérrez, N. García-Hernando
Abstract:
The evolution of the pyrolysis of sewage sludge in a fixed and a fluidized bed was analyzed using a novel measuring technique. This original measuring technique consists of installing the whole reactor over a precision scale, capable of measuring the mass of the complete reactor with enough precision to detect the mass released by the sewage sludge sample during its pyrolysis. The inert conditions required for the pyrolysis process were obtained by supplying the bed with a nitrogen flow, and the bed temperature was adjusted to either 500 ºC or 600 ºC using a group of three electric resistors. The sewage sludge sample was supplied through the top of the bed in a batch of 10 g. The measurement of the mass released by the sewage sludge sample was employed to determine the evolution of the reaction rate during the pyrolysis, the total amount of volatile matter released, and the pyrolysis time. The pyrolysis tests of sewage sludge in the fluidized bed were conducted using two different bed materials of the same size but different densities: silica sand and sepiolite particles. The higher density of the silica sand particles induces a flotsam behavior of the sewage sludge particles, which move close to the bed surface. In contrast, the lower density of sepiolite produces a neutrally-buoyant behavior of the sewage sludge particles, which in this case show proper circulation throughout the whole bed. The analysis of the evolution of the pyrolysis process in both fluidized beds shows that the pyrolysis is faster when buoyancy effects are negligible, i.e. in the bed composed of sepiolite particles. Moreover, sepiolite was found to show an absorbent capability for the volatile matter released during the pyrolysis of sewage sludge.
Keywords: Bubbling fluidized bed, pyrolysis time, segregation effects, sewage sludge.
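A minimal sketch of how a reaction rate can be derived from the reactor-mass signal described above, by numerical differentiation of the mass-loss curve; the time series below is synthetic, not measured data:

    import numpy as np

    # Synthetic mass-loss record: volatile mass released [g] sampled every second
    t = np.arange(0.0, 120.0, 1.0)                    # time [s]
    m_released = 6.5 * (1.0 - np.exp(-t / 30.0))      # assumed release out of a 10 g batch

    rate = np.gradient(m_released, t)                 # reaction rate dm/dt [g/s]
    total_volatiles = m_released[-1]                  # total volatile matter released [g]
    # Define pyrolysis time as the instant when 95% of the final release is reached
    pyrolysis_time = t[np.argmax(m_released >= 0.95 * total_volatiles)]

    print("peak rate [g/s]:", round(float(rate.max()), 4),
          "| volatiles [g]:", round(float(total_volatiles), 2),
          "| pyrolysis time [s]:", float(pyrolysis_time))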
3143 A CFD Study of Turbulent Convective Heat Transfer Enhancement in Circular Pipeflow
Authors: Perumal Kumar, Rajamohan Ganesan
Abstract:
Addition of milli- or micro-sized particles to the heat transfer fluid is one of the many techniques employed for improving the heat transfer rate. Though this looks simple, this method has practical problems such as high pressure loss, clogging and erosion of the material of construction. These problems can be overcome by using nanofluids, which are dispersions of nanosized particles in a base fluid. Nanoparticles increase the thermal conductivity of the base fluid manifold, which in turn increases the heat transfer rate. Nanoparticles also increase the viscosity of the base fluid, resulting in a higher pressure drop for the nanofluid compared to the base fluid. So it is imperative that the Reynolds number (Re) and the volume fraction be optimum for better thermal hydraulic effectiveness. In this work, the heat transfer enhancement using aluminium oxide nanofluid at low and high volume fractions in turbulent pipe flow with constant wall temperature has been studied by computational fluid dynamic modeling of the nanofluid flow, adopting the single phase approach. Nanofluid, up to a volume fraction of 1%, is found to be an effective heat transfer enhancement technique. The Nusselt number (Nu) and friction factor predictions for the low volume fractions (i.e. 0.02%, 0.1% and 0.5%) agree very well with the experimental values of Sundar and Sharma (2010), while predictions for the high volume fraction nanofluids (i.e. 1%, 4% and 6%) are found to be in reasonable agreement with both experimental and numerical results available in the literature. So the computationally inexpensive single phase approach can be used for heat transfer and pressure drop prediction of new nanofluids.
Keywords: Heat transfer intensification, nanofluid, CFD, friction factor.
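In the single-phase approach mentioned above, the nanofluid is treated as a homogeneous fluid with effective properties; a minimal Python sketch using two classical closures (Maxwell for conductivity, Brinkman for viscosity) is given below as an assumed illustration, not necessarily the correlations used in the paper:

    def maxwell_conductivity(k_f, k_p, phi):
        """Maxwell model for the effective thermal conductivity of a dilute suspension."""
        return k_f * (k_p + 2 * k_f + 2 * phi * (k_p - k_f)) / (k_p + 2 * k_f - phi * (k_p - k_f))

    def brinkman_viscosity(mu_f, phi):
        """Brinkman model for the effective dynamic viscosity."""
        return mu_f / (1.0 - phi) ** 2.5

    # Water base fluid with Al2O3 nanoparticles at 1% volume fraction (illustrative values)
    k_f, k_p, mu_f, phi = 0.613, 40.0, 1.0e-3, 0.01
    print("k_eff [W/mK]:", round(maxwell_conductivity(k_f, k_p, phi), 4),
          "| mu_eff [Pa.s]:", round(brinkman_viscosity(mu_f, phi), 6))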