Search results for: input output linearization
1060 Optimum Performance of the Gas Turbine Power Plant Using Adaptive Neuro-Fuzzy Inference System and Statistical Analysis
Authors: Thamir K. Ibrahim, M. M. Rahman, Marwah Noori Mohammed
Abstract:
This study deals with the modeling and performance enhancement of a gas-turbine combined cycle power plant. Providing clean and safe energy is one of the greatest challenges in meeting the requirements of a green environment. These requirements have eroded the long-standing dominance of the steam turbine (ST) in world power generation, and the gas turbine (GT) is set to replace it. It is therefore necessary to predict the characteristics of the GT system and optimize its operating strategy by developing a simulation system. An integrated model and simulation code for evaluating the performance of gas turbine power plants were developed in MATLAB. The performance codes for heavy-duty GT and CCGT power plants were validated against the real Baiji GT and MARAFIQ CCGT plants, and the results were satisfactory. A correlation technique was applied to all types of simulation data, and its coefficient of determination (R2) was calculated as 0.9825. Several recently published correlations were also checked against the Baiji GT plant and subjected to error analysis. GT performance was judged by particular parameters selected from the simulation model, and the Adaptive Neuro-Fuzzy Inference System (ANFIS), an advanced optimization technique, was also utilized. The best thermal efficiency and power output attained were about 56% and 345 MW, respectively. Thus, the operating conditions and ambient temperature strongly influence the overall performance of the GT, and the optimum efficiency and power are found at higher turbine inlet temperatures. It can be concluded that the developed models are powerful tools for estimating the overall performance of GT plants.
Keywords: gas turbine, optimization, ANFIS, performance, operating conditions
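The abstract reports a coefficient of determination (R2) of 0.9825 between the simulation output and plant data. As a minimal illustration of how such a value is computed (this is not the authors' MATLAB code, and the power figures below are hypothetical placeholders), a short Python sketch:

```python
import numpy as np

def coefficient_of_determination(measured, simulated):
    """R^2 between plant measurements and simulated values."""
    measured = np.asarray(measured, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    ss_res = np.sum((measured - simulated) ** 2)        # residual sum of squares
    ss_tot = np.sum((measured - measured.mean()) ** 2)  # total sum of squares
    return 1.0 - ss_res / ss_tot

# Hypothetical example: measured vs simulated GT power output (MW)
measured_power = [310.0, 322.5, 330.1, 338.4, 345.0]
simulated_power = [308.2, 324.0, 329.5, 340.1, 343.8]
print(f"R^2 = {coefficient_of_determination(measured_power, simulated_power):.4f}")
```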
Procedia PDF Downloads 425
1059 Design of Demand Pacemaker Using an Embedded Controller
Authors: C. Bala Prashanth Reddy, B. Abhinay, C. Sreekar, D. V. Shobhana Priscilla
Abstract:
The project aims at designing an emergency pacemaker capable of delivering shocks to a human heart that has suddenly stopped working. A pacemaker is a machine commonly used by cardiologists in order to shock a human heart back into operation. The heart works through small cells, called pacemakers, that send electrical pulses to the cardiac muscles and tell the heart when to pump blood; when these electrical pulses stop, the heart stops beating. When this happens, a pacemaker is used to shock the heart muscles and the pacemaker cells back into action. This is achieved by rubbing the two panels of the pacemaker together to create an adequate electrical current, after which the heart returns to its normal state. The project also aims at designing a system capable of continuously displaying a person's heart beat and blood pressure on an LCD. The concerned doctor receives the heart beat and blood pressure details continuously through a GSM modem in the form of SMS alerts. In case of an abnormal condition, the doctor sends a message specifying the amount of electric shock needed, and the microcontroller automatically provides the input to the pacemaker, which in turn delivers the shock to the patient. The heart beat monitoring and display system is portable and a good replacement for the old-model stethoscope, which is less efficient. With a stethoscope, the heart beat rate is calculated manually and the probability of error is high, because the heart beat rate lies in the range of 70 to 90 beats per minute and each beat lasts less than 1 second, so this device can be considered a very good alternative to a stethoscope.
Keywords: missing R wave, PWM, demand pacemaker, heart
Procedia PDF Downloads 482
1058 The Confiscation of Ill-Gotten Gains in Pollution: The Taiwan Experience and the Interaction between Economic Analysis of Law and Environmental Economics Perspectives
Authors: Chiang-Lead Woo
Abstract:
In reply to serious environmental problems, the Taiwan government quickly adjusted some articles to suit the needs of environmental protection recently, such as the amendment to article 190-1 of the Taiwan Criminal Code. The transfer of legislation comes as an improvement which canceled the limitation of ‘endangering public safety’. At the same time, the article 190-1 goes from accumulative concrete offense to abstract crime of danger. Thus, the public looks forward to whether environmental crime following the imposition of fines or penalties works efficiently in anti-pollution by the deterrent effects. However, according to the addition to article 38-2 of the Taiwan Criminal Code, the confiscation system seems controversial legislation to restrain ill-gotten gains. Most prior studies focused on comparisons with the Administrative Penalty Law and the Criminal Code in environmental issue in Taiwan; recently, more and more studies emphasize calculations on ill-gotten gains. Hence, this paper try to examine the deterrent effect in environmental crime by economic analysis of law and environmental economics perspective. This analysis shows that only if there is an extremely high probability (equal to 100 percent) of an environmental crime case being prosecuted criminally by Taiwan Environmental Protection Agency, the deterrent effects will work. Therefore, this paper suggests deliberating the confiscation system from supplementing the System of Environmental and Economic Accounting, reasonable deterrent fines, input management, real-time system for detection of pollution, and whistleblower system, environmental education, and modernization of law.Keywords: confiscation, ecosystem services, environmental crime, ill-gotten gains, the deterrent effect, the system of environmental and economic accounting
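The deterrence argument in the abstract follows the standard economic analysis of law: an offence is deterred only when the expected sanction (probability of prosecution times the confiscated or fined amount) is at least the ill-gotten gain. A minimal sketch of that comparison, with hypothetical figures that are not taken from the paper:

```python
def is_deterred(illegal_gain, prosecution_probability, sanction):
    """Deterrence condition from the economic analysis of law:
    the expected sanction must be at least the ill-gotten gain."""
    expected_sanction = prosecution_probability * sanction
    return expected_sanction >= illegal_gain

# Hypothetical polluter: saves 1,000,000 by illegal dumping and faces confiscation of the same amount.
gain = 1_000_000
sanction = 1_000_000  # confiscation capped at the ill-gotten gain
for p in (0.3, 0.7, 1.0):
    print(f"prosecution probability {p:.0%}: deterred = {is_deterred(gain, p, sanction)}")
```

This mirrors the abstract's conclusion that, when confiscation is limited to the ill-gotten gain, deterrence only holds at a prosecution probability of essentially 100 percent.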
Procedia PDF Downloads 170
1057 Urban Neighborhood Center Location Evaluating Method Based On UNA the GIS Spatial Analysis Tools: Kerman's Neighborhood in Tehran Case
Authors: Sepideh Jabbari Behnam, Shadabeh Gashtasbi Iraei, Elnaz Mohsenin, MohammadAli Aghajani
Abstract:
Urban neighborhoods, as important urban forming cells, play a key role in creating urban texture and integrated form. Nowadays, most of neighborhood divisions are based on urban management systems but without considering social issues and the other aspects of urban life. This can cause problems such as providing inappropriate services for city dwellers, the loss of local identity and etc. In this regard for regenerating of such neighborhoods, it is essential to locate neighborhood centers with appropriate access and services for all residents. The main objective of this article is reaching to the location of neighborhood centers in a way that, most of issues relating to the physical features (such as the form of access network and texture permeability and etc.) and other qualities such as land uses, densities and social and economic features can be done simultaneously. This paper attempts to use methods of spatial analysis in order to surveying spatial structure and space syntax of urban textures and Urban Network Analysis Systems. This can be done by one of GIS toolbars which is named UNA (Urban Network Analysis) with the use of its five functions (include: Reach, Betweenness, Gravity, Closeness, Straightness).These functions were written according to space syntax theory and offer its relating output. This paper tries to locate and evaluate the optimal location of neighborhood centers in order to create local centers. This is done through weighing of each of these functions and taking into account of spatial features.Keywords: evaluate optimal location, Local centers, location of neighborhood centers, Spatial analysis, Urban network
Procedia PDF Downloads 463
1056 Enhancing Temporal Extrapolation of Wind Speed Using a Hybrid Technique: A Case Study in West Coast of Denmark
Authors: B. Elshafei, X. Mao
Abstract:
The demand for renewable energy is increasing significantly, and major investments are flowing into the wind power generation industry as a leading source of clean energy. The wind energy sector is entirely dependent on and driven by the prediction of wind speed, which by the nature of wind is highly stochastic and random. This study employs deep multi-fidelity Gaussian process regression to predict wind speeds over medium-term time horizons. Data from the RUNE experiment on the west coast of Denmark were provided by the Technical University of Denmark and represent the wind speed across the study area between December 2015 and March 2016. The study investigates the effect of pre-processing the data by denoising the signal using the empirical wavelet transform (EWT) and of using the vector components of wind speed to increase the number of input data layers for data fusion with deep multi-fidelity Gaussian process regression (GPR). The outcomes were compared using the root mean square error (RMSE), and the results demonstrated a significant increase in prediction accuracy, showing that using the vector components of the wind speed as additional predictors yields more accurate predictions than strategies that ignore them. This reflects the importance of including all sub-data and pre-processing the signals in wind speed forecasting models.
Keywords: data fusion, Gaussian process regression, signal denoise, temporal extrapolation
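A minimal sketch of the RMSE comparison described above, assuming a plain single-fidelity Gaussian process regressor from scikit-learn and synthetic data in place of the RUNE measurements (the authors' deep multi-fidelity model and EWT denoising step are not reproduced here); it only illustrates adding the u, v wind components as extra predictors:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic stand-in for a wind time series: u, v components and the derived speed
t = np.linspace(0, 10, 400)
u = 5 + 2 * np.sin(0.7 * t) + rng.normal(0, 0.3, t.size)
v = 1 + 1.5 * np.cos(0.4 * t) + rng.normal(0, 0.3, t.size)
speed = np.hypot(u, v)

# Predict the speed a few steps ahead from lagged inputs
lag, horizon = 5, 3
rows = range(lag, t.size - horizon)
y = np.array([speed[i + horizon] for i in rows])
X_speed_only = np.array([speed[i - lag:i] for i in rows])
X_with_uv = np.array([np.r_[speed[i - lag:i], u[i - lag:i], v[i - lag:i]] for i in rows])

split = int(0.8 * len(y))
kernel = RBF() + WhiteKernel()
for name, X in [("speed only", X_speed_only), ("speed + u,v components", X_with_uv)]:
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X[:split], y[:split])
    rmse = np.sqrt(mean_squared_error(y[split:], gpr.predict(X[split:])))
    print(f"{name}: RMSE = {rmse:.3f} m/s")
```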
Procedia PDF Downloads 136
1055 Research on Characteristics and Inventory Planning Counter-Measure of Mature Industrial Zones in the Background of China's New Normal
Authors: Dong Chen, Han Song, Tingting Wei
Abstract:
Industrial zones have made significant contributions to the economic development of Chinese urban areas for decades. In the background of China's New Normal, numbers of mature industrial zones are stepping into a new stage of inventory development instead of increment development. The aim of this study is to discover new characteristics and problems and corresponding inventory planning guidance of mature industrial zones. A case of Yangzhou Hi-Tech Industrial Development Zone is reported in this study. Based on a historical analysis and data analysis of land-use, it is found that land-use of the zone is near saturation and signs of land updating have begun to appear. It is observed that the zone is facing problems including disorder of land development, low economic productivity and single function. Through the data of economic output, tax contribution, industrial category, industry life cycle and environmental influence, a comprehensive assessment based on two dimensions, economic benefits and industrial matchup, is made upon every parcel in the zone. According to the assessment, the zone is divided into spatial units of the update with specific planning guidance. It comes to a conclusion as four directions of inventory planning guidance in mature industrial zones: moving industries with poor economic benefit and negative environmental influence, adding urban function and new industrial function to the zone, optimizing the function of important space, and restricting the mass layout of the real estate industry to provide space for industrial upgrading.Keywords: China's new normal, mature industrial zones, land-use, inventory planning
Procedia PDF Downloads 452
1054 Role of Basic Health Units in Provision of Primary Health Services in District Swabi
Authors: Naila Awan, Shahrukh Inam
Abstract:
This study was conducted to highlight the role of basic health units in district Swabi, which provides primary health services to the people of district Swabi having four tehsils. Tehsil Swabi was selected purposively for the study. Three villages were purposively selected from district Swabi. A sum of 110 respondents was randomly selected for interview i.e., 27 from Botakaa, 39 from Gulatee, and 44 from Darra Cham, using proportion allocation sampling technique. A pretested and well-designed interview schedule was used to collect as per the objective and Chi square test was applied to find an association between the quality of medicines and health improvement. The output of the test shows that the government was doing its best and providing enough facilities to the individuals at the healthcare units, and they were utilizing them. These resources were easily accessible to the people of the community. Medicines provided by the government were of good quality and quantity. There were also school health sessions and community health sessions (SHS/CHS) to deliver useful information and awareness regarding health problems and diseases were conducted. The staff of the BHU was present at work time and was performing their duties. The respondents seemed satisfied with their behavior and the duty of the staff. However, there were no emergency resources existing at the BHU after the working hours of the medical staff. It is recommended that government should provide appropriate quantity and quality of medicines to the basic health units so that these healthcare units don’t have to face any shortages regarding medicines at the end of the month. In addition, laboratory and blood testing facilities need to be provided in the basic health units, and also the infrastructure should be made suitable, satisfactory, and more functional.Keywords: community health session, basic health units, outpatient department, tuberculosis
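The association between medicine quality and health improvement was tested with a chi-square test. A minimal sketch using SciPy with a hypothetical 2x2 contingency table (the counts below are illustrative, not the survey data):

```python
from scipy.stats import chi2_contingency

# Rows: medicine quality rated (good, poor); columns: health improved (yes, no).
# Counts are hypothetical and only illustrate the test used in the study.
contingency_table = [[62, 13],
                     [15, 20]]

chi2, p_value, dof, expected = chi2_contingency(contingency_table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p-value = {p_value:.4f}")
if p_value < 0.05:
    print("Association between medicine quality and health improvement is significant.")
```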
Procedia PDF Downloads 83
1053 Evaluating Emission Reduction Due to a Proposed Light Rail Service: A Micro-Level Analysis
Authors: Saeid Eshghi, Neeraj Saxena, Abdulmajeed Alsultan
Abstract:
Carbon dioxide (CO2) alongside other gas emissions in the atmosphere cause a greenhouse effect, resulting in an increase of the average temperature of the planet. Transportation vehicles are among the main contributors of CO2 emission. Stationary vehicles with initiated motors produce more emissions than mobile ones. Intersections with traffic lights that force the vehicles to become stationary for a period of time produce more CO2 pollution than other parts of the road. This paper focuses on analyzing the CO2 produced by the traffic flow at Anzac Parade Road - Barker Street intersection in Sydney, Australia, before and after the implementation of Light rail transport (LRT). The data are gathered during the construction phase of the LRT by collecting the number of vehicles on each path of the intersection for 15 minutes during the evening rush hour of 1 week (6-7 pm, July 04-31, 2018) and then multiplied by 4 to calculate the flow of vehicles in 1 hour. For analyzing the data, the microscopic simulation software “VISSIM” has been used. Through the analysis, the traffic flow was processed in three stages: before and after implementation of light rail train, and one during the construction phase. Finally, the traffic results were input into another software called “EnViVer”, to calculate the amount of CO2 during 1 h. The results showed that after the implementation of the light rail, CO2 will drop by a minimum of 13%. This finding provides an evidence that light rail is a sustainable mode of transport.Keywords: carbon dioxide, emission modeling, light rail, microscopic model, traffic flow
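Two small calculations underpin the data preparation and the headline result: scaling 15-minute turning counts to hourly flows, and expressing the CO2 change as a percentage. A minimal sketch with hypothetical counts and emission totals (the VISSIM/EnViVer modeling itself is not reproduced):

```python
def hourly_flow(count_15min: int) -> int:
    """Scale a 15-minute vehicle count to an hourly flow, as done for the simulation input."""
    return count_15min * 4

def emission_reduction_pct(co2_before_kg: float, co2_after_kg: float) -> float:
    """Percentage drop in CO2 between the before- and after-LRT scenarios."""
    return 100.0 * (co2_before_kg - co2_after_kg) / co2_before_kg

print(hourly_flow(265))                                            # e.g. 265 veh / 15 min -> 1060 veh/h
print(f"{emission_reduction_pct(410.0, 356.7):.1f}% reduction")    # hypothetical totals -> ~13%
```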
Procedia PDF Downloads 143
1052 Setting Uncertainty Conditions Using Singular Values for Repetitive Control in State Feedback
Authors: Muhammad A. Alsubaie, Mubarak K. H. Alhajri, Tarek S. Altowaim
Abstract:
A repetitive controller designed to accommodate periodic disturbances via state feedback is discussed. Periodic disturbances can be represented by a time delay model in a positive feedback loop acting on system output. A direct use of the small gain theorem solves the periodic disturbances problem via 1) isolating the delay model, 2) finding the overall system representation around the delay model and 3) designing a feedback controller that assures overall system stability and tracking error convergence. This paper addresses uncertainty conditions for the repetitive controller designed in state feedback in either past error feedforward or current error feedback using singular values. The uncertainty investigation is based on the overall system found and the stability condition associated with it; depending on the scheme used, to set an upper/lower limit weighting parameter. This creates a region that should not be exceeded in selecting the weighting parameter which in turns assures performance improvement against system uncertainty. Repetitive control problem can be described in lifted form. This allows the usage of singular values principle in setting the range for the weighting parameter selection. The Simulation results obtained show a tracking error convergence against dynamic system perturbation if the weighting parameter chosen is within the range obtained. Simulation results also show the advantage of weighting parameter usage compared to the case where it is omitted.Keywords: model mismatch, repetitive control, singular values, state feedback
Procedia PDF Downloads 155
1051 Waste-Based Surface Modification to Enhance Corrosion Resistance of Aluminium Bronze Alloy
Authors: Wilson Handoko, Farshid Pahlevani, Isha Singla, Himanish Kumar, Veena Sahajwalla
Abstract:
Aluminium bronze alloys are well known for their superior abrasion, tensile strength and non-magnetic properties, due to the co-presence of iron (Fe) and aluminium (Al) as alloying elements and have been commonly used in many industrial applications. However, continuous exposure to the marine environment will accelerate the risk of a tendency to Al bronze alloys parts failures. Although a higher level of corrosion resistance properties can be achieved by modifying its elemental composition, it will come at a price through the complex manufacturing process and increases the risk of reducing the ductility of Al bronze alloy. In this research, the use of ironmaking slag and waste plastic as the input source for surface modification of Al bronze alloy was implemented. Microstructural analysis conducted using polarised light microscopy and scanning electron microscopy (SEM) that is equipped with energy dispersive spectroscopy (EDS). An electrochemical corrosion test was carried out through Tafel polarisation method and calculation of protection efficiency against the base-material was determined. Results have indicated that uniform modified surface which is as the result of selective diffusion process, has enhanced corrosion resistance properties up to 12.67%. This approach has opened a new opportunity to access various industrial utilisations in commercial scale through minimising the dependency on natural resources by transforming waste sources into the protective coating in environmentally friendly and cost-effective ways.Keywords: aluminium bronze, waste-based surface modification, tafel polarisation, corrosion resistance
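The protection efficiency quoted (12.67%) is conventionally computed from the corrosion current densities extracted from the Tafel plots. A short sketch of that calculation, with hypothetical current densities (not the measured values from the paper):

```python
def protection_efficiency(i_corr_bare: float, i_corr_treated: float) -> float:
    """Protection efficiency (%) from Tafel-derived corrosion current densities (A/cm^2)."""
    return 100.0 * (i_corr_bare - i_corr_treated) / i_corr_bare

# Hypothetical corrosion current densities for bare and surface-modified Al bronze
i_bare = 3.00e-6
i_modified = 2.62e-6
print(f"Protection efficiency = {protection_efficiency(i_bare, i_modified):.2f}%")  # ~12.67%
```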
Procedia PDF Downloads 236
1050 An Improved Total Variation Regularization Method for Denoising Magnetocardiography
Authors: Yanping Liao, Congcong He, Ruigang Zhao
Abstract:
The application of magnetocardiography signals to detect cardiac electrical function is a new technology developed in recent years. The magnetocardiography signal is detected with Superconducting Quantum Interference Devices (SQUID) and has considerable advantages over electrocardiography (ECG). It is difficult to extract Magnetocardiography (MCG) signal which is buried in the noise, which is a critical issue to be resolved in cardiac monitoring system and MCG applications. In order to remove the severe background noise, the Total Variation (TV) regularization method is proposed to denoise MCG signal. The approach transforms the denoising problem into a minimization optimization problem and the Majorization-minimization algorithm is applied to iteratively solve the minimization problem. However, traditional TV regularization method tends to cause step effect and lacks constraint adaptability. In this paper, an improved TV regularization method for denoising MCG signal is proposed to improve the denoising precision. The improvement of this method is mainly divided into three parts. First, high-order TV is applied to reduce the step effect, and the corresponding second derivative matrix is used to substitute the first order. Then, the positions of the non-zero elements in the second order derivative matrix are determined based on the peak positions that are detected by the detection window. Finally, adaptive constraint parameters are defined to eliminate noises and preserve signal peak characteristics. Theoretical analysis and experimental results show that this algorithm can effectively improve the output signal-to-noise ratio and has superior performance.Keywords: constraint parameters, derivative matrix, magnetocardiography, regular term, total variation
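As an illustration of the high-order TV idea, the sketch below implements a standard majorization-minimization (MM) iteration for second-order TV denoising with a fixed regularization weight; it is not the authors' adaptive, peak-aware variant, and the signal is a synthetic toy example:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

def second_difference_matrix(n: int):
    """Sparse (n-2) x n second-order difference matrix used for high-order TV."""
    return diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))

def tv2_denoise_mm(y, lam=1.0, n_iter=30, eps=1e-8):
    """Second-order TV denoising, minimizing 0.5*||y-x||^2 + lam*||D2 x||_1
    via a majorization-minimization iteration."""
    y = np.asarray(y, dtype=float)
    D = second_difference_matrix(y.size)
    x = y.copy()
    Dy = D @ y
    for _ in range(n_iter):
        # Majorize |D2 x| at the current iterate, then solve the quadratic subproblem
        w = np.abs(D @ x) + eps
        A = diags(w / lam) + D @ D.T
        x = y - D.T @ spsolve(A.tocsc(), Dy)
    return x

# Toy example: noisy piecewise-linear signal
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 300)
clean = np.where(t < 0.5, 2 * t, 2 - 2 * t)
noisy = clean + rng.normal(0, 0.05, t.size)
denoised = tv2_denoise_mm(noisy, lam=1.0)
print(f"error std before: {np.std(noisy - clean):.3f}, after: {np.std(denoised - clean):.3f}")
```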
Procedia PDF Downloads 153
1049 Query Task Modulator: A Computerized Experimentation System to Study Media-Multitasking Behavior
Authors: Premjit K. Sanjram, Gagan Jakhotiya, Apoorv Goyal, Shanu Shukla
Abstract:
In psychological research, laboratory experiments often face a trade-off between experimental control and mundane realism. With the advent of Immersive Virtual Environment Technology (IVET), this issue seems to be at bay. However, there is a growing challenge within IVET itself to design and develop systems or software that capture the psychological phenomena of everyday life. One such phenomenon of growing interest is 'media-multitasking'. To aid laboratory research on media-multitasking, this paper introduces the Query Task Modulator (QTM), a computerized experimentation system for studying media-multitasking behavior in a controlled laboratory environment. The system provides a computerized platform on which experimenters can study media-multitasking while participants are engaged in a query task. The system has Instant Messaging, E-mail, and Voice Call features. The answers to queries are provided in an information panel on the left-hand side, where participants have to search for them and enter the information into the respective communication media blocks as fast as possible. On the whole, the system collects multitasking behavioral data. To analyze performance, a separate output table records the reaction times and responses of each participant individually. The information panel and all the media blocks appear in a single window in order to ensure the multi-modality of media-multitasking and equal emphasis on all the tasks (thus avoiding prioritization of a particular task). The paper discusses the development of QTM in the light of current techniques for studying media-multitasking.
Keywords: experimentation system, human performance, media-multitasking, query-task
Procedia PDF Downloads 557
1048 Optimization of the Mechanical Performance of Fused Filament Fabrication Parts
Authors: Iván Rivet, Narges Dialami, Miguel Cervera, Michele Chiumenti
Abstract:
Process parameters in Additive Manufacturing (AM) play a critical role in the mechanical performance of the final component. In order to find the input configuration that guarantees the optimal performance of the printed part, the process-performance relationship must be found. Fused Filament Fabrication (FFF) is the selected demonstrative AM technology due to its great popularity in the industrial manufacturing world. A material model that considers the different printing patterns present in a FFF part is used. A voxelized mesh is built from the manufacturing toolpaths described in the G-Code file. An Adaptive Mesh Refinement (AMR) based on the octree strategy is used in order to reduce the complexity of the mesh while maintaining its accuracy. High-fidelity and cost-efficient Finite Element (FE) simulations are performed and the influence of key process parameters in the mechanical performance of the component is analyzed. A robust optimization process based on appropriate failure criteria is developed to find the printing direction that leads to the optimal mechanical performance of the component. The Tsai-Wu failure criterion is implemented due to the orthotropy and heterogeneity constitutive nature of FFF components and because of the differences between the strengths in tension and compression. The optimization loop implements a modified version of an Anomaly Detection (AD) algorithm and uses the computed metrics to obtain the optimal printing direction. The developed methodology is verified with a case study on an industrial demonstrator.Keywords: additive manufacturing, optimization, printing direction, mechanical performance, voxelization
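The Tsai-Wu criterion mentioned above reduces, for plane stress, to a quadratic failure index built from the tensile/compressive and shear strengths. A small sketch of that index, using the common approximation for the interaction coefficient; the strength values below are placeholders, not measured FFF material data:

```python
import math

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Plane-stress Tsai-Wu failure index; failure is predicted when the index >= 1.
    s1, s2: normal stresses along/across the raster; t12: in-plane shear (MPa).
    Xt/Xc, Yt/Yc: tensile/compressive strengths; S: shear strength (MPa)."""
    F1, F2 = 1 / Xt - 1 / Xc, 1 / Yt - 1 / Yc
    F11, F22, F66 = 1 / (Xt * Xc), 1 / (Yt * Yc), 1 / S**2
    F12 = -0.5 * math.sqrt(F11 * F22)   # common default for the interaction coefficient
    return (F1 * s1 + F2 * s2 + F11 * s1**2 + F22 * s2**2
            + F66 * t12**2 + 2 * F12 * s1 * s2)

# Placeholder strengths for an FFF polymer (MPa) and a trial stress state
index = tsai_wu_index(s1=30.0, s2=5.0, t12=8.0, Xt=45.0, Xc=40.0, Yt=30.0, Yc=35.0, S=20.0)
print(f"Tsai-Wu failure index = {index:.2f} ({'fails' if index >= 1 else 'safe'})")
```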
Procedia PDF Downloads 63
1047 Convolutional Neural Networks-Optimized Text Recognition with Binary Embeddings for Arabic Expiry Date Recognition
Authors: Mohamed Lotfy, Ghada Soliman
Abstract:
Recognizing Arabic dot-matrix digits is a challenging problem due to the unique characteristics of dot-matrix fonts, such as irregular dot spacing and varying dot sizes. This paper presents an approach for recognizing Arabic digits printed in dot matrix format. The proposed model is based on Convolutional Neural Networks (CNN) that take the dot matrix as input and generate embeddings that are rounded to generate binary representations of the digits. The binary embeddings are then used to perform Optical Character Recognition (OCR) on the digit images. To overcome the challenge of the limited availability of dotted Arabic expiration date images, we developed a True Type Font (TTF) for generating synthetic images of Arabic dot-matrix characters. The model was trained on a synthetic dataset of 3287 images and 658 synthetic images for testing, representing realistic expiration dates from 2019 to 2027 in the format of yyyy/mm/dd. Our model achieved an accuracy of 98.94% on the expiry date recognition with Arabic dot matrix format using fewer parameters and less computational resources than traditional CNN-based models. By investigating and presenting our findings comprehensively, we aim to contribute substantially to the field of OCR and pave the way for advancements in Arabic dot-matrix character recognition. Our proposed approach is not limited to Arabic dot matrix digit recognition but can also be extended to text recognition tasks, such as text classification and sentiment analysis.Keywords: computer vision, pattern recognition, optical character recognition, deep learning
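A minimal PyTorch sketch of the core idea: a small CNN emits an embedding that is rounded to a binary code and matched against per-class codes. The architecture, embedding width, and codebook here are assumptions for illustration, not the authors' trained model:

```python
import torch
import torch.nn as nn

class BinaryEmbeddingCNN(nn.Module):
    """Small CNN that maps a digit image to an embedding squashed into [0, 1];
    rounding the embedding yields the binary code used for recognition."""
    def __init__(self, embedding_bits: int = 16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 4 * 4, embedding_bits), nn.Sigmoid())

    def forward(self, x):
        return self.head(self.features(x))

model = BinaryEmbeddingCNN(embedding_bits=16)
image = torch.randn(1, 1, 32, 32)           # stand-in for a dot-matrix digit crop
binary_code = (model(image) > 0.5).int()    # rounded embedding -> binary representation

# A hypothetical codebook maps each digit class 0-9 to a reference binary code;
# the predicted class is the codebook entry with the smallest Hamming distance.
codebook = torch.randint(0, 2, (10, 16))
hamming = (binary_code ^ codebook).sum(dim=1)
print("predicted digit:", int(hamming.argmin()))
```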
Procedia PDF Downloads 94
1046 Effectiveness of Lowering the Water Table as a Mitigation Measure for Foundation Settlement in Liquefiable Soils Using 1-g Scale Shake Table Test
Authors: Kausar Alam, Mohammad Yazdi, Peiman Zogh, Ramin Motamed
Abstract:
An earthquake is an unpredictable natural disaster. It induces liquefaction, which causes considerable damage to structures, lifelines, and piping systems because of ground settlement. As a result, there is great concern about how to address this problem. Previous researchers have adopted different ground improvement techniques to reduce the settlement of structures during earthquakes. This study evaluates the effectiveness of lowering the water table as a technique to mitigate foundation settlement in liquefiable soil. The performance is evaluated based on the foundation settlement and the reduction of excess pore water pressure. In this study, a scaled model was prepared based on a full-scale shake table experiment conducted at the University of California, San Diego (UCSD). The model ground consists of three soil layers having relative densities of 55%, 45%, and 90%, respectively. A shallow foundation is seated over an unsaturated crust layer. After preparation of the model ground, the water table was measured to be at 45, 40, and 35 cm (from the bottom). Then, the input motions were applied for 10 seconds, with a peak acceleration of 0.25 g and a constant frequency of 2.73 Hz. Based on the experimental results, the effectiveness of lowering the water table in reducing the foundation settlement and excess pore water pressure was evident: the foundation settlement was reduced from 50 mm to 5 mm. In addition, lowering the water table as a mitigation measure is a cost-effective way to decrease liquefaction-induced building settlement.
Keywords: foundation settlement, ground water table, liquefaction, shake table test
Procedia PDF Downloads 114
1045 Use of Two-Dimensional Hydraulics Modeling for Design of Erosion Remedy
Authors: Ayoub. El Bourtali, Abdessamed.Najine, Amrou Moussa. Benmoussa
Abstract:
One of the main goals of river engineering is river training, defined as controlling and predicting the behavior of a river: taking effective measures to eliminate the related risks and thus improve the river system. In some rivers, the riverbed continues to erode and degrade; therefore, equilibrium will never be reached. Generally, river geometric characteristics and riverbed erosion analysis are among the most complex but critical topics in river engineering and sediment hydraulics; riverbank erosion is a further governing process in river hydrodynamics, and it has a major impact on the ecological chain and on socio-economic processes. This study aims to apply computer technology that can analyze erosion and hydraulic problems through simulation and modeling. Choosing the right model remains a difficult and sensitive job for field engineers. This paper makes use of version 5.0.4 of the HEC-RAS model. The river section is adopted according to the gauged station and the proximity of the adjustment. In this work, we demonstrate how 2D hydraulic modeling helped clarify the design and provide visuals of the depths and velocities at the riverbanks and throughout the associated structures. The Hydrologic Engineering Center's River Analysis System (HEC-RAS) 2D model was used to carry out a hydraulic study of the erosion model. The geometric data were generated from a 12.5-meter x 12.5-meter resolution digital elevation model. In addition to showing eroded or overturned river sections, the model output also shows patterns of riverbank change, which can help reduce problems caused by erosion.
Keywords: 2D hydraulics model, erosion, floodplain, hydrodynamic, HEC-RAS, riverbed erosion, river morphology, resolution digital data, sediment
Procedia PDF Downloads 189
1044 The Immediate Effects of Thrust Manipulation for Thoracic Hyperkyphosis
Authors: Betul Taspinar, Eda O. Okur, Ismail Saracoglu, Ismail Okur, Ferruh Taspinar
Abstract:
Thoracic hyperkyphosis, is a well-known spinal phenomenon, refers to an excessive curvature (> 40 degrees) of the thoracic spine. The aim of this study was to explore the effectiveness of thrust manipulation on thoracic spine alignment. 31 young adults with hyperkyphosis diagnosed with Spinal Mouse® device were randomly assigned either thrust manipulation group (n=16, 11 female, 5 male) or sham manipulation group (n=15, 8 female, 7 male). Thrust and sham manipulations were performed by a blinded physiotherapist who is a certificated expert in musculoskeletal physiotherapy. Thoracic kyphosis degree was measured after the interventions via Spinal Mouse®. Wilcoxon test was used to analyse the data obtained before and after the manipulation for each group, whereas Mann-Whitney U test was used to compare the groups. The mean of baseline thoracic kyphosis degrees in thrust and sham groups were 50.69 o ± 7.73 and 48.27o ± 6.43, respectively. There was no statistically significant difference between groups in terms of initial thoracic kyphosis degrees (p=0.51). After the interventions, the mean of thoracic kyphosis degree in thrust and sham groups were measured as 44.06o ± 6.99 and 48.93o ± 6.57 respectively (p=0.03). There was no statistically significant difference between before and after interventions in sham group (p=0.33), while the mean of thoracic kyphosis degree in thrust group decreased significantly (p=0.00). Thrust manipulation can attenuate thoracic hyperkyphosis immediately in young adults by not using placebo effect. Manipulation might provide accurate proprioceptive (sensory) input to the spine joints and reduce kyphosis by restoring normal segment mobility. Therefore thoracic manipulation might be included in the physiotherapy programs to treat hyperkyphosis.Keywords: hyperkyphosis, manual therapy, spinal mouse, physiotherapy
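A minimal sketch of the statistical workflow named in the abstract (within-group Wilcoxon signed-rank test, between-group Mann-Whitney U test), using SciPy with simulated kyphosis angles in place of the clinical data:

```python
import numpy as np
from scipy.stats import wilcoxon, mannwhitneyu

rng = np.random.default_rng(7)

# Hypothetical pre/post thoracic kyphosis angles (degrees) for the two groups
thrust_pre  = rng.normal(50.7, 7.7, 16)
thrust_post = thrust_pre - rng.normal(6.6, 2.0, 16)   # manipulation group improves
sham_pre    = rng.normal(48.3, 6.4, 15)
sham_post   = sham_pre + rng.normal(0.7, 2.0, 15)     # sham group essentially unchanged

# Within-group comparison (before vs after): Wilcoxon signed-rank test
print("thrust pre vs post:", wilcoxon(thrust_pre, thrust_post))
print("sham   pre vs post:", wilcoxon(sham_pre, sham_post))

# Between-group comparison of the post-intervention angles: Mann-Whitney U test
print("thrust vs sham after:", mannwhitneyu(thrust_post, sham_post))
```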
Procedia PDF Downloads 345
1043 Fluorescence Effect of Carbon Dots Modified with Silver Nanoparticles
Authors: Anna Piasek, Anna Szymkiewicz, Gabriela Wiktor, Jolanta Pulit-Prociak, Marcin Banach
Abstract:
Carbon dots (CDs) have great potential for application in many fields of science. They are characterized by fluorescent properties that can be manipulated. In addition to its unique properties, this nanomaterial has many advantages: CDs can be obtained easily and undergo surface functionalization in a simple way. In addition, there is a wide range of raw materials that can be used for their synthesis; an interesting possibility is the use of numerous waste materials of natural origin. In the research presented here, the synthesis of CDs was carried out according to the principles of green chemistry. Beet molasses was used as the natural raw material; its high sugar content makes it an excellent high-carbon precursor for obtaining CDs. To increase the fluorescence effect, we modified the surface of the CDs with silver nanoparticles (Ag-CDs). The process of obtaining the CDs was based on the hydrothermal method with applied microwave radiation. Silver nanoparticles were formed via the chemical reduction method. The syntheses were planned using the Design of Experiments (DoE) method. Variable process parameters such as the concentration of beet molasses, the temperature, and the concentration of nanosilver were used in these syntheses; they affected the properties and parameters of the obtained particles. The Ag-CDs were analyzed by UV-Vis spectroscopy. The fluorescence properties and the selection of the appropriate excitation wavelength were determined by spectrofluorimetry. Particle sizes were checked using the DLS method. The influence of the input parameters on the obtained results was also studied.
Keywords: fluorescence, modification, nanosilver, molasses, Green chemistry, carbon dots
Procedia PDF Downloads 84
1042 Accelerating Quantum Chemistry Calculations: Machine Learning for Efficient Evaluation of Electron-Repulsion Integrals
Authors: Nishant Rodrigues, Nicole Spanedda, Chilukuri K. Mohan, Arindam Chakraborty
Abstract:
A crucial objective in quantum chemistry is the computation of the energy levels of chemical systems. This task requires electron-repulsion integrals as inputs, and the steep computational cost of evaluating these integrals poses a major numerical challenge for the efficient implementation of quantum chemical software. This work presents a moment-based machine-learning approach for the efficient evaluation of electron-repulsion integrals. These integrals were approximated using linear combinations of a small number of moments, and machine learning algorithms were applied to estimate the coefficients in the linear combination. A random forest approach was used to identify promising features using recursive feature elimination, which performed best for learning the sign of each coefficient but not its magnitude. A neural network with two hidden layers was then used to learn the coefficient magnitudes, along with an iterative feature-masking approach to perform input vector compression, identifying a small subset of orbitals whose coefficients are sufficient for the quantum state energy computation. Finally, a small ensemble of neural networks (with a median rule for decision fusion) was shown to improve the results compared to a single network.
Keywords: quantum energy calculations, atomic orbitals, electron-repulsion integrals, ensemble machine learning, random forests, neural networks, feature extraction
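A schematic sketch of the two-stage pipeline described (random forest with recursive feature elimination for the coefficient sign, two-hidden-layer network for the magnitude), using synthetic data in place of actual electron-repulsion integral moments; the feature count, layer sizes, and data are assumptions, not the authors' setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in: 2000 samples of 40 orbital-derived features -> one expansion coefficient
X = rng.normal(size=(2000, 40))
true_w = rng.normal(size=40) * (rng.random(40) > 0.6)     # only some features matter
coeff = X @ true_w + rng.normal(0, 0.1, 2000)

X_tr, X_te, c_tr, c_te = train_test_split(X, coeff, random_state=0)

# Stage 1: random forest + recursive feature elimination to learn the coefficient sign
sign_selector = RFE(RandomForestClassifier(n_estimators=100, random_state=0), n_features_to_select=10)
sign_selector.fit(X_tr, (c_tr > 0).astype(int))
sign_accuracy = sign_selector.score(X_te, (c_te > 0).astype(int))

# Stage 2: neural network with two hidden layers to learn the coefficient magnitude
magnitude_net = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0)
magnitude_net.fit(X_tr, np.abs(c_tr))

print(f"sign accuracy: {sign_accuracy:.3f}")
print(f"magnitude R^2: {magnitude_net.score(X_te, np.abs(c_te)):.3f}")
```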
Procedia PDF Downloads 114
1041 Conceptual Solution and Thermal Analysis of the Final Cooling Process of Biscuits in One Confectionary Factory in Serbia
Authors: Duško Salemović, Aleksandar Dedić, Matilda Lazić, Dragan Halas
Abstract:
The paper presents the conceptual solution for the final cooling of the chocolate dressing of biscuits in one confectionary factory in Serbia. The proposed concept solution was derived from the desired technological process of final cooling of biscuits and the required process parameters that were to be achieved, and which were an integral part of the project task. The desired process parameters for achieving proper hardening and coating formation are the exchanged amount of heat in the time unit between the two media (air and chocolate dressing), the speed of air inside the tunnel cooler, and the surface of all biscuits in contact with the air. These parameters were calculated in the paper. The final cooling of chocolate dressing on biscuits could be optimized by changing process parameters and dimensions of the tunnel cooler and looking for the appropriate values for them. The accurate temperature predictions and fluid flow analysis could be conducted by using heat balance and flow balance equations, having in mind the theory of similarity. Furthermore, some parameters were adopted from previous technology processes, such as the inlet temperature of biscuits and input air temperature. A thermal calculation was carried out, and it was demonstrated that the percentage error between the contact surface of the air and the chocolate biscuit topping, which is obtained from the heat balance and geometrically through the proposed conceptual solution, does not exceed 0.67%, which is a very good agreement. This enabled the quality of the cooling process of chocolate dressing applied on the biscuit and the hardness of its coating.Keywords: chocolate dressing, air, cooling, heat balance
Procedia PDF Downloads 79
1040 Design of SAE J2716 Single Edge Nibble Transmission Digital Sensor Interface for Automotive Applications
Authors: Jongbae Lee, Seongsoo Lee
Abstract:
Modern sensors often embed a small digital controller for sensor control, value calibration, and signal processing. These sensors require digital data communication with the host microprocessor, but conventional digital communication protocols are too heavy for low-cost implementations. The SAE J2716 SENT (single edge nibble transmission) protocol transmits digital waveforms directly instead of complicated analog modulated signals. In this paper, a SENT interface is designed in Verilog HDL (hardware description language) and implemented on an FPGA (field-programmable gate array) evaluation board. The designed SENT interface consists of a frame encoder/decoder, a configuration register, a tick period generator, a CRC (cyclic redundancy code) generator/checker, and TX/RX (transmission/reception) buffers. The frame encoder/decoder is implemented as a finite state machine and controls the whole SENT interface. The configuration register contains various parameters such as operation mode, tick length, CRC option, pause pulse option, and number of data nibbles. The tick period generator generates tick signals from the input clock. The CRC generator/checker generates or checks the CRC in the SENT data frame. The TX/RX buffers store the transmitted/received data. The designed SENT interface can send or receive digital data at 25-65 kbps with a 3 us tick. Synthesized in a 0.18 um fabrication technology, it is implemented in about 2,500 gates.
Keywords: digital sensor interface, SAE J2716, SENT, verilog HDL
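To illustrate the frame structure the encoder/decoder handles, here is a small Python model of SENT pulse timing (56-tick sync pulse, each nibble encoded as 12 + value ticks). This is not the Verilog design; the pause-pulse length is an example value, and the simplified CRC routine uses the commonly cited polynomial and seed, which should be verified against the J2716 standard:

```python
def crc4(nibbles, seed=0b0101):
    """Simplified 4-bit CRC (polynomial x^4 + x^3 + x^2 + 1, seed 0b0101).
    The exact SAE J2716 CRC procedure should be taken from the standard."""
    crc = seed
    for nib in nibbles:
        crc ^= nib & 0xF
        for _ in range(4):
            if crc & 0x8:
                crc = ((crc << 1) & 0xF) ^ 0b1101
            else:
                crc = (crc << 1) & 0xF
    return crc

def sent_frame_ticks(status, data_nibbles, pause=False):
    """Per-pulse tick counts of one SENT frame: a 56-tick sync pulse, then
    12 + value ticks for the status nibble, each data nibble, and the CRC nibble."""
    assert 1 <= len(data_nibbles) <= 6 and all(0 <= n <= 15 for n in data_nibbles)
    pulses = [56, 12 + status]
    pulses += [12 + n for n in data_nibbles]
    pulses.append(12 + crc4(data_nibbles))
    if pause:
        pulses.append(77)   # pause pulse length is configurable; 77 ticks is just an example
    return pulses

# Example: 12-bit sensor value 0x5A3 sent as three nibbles at a 3 us tick
ticks = sent_frame_ticks(status=0, data_nibbles=[0x5, 0xA, 0x3])
print(ticks, "-> frame duration:", sum(ticks) * 3, "us")
```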
Procedia PDF Downloads 301
1039 Coherent All-Fiber and Polarization Maintaining Source for CO2 Range-Resolved Differential Absorption Lidar
Authors: Erwan Negre, Ewan J. O'Connor, Juha Toivonen
Abstract:
The need for CO2 monitoring technologies grows simultaneously with worldwide concerns regarding environmental challenges. To that purpose, we developed a compact coherent all-fiber range-resolved Differential Absorption Lidar (RR-DIAL). It is designed around a tunable 2x1 fiber-optic switch, set to a frequency of 1 Hz, between two Distributed FeedBack (DFB) lasers emitting in continuous-wave mode at 1571.41 nm (an absorption line of CO2) and 1571.25 nm (a CO2 absorption-free line), with a linewidth of 1 MHz and a tuning range of 3 nm over the operating wavelength. A three-stage amplification through erbium and erbium-ytterbium doped fibers, coupled to a Radio Frequency (RF) driven Acousto-Optic Modulator (AOM), generates 100 ns pulses at a repetition rate from 10 to 30 kHz with a peak power of up to 2.5 kW and a spatial resolution of 15 m, allowing fast and highly resolved CO2 profiles. The same afocal collection system is used for the output of the laser source and for the backscattered light, which is then directed to a circulator before being mixed with the local oscillator for heterodyne detection. Packaged in an easily transportable box which also includes a server and a Field Programmable Gate Array (FPGA) card for on-line data processing and storage, our setup allows effective and quick deployment for versatile in-situ analysis, whether it be vertical atmospheric monitoring, large field mapping or continuous oversight of sequestration sites. The setup operation and results from initial field measurements will be discussed.
Keywords: CO2 profiles, coherent DIAL, in-situ atmospheric sensing, near infrared fiber source
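The 15 m spatial resolution quoted above follows directly from the 100 ns pulse duration; a one-line check:

```python
c = 3.0e8                  # speed of light, m/s
pulse_duration = 100e-9    # 100 ns pulse from the abstract
range_resolution = c * pulse_duration / 2
print(f"range resolution = {range_resolution:.1f} m")   # 15.0 m, matching the abstract
```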
Procedia PDF Downloads 128
1038 Use of Chlorophyll Meters to Assess In-Season Wheat Nitrogen Fertilizer Requirements in the Southern San Joaquin Valley
Authors: Brian Marsh
Abstract:
Nitrogen fertilizer is the most used and often the most mismanaged nutrient input. Nitrogen management has tremendous implications on crop productivity, quality and environmental stewardship. Sufficient nitrogen is needed to optimum yield and quality. Soil and in-season plant tissue testing for nitrogen status are a time consuming and expensive process. Real time sensing of plant nitrogen status can be a useful tool in managing nitrogen inputs. The objectives of this project were to assess the reliability of remotely sensed non-destructive plant nitrogen measurements compared to wet chemistry data from sampled plant tissue, develop in-season nitrogen recommendations based on remotely sensed data for improved nitrogen use efficiency and assess the potential for determining yield and quality from remotely sensed data. Very good correlations were observed between early-season remotely sensed crop nitrogen status and plant nitrogen concentrations and subsequent in-season fertilizer recommendations. The transmittance/absorbance type meters gave the most accurate readings. Early in-season fertilizer recommendation would be to apply 40 kg nitrogen per hectare plus 16 kg nitrogen per hectare for each unit difference measured with the SPAD meter between the crop and reference area or 25 kg plus 13 kg per hectare for each unit difference measured with the CCM 200. Once the crop was sufficiently fertilized meter readings became inconclusive and were of no benefit for determining nitrogen status, silage yield and quality and grain yield and protein.Keywords: wheat, nitrogen fertilization, chlorophyll meter
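The in-season recommendation quoted in the abstract converts the meter-reading difference between a well-fertilized reference area and the crop into a nitrogen rate. A direct transcription of those two rules (SPAD and CCM-200), with hypothetical readings:

```python
def n_rate_kg_per_ha(reference_reading, crop_reading, meter="SPAD"):
    """In-season N recommendation from the abstract:
    SPAD:    40 kg/ha + 16 kg/ha per unit difference (reference - crop)
    CCM-200: 25 kg/ha + 13 kg/ha per unit difference."""
    diff = max(reference_reading - crop_reading, 0.0)
    base, per_unit = (40.0, 16.0) if meter.upper() == "SPAD" else (25.0, 13.0)
    return base + per_unit * diff

print(n_rate_kg_per_ha(48.0, 45.5, meter="SPAD"))     # hypothetical readings -> 80 kg N/ha
print(n_rate_kg_per_ha(30.0, 28.0, meter="CCM200"))   # hypothetical readings -> 51 kg N/ha
```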
Procedia PDF Downloads 393
1037 Optimization Modeling of the Hybrid Antenna Array for the DoA Estimation
Authors: Somayeh Komeylian
Abstract:
The direction of arrival (DoA) estimation is the crucial aspect of the radar technologies for detecting and dividing several signal sources. In this scenario, the antenna array output modeling involves numerous parameters including noise samples, signal waveform, signal directions, signal number, and signal to noise ratio (SNR), and thereby the methods of the DoA estimation rely heavily on the generalization characteristic for establishing a large number of the training data sets. Hence, we have analogously represented the two different optimization models of the DoA estimation; (1) the implementation of the decision directed acyclic graph (DDAG) for the multiclass least-squares support vector machine (LS-SVM), and (2) the optimization method of the deep neural network (DNN) radial basis function (RBF). We have rigorously verified that the LS-SVM DDAG algorithm is capable of accurately classifying DoAs for the three classes. However, the accuracy and robustness of the DoA estimation are still highly sensitive to technological imperfections of the antenna arrays such as non-ideal array design and manufacture, array implementation, mutual coupling effect, and background radiation and thereby the method may fail in representing high precision for the DoA estimation. Therefore, this work has a further contribution on developing the DNN-RBF model for the DoA estimation for overcoming the limitations of the non-parametric and data-driven methods in terms of array imperfection and generalization. The numerical results of implementing the DNN-RBF model have confirmed the better performance of the DoA estimation compared with the LS-SVM algorithm. Consequently, we have analogously evaluated the performance of utilizing the two aforementioned optimization methods for the DoA estimation using the concept of the mean squared error (MSE).Keywords: DoA estimation, Adaptive antenna array, Deep Neural Network, LS-SVM optimization model, Radial basis function, and MSE
Procedia PDF Downloads 100
1036 Developing a Framework to Aid Sustainable Assessment in Indian Buildings
Authors: P. Amarnath, Albert Thomas
Abstract:
Buildings qualify to be the major consumer of energy and resources thereby urging the designers, architects and policy makers to place a great deal of effort in achieving and implementing sustainable building strategies in construction. Green building rating systems help a great deal in this by measuring the effectiveness of these strategies along with the escalation of building performance in social, environmental and economic perspective, and construct new sustainable buildings. However, for a country like India, enormous population and its rapid rate of growth impose an increasing burden on the country's limited and continuously degrading natural resource base, which also includes the land available for construction. In general, the number of sustainable rated buildings in India is very minimal primarily due to the complexity and obstinate nature of the assessment systems/regulations that restrict the stakeholders and designers in proper implementation and utilization of these rating systems. This paper aims to introduce a data driven and user-friendly framework which cross compares the present prominent green building rating systems such as LEED, BREEAM, and GRIHA and subsequently help the users to rate their proposed building design as per the regulations of these assessment frameworks. This framework is validated using the input data collected from green buildings constructed globally. The proposed system has prospects to encourage the users to test the efficiency of various sustainable construction practices and thereby promote more sustainable buildings in the country.Keywords: BREEAM, GRIHA, green building rating systems, LEED, sustainable buildings
Procedia PDF Downloads 138
1035 Implications of Fulani Herders/Farmers Conflict on the Socio-Economic Development of Nigeria (2000-2018)
Authors: Larry E. Udu, Joseph N. Edeh
Abstract:
Unarguably, land is an indispensable factor of production and has been central to numerous conflicts between crop farmers and herders in Nigeria. The conflicts pose a grave challenge to life and property, food security, and ultimately to the sustainable socio-economic development of the nation. The paper examines the causes of the Fulani herders/farmers conflicts, particularly in the Middle Belt; the number of occurrences and the extent of damage; and their socio-economic implications. A content analytical approach was adopted as the methodology, wherein data were drawn extensively from secondary sources. Findings reveal that the major causes of the conflict are attributable to the violation of tradition and laws, trespass, and cultural factors. Consequently, the frequency of attacks and the level of fatality, coupled with the displacement of farmers and the destruction of private and public facilities, impacted negatively on farmers' output, with attendant socio-economic implications for the sustainable livelihood of the people and the nation at large. For instance, Mercy Corps (a global humanitarian organization), in its research covering 2013-2016, asserts that a loss of $14 billion was incurred within 3 years; that if the conflict were resolved, the average affected household could see its income increase by at least 64 percent, and potentially 210 percent or higher; and that states affected by the conflicts lost an average of 47 percent of taxes/IGR. The paper therefore recommends strict adherence to grazing laws, a platform for dialogue built on compromises where necessary, and the encouragement of cattle farmers to build ranches for their cattle according to international standards.
Keywords: conflict, farmers, herders, Nigeria, socio-economic implications
Procedia PDF Downloads 207
1034 An Investigation of Customer Relationship Management of Tourism
Authors: Wanida Suwunniponth
Abstract:
This research paper aimed to develop a causal relationship model of the success factors of customer relationship management (CRM) in tourism in Thailand and to investigate the relationships among the potential factors that facilitate the success of CRM. The research was conducted using both quantitative and qualitative methods, utilizing a questionnaire and in-depth interviews. The questionnaire was used to collect data from 250 management staff in hotels located within the Bangkok area. Sampling techniques used in this research included cluster sampling according to service quality and simple random sampling. The data were analyzed using descriptive analysis and a Structural Equation Model (SEM). The research findings identified the factors accentuated by most respondents as important to the success of CRM: organization, people, information technology, and the CRM process. Moreover, the customer relationship management of tourism businesses in Thailand was found to be successful at a very significant level. The hypothesis testing showed that the hypothesis was accepted: the factors concerning organization, people, and information technology influenced the process and the success of customer relationship management, whereas the CRM process factor determined its success. The findings suggest that tourism businesses in Thailand implementing customer relationship management should adopt improvement approaches in terms of managerial structure, building a customer-centered corporate culture, and investment in information technology and customer analysis, in order to enable a more efficient customer relationship management process that would result in customer satisfaction and retention.
Keywords: customer relationship management, causal relationship model, tourism, Thailand
Procedia PDF Downloads 330
1033 Application of Sustainable Agriculture Based on LEISA in Landscape Design of Integrated Farming
Authors: Eduwin Eko Franjaya, Andi Gunawan, Wahju Qamara Mugnisjah
Abstract:
Sustainable agriculture in the form of integrated farming with its LEISA (Low External Input Sustainable Agriculture) concept has brought a positive impact on agriculture development and ambient amelioration. But, most of the small farmers in Indonesia did not know how to put the concept of it and how to combine agricultural commodities on the site effectively and efficiently. This research has an aim to promote integrated farming (agrofisheries, etc) to the farmers by designing the agricultural landscape to become integrated farming landscape as medium of education for the farmers. The method used in this research is closely related with the rule of design in the landscape architecture science. The first step is inventarization for the existing condition on the research site. The second step is analysis. Then, the third step is concept-making that consists of base concept, design concept, and developing concept. The base concept used in this research is sustainable agriculture with LEISA. The concept design is related with activity base on site. The developing concept consists of space concept, circulation, vegetation and commodity, production system, etc. The fourth step as the final step is planning and design. This step produces site plan of integrated farming based on LEISA. The result of this research is site plan of integrated farming with its explanation, including the energy flow of integrated farming system on site and the production calendar of integrated farming commodities for education and agri-tourism opportunity. This research become the right way to promote the integrated farming and also as a medium for the farmers to learn and to develop it.Keywords: integrated farming, LEISA, planning and design, site plan
Procedia PDF Downloads 512
1032 Query in Grammatical Forms and Corpus Error Analysis
Authors: Katerina Florou
Abstract:
Two decades after the term "learner corpora" was coined to describe collections of texts created by foreign or second language learners across various language contexts, and some years after the suggestion to incorporate "focusing on form" within a Task-Based Learning framework, this study aims to explore how learner corpora, whether annotated with errors or not, can facilitate a focus on form in an educational setting. It argues that analyzing linguistic form serves the purpose of enabling students to delve into language and gain an understanding of different facets of the foreign language. This same objective is applicable when analyzing learner corpora marked with errors or in their raw state, but in this scenario, the emphasis lies on identifying incorrect forms. Teachers should aim to address errors or gaps in the students' second language knowledge while they engage in a task. Building on this recommendation, we compared the written output of two student groups: the first group (G1) carried out the focusing-on-form phase by studying a specific aspect of the Italian language, namely the past participle, through examples from native speakers and grammar rules; the second group (G2) focused on form by scrutinizing their own errors and comparing them with analogous examples from a native speaker corpus. In order to test our hypothesis, we created four learner corpora. The initial two were generated during the task phase, one for each group of students, while the remaining two were produced as a follow-up activity at the end of the lesson. The results of the first comparison indicated that students' exposure to their own errors can enhance their grasp of a grammatical element. The study is in its second stage, and more results are to be announced.
Keywords: corpus interlanguage analysis, task-based learning, Italian language as FL, learner corpora
Procedia PDF Downloads 53
1031 Relationship between Wave Velocities and Geo-Pressures in Shallow Libyan Carbonate Reservoir
Authors: Tarek Sabri Duzan
Abstract:
Knowledge of the magnitude of the geo-pressures (pore, fracture, and overburden pressures) is vital, especially during drilling, completions, stimulations, and enhanced oil recovery. Many problems, like lost circulation, could have been avoided if techniques for calculating geo-pressures had been employed in well planning, mud weight planning, and casing design. In this paper, we focus on the relationships between the geo-pressures and the wave velocities (P-wave (Vp) and S-wave (Vs)) in a shallow Libyan carbonate reservoir in the western part of the Sirte Basin (Dahra F-Area). The data used in this work were collected from four new wells recently drilled, scattered throughout the reservoir of interest as shown in figure-1. The data comprise bulk density, Formation Multi-Tester (FMT) results, and acoustic wave velocities. Eaton's method is the most commonly used relationship worldwide; therefore, this equation has been used to calculate the fracture pressure for all wells, using the dynamic Poisson ratio calculated from the acoustic wave velocities, the FMT results for pore pressure, and the overburden pressure estimated from bulk density. Upon data analysis, it was found that there is a linear relationship between the geo-pressures (pore, fracture, and overburden pressures) and the wave velocity ratio (Vp/Vs). However, the relationship was not clear in the high-pressure area, as shown in figure-10. Therefore, it is recommended to use the resulting relationship together with new seismic data for the shallow carbonate reservoir to predict the geo-pressures for future oil operations. More data can be collected from the high-pressure zone to investigate this area further.
Keywords: bulk density, formation multi-tester (FMT) results, acoustic wave, carbonate shallow reservoir, D/J field velocities
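Eaton's fracture-pressure relation combined with the dynamic Poisson's ratio from Vp/Vs, as used above, is standard; a short sketch of both formulas, with hypothetical values that are not the Dahra F-Area data:

```python
def dynamic_poisson_ratio(vp: float, vs: float) -> float:
    """Dynamic Poisson's ratio from P- and S-wave velocities (same units)."""
    r2 = (vp / vs) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

def eaton_fracture_pressure(overburden: float, pore: float, poisson: float) -> float:
    """Eaton's relation: Pfrac = (nu / (1 - nu)) * (Sv - Pp) + Pp, in any consistent pressure units."""
    return (poisson / (1.0 - poisson)) * (overburden - pore) + pore

# Hypothetical values for a shallow carbonate interval
vp, vs = 4200.0, 2300.0    # m/s
sv, pp = 5200.0, 3100.0    # overburden and pore pressure, psi
nu = dynamic_poisson_ratio(vp, vs)
print(f"Poisson ratio = {nu:.3f}, fracture pressure = {eaton_fracture_pressure(sv, pp, nu):.0f} psi")
```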
Procedia PDF Downloads 287