Search results for: intelligent systems patient monitoring
2831 Comparative Study of IC and Perturb and Observe Method of MPPT Algorithm for Grid Connected PV Module
Authors: Arvind Kumar, Manoj Kumar, Dattatraya H. Nagaraj, Amanpreet Singh, Jayanthi Prattapati
Abstract:
The purpose of this paper is to study and compare two maximum power point tracking (MPPT) algorithms in a photovoltaic simulation system, presenting a simulation study of MPPT for photovoltaic systems using the perturb and observe algorithm and the incremental conductance algorithm. MPPT plays an important role in photovoltaic systems because it maximizes the power output from a PV system for a given set of conditions, thereby maximizing array efficiency and minimizing the overall system cost. Since the maximum power point (MPP) varies with irradiation and cell temperature, appropriate algorithms must be utilized to track the MPP and maintain the operation of the system at it. MATLAB/Simulink is used to establish a model of the photovoltaic system with the MPPT function. This system is developed by combining the established models of the solar PV module and the DC-DC boost converter. The system is simulated under different climate conditions. Simulation results show that the photovoltaic simulation system can track the maximum power point accurately.
Keywords: Incremental conductance algorithm, perturb and observe algorithm, photovoltaic system, simulation results.
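As a minimal illustration of the perturb and observe logic summarized above (a sketch only, not the authors' MATLAB/Simulink implementation; the step size and the measure_power function are assumptions), the reference voltage is perturbed by a fixed step and the direction is kept or reversed according to the observed change in power:

# Minimal perturb-and-observe MPPT sketch (illustrative only).
# measure_power() is a hypothetical function returning PV power at a given reference voltage.
def perturb_and_observe(v_ref, v_prev, p_prev, measure_power, step=0.5):
    """Return (next reference voltage, current voltage, current power)."""
    p = measure_power(v_ref)          # observe power at the current operating point
    dp = p - p_prev
    dv = v_ref - v_prev
    if dp == 0:
        v_next = v_ref                # at the MPP, hold the operating point
    elif (dp > 0) == (dv > 0):
        v_next = v_ref + step         # still climbing toward the MPP, keep direction
    else:
        v_next = v_ref - step         # moved past the MPP, reverse direction
    return v_next, v_ref, p

In a real converter loop this update would run at a fixed sampling period, with the returned reference voltage driving the duty cycle of the DC-DC boost converter.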
2830 A Fuzzy TOPSIS Based Model for Safety Risk Assessment of Operational Flight Data
Authors: N. Borjalilu, P. Rabiei, A. Enjoo
Abstract:
A Flight Data Monitoring (FDM) program assists an operator in the aviation industry to identify, quantify, assess and address operational safety risks in order to improve the safety of flight operations. FDM is a powerful tool for an aircraft operator when integrated into the operator’s Safety Management System (SMS), allowing it to detect, confirm and assess safety issues, and to check the effectiveness of corrective actions associated with human errors. This article proposes a model for assessing the safety risk level of flight data across different event categories based on fuzzy set values. It permits evaluation of the operational safety level from the point of view of flight activities. The main advantage of this method is that it provides a qualitative safety analysis of flight data. This research applies the opinions of aviation experts, collected through a number of questionnaires related to flight data, in four categories of occurrences that can take place during an accident or an incident: Runway Excursions (RE), Controlled Flight Into Terrain (CFIT), Mid-Air Collision (MAC), and Loss of Control in Flight (LOC-I). By weighting each category (by F-TOPSIS) and applying the weight to the number of risks of the event, the safety risk of each related event can be obtained.
Keywords: F-TOPSIS, fuzzy set, FDM, flight safety.
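A crisp TOPSIS closeness-coefficient calculation conveys the weighting step described above; the fuzzy version used in the paper replaces the crisp ratings with fuzzy numbers, and the decision matrix and criterion weights below are purely hypothetical:

import numpy as np
# Illustrative crisp TOPSIS ranking of the four event categories (rows)
# against hypothetical risk criteria (columns); all numbers are made up.
X = np.array([[7.0, 8.0, 6.0],   # RE
              [9.0, 6.0, 8.0],   # CFIT
              [8.0, 9.0, 7.0],   # MAC
              [6.0, 7.0, 9.0]])  # LOC-I
w = np.array([0.5, 0.3, 0.2])    # criterion weights (assumed)
R = X / np.linalg.norm(X, axis=0)             # vector-normalize each criterion column
V = R * w                                     # weighted normalized matrix
v_pos, v_neg = V.max(axis=0), V.min(axis=0)   # ideal and anti-ideal points (all criteria treated as "larger = riskier")
d_pos = np.linalg.norm(V - v_pos, axis=1)
d_neg = np.linalg.norm(V - v_neg, axis=1)
closeness = d_neg / (d_pos + d_neg)           # relative closeness, usable as the category weight
print(dict(zip(["RE", "CFIT", "MAC", "LOC-I"], closeness.round(3))))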
2829 Perceptions toward Adopting Virtual Reality as a Learning Aid in Information Technology
Authors: S. Alfalah, J. Falah, T. Alfalah, M. Elfalah, O. Falah
Abstract:
The field of education is an ever-evolving area constantly enriched by newly discovered techniques provided by active research in all areas of technology. Recent years have witnessed the introduction of a number of promising technologies and applications to enhance the teaching and learning experience. Virtual Reality (VR) applications are considered one of the evolving methods that have contributed to enhancing education in many fields. VR creates an artificial environment, using computer hardware and software, which is similar to the real world. This simulation provides a solution to improve the delivery of materials, which facilitates the teaching process by providing a useful aid to instructors, and enhances the learning experience by providing a beneficial learning aid. In order to ensure future utilization of such systems, students’ perceptions toward utilizing VR as an educational tool were examined in the Faculty of Information Technology (IT) at the University of Jordan. A questionnaire was administered to IT undergraduates investigating students’ opinions about the potential opportunities that VR technology could offer and its implications as a learning and teaching aid. The results confirmed the end users’ willingness to adopt VR systems as a learning aid. The results of this research form a solid base for investing in a VR system for IT education.
Keywords: Education, information, technology, virtual reality.
2828 Optical Fish Tracking in Fishways using Neural Networks
Authors: Alvaro Rodriguez, Maria Bermudez, Juan R. Rabuñal, Jeronimo Puertas
Abstract:
One of the main issues in Computer Vision is to extract the movement of one or several points or objects of interest in an image or video sequence to conduct any kind of study or control process. Different techniques to solve this problem have been applied in numerous areas such as surveillance systems, analysis of traffic, motion capture, image compression, navigation systems and others, where the specific characteristics of each scenario determine the approach to the problem. This paper puts forward a Computer Vision based algorithm to analyze fish trajectories in high turbulence conditions in artificial structures called vertical slot fishways, designed to allow the upstream migration of fish through obstructions in rivers. The suggested algorithm calculates the position of the fish at every instant starting from images recorded with a camera, using neural networks to perform fish detection in the images. Different laboratory tests have been carried out in a full-scale fishway model with live fish, allowing the reconstruction of the fish trajectory and the measurement of velocities and accelerations of the fish. These data can provide useful information to design more effective vertical slot fishways.
Keywords: Computer Vision, Neural Network, Fishway, Fish Trajectory, Tracking
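Once the network has returned a fish position for each frame, the velocities and accelerations mentioned above can be recovered by finite differences over the sampled trajectory; a minimal sketch (the frame rate and the synthetic track are assumptions) is:

import numpy as np

def kinematics_from_track(positions, fps):
    """positions: (N, 2) array of detected (x, y) fish centroids, one per frame."""
    dt = 1.0 / fps
    velocity = np.gradient(positions, dt, axis=0)       # central differences; m/s if positions are in metres
    acceleration = np.gradient(velocity, dt, axis=0)    # m/s^2
    speed = np.linalg.norm(velocity, axis=1)
    return velocity, acceleration, speed

# Hypothetical 25 fps track of a fish crossing a slot (random-walk placeholder data).
track = np.cumsum(np.random.default_rng(0).normal(0, 0.01, size=(100, 2)), axis=0)
vel, acc, speed = kinematics_from_track(track, fps=25)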
2827 Investigation of the Flow Characteristics in a Catalytic Muffler with Perforated Inlet Cone
Authors: Gyo Woo Lee, Man Young Kim
Abstract:
Emission regulations for diesel engines are being strengthened, and it is impossible to meet the standards without exhaust after-treatment systems. The lack of space in many diesel vehicles, however, makes it difficult to design and install stand-alone catalytic converters such as DOC, DPF, and SCR in the vehicle exhaust systems. Accordingly, these devices have been installed inside the muffler to save space, in a configuration referred to as the catalytic muffler. Such a muffler, however, has a complex internal structure, with a perforated plate and pipe for noise attenuation and a monolithic catalyst for emission reduction. For this reason, flow uniformity and pressure drop, which affect catalyst efficiency and engine performance, respectively, should be examined when the catalytic muffler is designed. In this work, therefore, the flow uniformity and pressure drop have been numerically investigated, with the aim of improving the performance of the catalytic converter and the engine, by changing various design parameters such as the inlet shape, porosity, and outlet shape of the muffler, using three-dimensional simulations of the incompressible, non-reacting, steady turbulent flow inside the catalytic muffler. It is found that the configuration with a perforated pipe inside the inlet part has a higher uniformity index and lower pressure drop than the others considered in this work.
Keywords: Catalytic muffler, Perforated inlet cone, Catalysts, Perforated pipe, Flow uniformity, Pressure drop.
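For reference, a commonly used definition of the flow uniformity index on the catalyst inlet face (assumed here, since the abstract does not restate its formula) can be evaluated as follows:

import numpy as np

def uniformity_index(velocities):
    """Commonly used uniformity index on a monolith cross-section:
    gamma = 1 - sum(|v_i - v_mean|) / (2 * n * v_mean); gamma = 1 means perfectly uniform flow."""
    v = np.asarray(velocities, dtype=float)
    v_mean = v.mean()
    return 1.0 - np.abs(v - v_mean).sum() / (2.0 * v.size * v_mean)

# Hypothetical axial velocities (m/s) sampled on the catalyst inlet face.
print(uniformity_index([3.1, 3.4, 2.9, 3.0, 3.6, 2.8]))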
2826 Protein Profiling in Alanine Aminotransferase Induced Patient cohort using Acetaminophen
Authors: Gry M, Bergström J, Lengquist J, Lindberg J, Drobin K, Schwenk J, Nilsson P, Schuppe-Koistinen I.
Abstract:
Sensitive and predictive DILI (Drug-Induced Liver Injury) biomarkers are needed in drug R&D to improve early detection of hepatotoxicity. The discovery of DILI biomarkers that demonstrate the predictive power to identify individuals at risk of DILI would represent a major advance in the development of personalized healthcare approaches. In this healthy-volunteer acetaminophen study (4 g/day for 7 days, with 3 monitored non-treatment days before and 4 after), 450 serum samples from 32 subjects were analyzed using protein profiling by antibody suspension bead arrays. Multiparallel protein profiles were generated using a DILI target protein array with 300 antibodies, where the antibodies were selected based on previous literature findings of putative DILI biomarkers and a screening process using pre-dose samples from the same cohort. Of the 32 subjects, 16 were found to develop an elevated ALT value (2× baseline; responders). Using the plasma profiling approach together with multivariate statistical analysis, some novel findings linked to lipid metabolism were made and, more importantly, endogenous protein profiles in baseline samples (prior to treatment) with predictive power for ALT elevations were identified.
Keywords: DILI, plasma profiling, PLS-DA, random forest.
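One of the multivariate tools named above, a random forest separating ALT responders from non-responders on baseline bead-array profiles, can be sketched as follows; the array shapes, hyperparameters and synthetic data are assumptions, not the study's actual settings:

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(32, 300))        # 32 subjects x 300 antibody signals (baseline samples), synthetic
y = np.array([1] * 16 + [0] * 16)     # 16 ALT responders, 16 non-responders

clf = RandomForestClassifier(n_estimators=500, random_state=0)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")   # cross-validated discrimination of responders
print("cross-validated AUC:", auc.mean().round(2))

clf.fit(X, y)
top_antibodies = np.argsort(clf.feature_importances_)[::-1][:10]   # candidate predictive protein signals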
2825 The Management in Large Emergency Situations – A Best Practice Case Study Based on GIS for Management of Evacuation
Authors: Ion Baş, Claudiu Zoicaş, Angela Ioniţâ
Abstract:
In most cases, natural disasters make it necessary to evacuate people. The quality of evacuation management is dramatically improved by the use of information provided by decision support systems, which become indispensable in large-scale evacuation operations. This paper presents a best-practice case study. In November 2007, officers from the Emergency Situations Inspectorate “Crisana” of Bihor County, Romania, participated in a cross-border evacuation exercise in which 700 people were evacuated from the Netherlands to Belgium. One of the main objectives of the exercise was to test four different decision support systems. Afterwards, based on that experience, a software system called TEVAC (Trans-Border Evacuation) was developed in-house by the experts of this institution. This original software system was successfully tested in September 2008 during the international exercise EU-HUROMEX 2008, in a scenario involving the real evacuation of 200 persons from Hungary to Romania. Based on the lessons learned and the results, since April 2009 the TEVAC software has been used by all Emergency Situations Inspectorates across Romania.
Keywords: Emergency evacuation, searching features, TEVAC (Trans-Border Evacuation) software system, user interface design.
2824 A Six-Year Case Study Evaluating the Stakeholders’ Requirements and Satisfaction in Higher Educational Establishments
Authors: Ioannis I. Angeli
Abstract:
Worldwide, and mainly in the European Union, many standards, regulations, models and systems exist for the evaluation and identification of stakeholders’ requirements of individual universities and of higher education (HE) in general. All of these systems aim to measure or evaluate the universities’ quality assurance systems and the services offered to the recipients of HE, mainly the students. Numerous surveys have been conducted in the past, either by individual universities or by organized bodies, to identify students’ satisfaction or to evaluate to what extent these requirements are fulfilled. In this paper, the main results of an ongoing six-year joint research project are presented briefly. This research deals with an in-depth investigation of students’ satisfaction and students’ personal requirements, a gap analysis between these two parameters, and a comparison of different universities. Through this research, an attempt is made to address four very important questions in higher educational establishments (HEEs): (1) Are there any common requirements, parameters, good practices or questions that apply to a large number of universities and that will assure that students’ requirements are fulfilled? (2) To what extent do the individual programs of HEEs fulfil the requirements of the stakeholders? (3) Are there any similarities in specific programs among European HEEs? (4) To what extent is the knowledge acquired in a specific course program utilized in a specific country? For the execution of the research, internationally accepted questionnaires were used to evaluate to what extent the students’ requirements and satisfaction were fulfilled in 2012 and five years later (2017). Samples of students and/or universities were taken from many European universities. The questionnaires used, the sampling method and methodology adopted, as well as the comparison tables and results, will be very valuable to any university that is willing to follow the same route and methodology or compare the results with its own HEE. Apart from the unique methodology, valuable results are demonstrated from the four case studies. There is a great difference between students’ expectations, or the importance they assign, and what they are getting from their universities (in all parameters they are getting less). When there is a crisis or budget cut in HEEs, there is a direct impact on students. There are many differences in the subjects taught across European universities.
Keywords: Quality in higher education, students’ requirements, education standards, student’s survey, stakeholder’s requirements, Mechanical Engineering courses.
2823 Computational Study of Blood Flow Analysis for Coronary Artery Disease
Authors: Radhe Tado, Ashish B. Deoghare, K. M. Pandey
Abstract:
The aim of this study is to estimate the effect of blood flow through the coronary artery in the human heart so as to assess coronary artery disease. Velocity, wall shear stress (WSS), strain rate and wall pressure distribution are some of the important hemodynamic parameters that are non-invasively assessed with computational fluid dynamics (CFD). These parameters are used to identify the mechanical factors responsible for plaque progression and/or rupture in the left coronary artery (LCA). The initial step for the CFD simulations was the construction of a geometrical model of the LCA. A patient-specific artery model is constructed using computed tomography (CT) scan data with the help of MIMICS Research 19.0. ANSYS FLUENT 14.5 is used for the CFD analysis. Hemodynamic parameters were quantified and flow patterns were visualized both in the absence and in the presence of coronary plaques. The wall pressure continuously decreased towards the distal segments and showed pressure drops in stenotic segments. Areas of high WSS and high flow velocities were found adjacent to plaque deposition.
Keywords: Computational fluid dynamics, hemodynamics, velocity, strain rate, wall pressure, wall shear stress.
2822 Simulation of PM10 Source Apportionment at An Urban Site in Southern Taiwan by a Gaussian Trajectory Model
Authors: Chien-Lung Chen, Jeng-Lin Tsai, Feng-Chao Chung, Su-Ching Kuo, Kuo-Hsin Tseng, Pei-Hsuan Kuo, Li-Ying Hsieh, Ying I. Tsai
Abstract:
This study applied the Gaussian trajectory transfer-coefficient model (GTx) to simulate particulate matter concentrations and source apportionments at the Nanzih Air Quality Monitoring Station in southern Taiwan from November 2007 to February 2008. The correlation coefficient between the observed and the calculated daily PM10 concentrations is 0.5 and the absolute bias of the PM10 concentrations is 24%. The simulated PM10 concentrations matched well with the observed data. Although the emission rate of PM10 was dominated by area sources (58%), the results of the source apportionment indicated that the primary sources for PM10 at Nanzih Station were point sources (42%), area sources (20%) and the upwind boundary concentration (14%). The most obvious difference in PM10 source apportionment between episode and non-episode days was the upwind boundary concentration, which contributed 20% and 11% of PM10, respectively. The gas-particle conversion of secondary aerosol and long-range transport played crucial roles in the PM10 contribution at a receptor.
Keywords: back trajectory model, particulate matter, source apportionment
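The two performance figures quoted above follow from standard definitions of the correlation coefficient and a mean absolute bias; a sketch with placeholder data (the exact bias formula used by the authors is an assumption) is:

import numpy as np

def evaluate(observed, simulated):
    obs, sim = np.asarray(observed, float), np.asarray(simulated, float)
    r = np.corrcoef(obs, sim)[0, 1]                          # correlation coefficient
    abs_bias = np.abs(sim - obs).mean() / obs.mean() * 100   # mean absolute bias as % of the observed mean (assumed definition)
    return r, abs_bias

obs = [85, 120, 95, 150, 110]     # hypothetical daily PM10, ug/m3
sim = [70, 140, 100, 120, 125]
print(evaluate(obs, sim))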
2821 Resilient Machine Learning in the Nuclear Industry: Crack Detection as a Case Study
Authors: Anita Khadka, Gregory Epiphaniou, Carsten Maple
Abstract:
There is a dramatic surge in the adoption of Machine Learning (ML) techniques in many areas, including the nuclear industry (such as fault diagnosis and fuel management in nuclear power plants), autonomous systems (including self-driving vehicles), space systems (space debris recovery, for example), medical surgery, network intrusion detection, malware detection, to name a few. Artificial Intelligence (AI) has become a part of everyday modern human life. To date, the predominant focus has been developing underpinning ML algorithms that can improve accuracy, while factors such as resiliency and robustness of algorithms have been largely overlooked. If an adversarial attack is able to compromise the learning method or data, the consequences can be fatal, especially but not exclusively in safety-critical applications. In this paper, we present an in-depth analysis of five adversarial attacks and two defence methods on a crack detection ML model. Our analysis shows that it can be dangerous to adopt ML techniques without rigorous testing, since they may be vulnerable to adversarial attacks, especially in security-critical areas such as the nuclear industry. We observed that while the adopted defence methods can effectively defend against different attacks, none of them could protect against all five adversarial attacks entirely.
Keywords: Resilient Machine Learning, attacks, defences, nuclear industry, crack detection.
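As an example of the kind of evasion attack analyzed (the abstract does not list the five attacks here, so the Fast Gradient Sign Method below is illustrative only, with an assumed PyTorch classifier), a bounded perturbation is added in the direction of the loss gradient with respect to the crack image:

import torch
import torch.nn.functional as F

def fgsm_attack(model, image, label, epsilon=0.03):
    """Fast Gradient Sign Method: one illustrative evasion attack on a crack-detection classifier.
    image: (N, C, H, W) tensor with pixels in [0, 1]; label: (N,) tensor of class indices."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    perturbed = image + epsilon * image.grad.sign()   # step in the direction that increases the loss
    return perturbed.clamp(0.0, 1.0).detach()         # keep pixels in the valid range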
2820 Automation System for Optimization of Electrical and Thermal Energy Production in Cogenerative Gas Power Plants
Authors: Ion Miciu
Abstract:
The system is built from distributed components on three main levels. First level: industrial computers placed in the control room, which monitor the thermal and electrical processes based on the data provided by the second level. Second level: PLCs, which collect data from the process and transmit information to the first level; they also take commands from that level, which are then passed on to the execution elements on the third level. Third level: field elements consisting of three categories: data collecting elements; data transfer elements from the third level to the second; and execution elements, which take commands from the second-level PLCs, execute them, and then transmit confirmation of execution back to them. The purpose of the automatic operation is the optimization of the delivery of cogenerated electrical energy into the national energy system and of thermal energy to the consumers. The integrated system treats the functioning of all the equipment and devices as a whole: Gas Turbine Units (GTU); MT 20 kV Medium Voltage Station (MVS); 0.4 kV Low Voltage Station (LVS); Main Hot Water Boilers (MHW); Auxiliary Hot Water Boilers (AHW); Gas Compressor Unit (GCU); Thermal Agent Circulation Pumping Unit (TPU); Water Treating Station (WTS).
Keywords: Automation system, cogenerative power plant, control, monitoring, real time
2819 The Techno-Economic and Environmental Assessments of Grid-Connected Photovoltaic Systems in Bhubaneswar, India
Authors: A. K. Pradhan, M. K. Mohanty, S. K. Kar
Abstract:
Power system utilities have started to consider green power technology in order to achieve an eco-friendly environment. Green power technology utilizes renewable energy sources to reduce GHG emissions. Odisha state (India) is very rich in renewable energy potential, especially solar energy (about 300 solar days), for the installation of grid-connected photovoltaic systems. This paper focuses on the utilization of photovoltaic systems in an institute building in Bhubaneswar city, Odisha. Data such as solar insolation (kWh/m2/day) and sunshine duration have been collected from meteorological stations for Bhubaneswar city. The required electrical power and cost are calculated for a daily load of 1.0 kW. The HOMER (Hybrid Optimization Model for Electric Renewables) software is used to estimate the system size and analyze its performance. The simulation results show that the cost of energy (COE) is $0.194/kWh, the operating cost is $63/yr and the net present cost (NPC) is $3,917. The energy produced from the PV array is 1,756 kWh/yr and the energy purchased from the grid is 410 kWh/yr. The AC primary load consumption is 1,314 kWh/yr and the grid sales are 746 kWh/yr. One battery is connected in parallel with the 12 V DC bus, with a usable nominal capacity of 2.4 kWh and an autonomy of 9.6 h.
Keywords: Economic assessment, HOMER, Optimization, Photovoltaic (PV), Renewable energy.
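The reported economics follow the standard relations between annualized cost, net present cost and cost of energy used in HOMER-style analyses; a back-of-the-envelope sketch (the discount rate, project lifetime and cost inputs are assumptions, not the study's inputs) is:

def crf(i, n):
    """Capital recovery factor for real discount rate i and project lifetime n years."""
    return i * (1 + i) ** n / ((1 + i) ** n - 1)

def pv_economics(total_annualized_cost, energy_served_kwh, i=0.06, n=25):
    npc = total_annualized_cost / crf(i, n)          # net present cost, $
    coe = total_annualized_cost / energy_served_kwh  # levelized cost of energy, $/kWh
    return npc, coe

# Hypothetical round numbers, not the study's actual cost and energy figures:
npc, coe = pv_economics(total_annualized_cost=300.0, energy_served_kwh=1600.0)
print(round(npc), round(coe, 3))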
2818 Defining a Pathway to Zero Energy Building: A Case Study on Retrofitting an Old Office Building into a Net Zero Energy Building for Hot-Humid Climate
Authors: Kwame B. O. Amoah
Abstract:
This paper focuses on retrofitting an old existing office building to a net-zero energy building (NZEB). An existing small office building in Melbourne, Florida, was chosen as a case study to integrate state-of-the-art design strategies and energy-efficient building systems to improve building performance and reduce energy consumption. The study aimed to explore possible ways to maximize energy savings and renewable energy generation sources to cover the building's remaining energy needs necessary to achieve net-zero energy goals. A series of retrofit options were reviewed and adopted with some significant additional decision considerations. Detailed processes and considerations leading to zero energy are well documented in this study, with lessons learned adequately outlined. Based on building energy simulations, multiple design considerations were investigated, such as emerging state-of-the-art technologies, material selection, improvements to the building envelope, optimization of the HVAC, lighting systems, and occupancy loads analysis, as well as the application of renewable energy sources. The comparative analysis of simulation results was used to determine how specific techniques led to energy saving and cost reductions. The research results indicate that this small office building can meet net-zero energy use after appropriate design manipulations and renewable energy sources.
Keywords: Energy consumption, building energy analysis, energy retrofits, energy-efficiency.
2817 Exergetic and Sustainability Evaluation of a Building Heating System in Izmir, Turkey
Authors: Nurdan Yildirim, Arif Hepbasli
Abstract:
Heating, cooling and lighting appliances in buildings account for more than one third of the world’s primary energy demand. Therefore, main components of the building heating systems play an essential role in terms of energy consumption. In this context, efficient energy and exergy utilization in HVAC-R systems has been very essential, especially in developing energy policies towards increasing efficiencies. The main objective of the present study is to assess the performance of a family house with a volume of 326.7 m3 and a net floor area of 121 m2, located in the city of Izmir, Turkey in terms of energetic, exergetic and sustainability aspects. The indoor and exterior air temperatures are taken as 20°C and 1°C, respectively. In the analysis and assessment, various metrics (indices or indicators) such as exergetic efficiency, exergy flexibility ratio and sustainability index are utilized. Two heating options (Case 1: condensing boiler and Case 2: air heat pump) are considered for comparison purposes. The total heat loss rate of the family house is determined to be 3770.72 W. The overall energy efficiencies of the studied cases are calculated to be 49.4% for Case 1 and 54.7% for Case 2. The overall exergy efficiencies, the flexibility factor and the sustainability index of Cases 1 and 2 are computed to be around 3.3%, 0.17 and 1.034, respectively.
Keywords: Buildings, exergy, low exergy, sustainability, efficiency, heating, renewable energy.
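The sustainability index quoted for Case 2 is consistent with the exergy-based relation commonly used in this line of work (assuming the authors apply the same definition):

\[ \mathrm{SI} = \frac{1}{1-\psi}, \qquad \psi \approx 0.033 \;\Rightarrow\; \mathrm{SI} = \frac{1}{1-0.033} \approx 1.034, \]

where \(\psi\) is the overall exergy efficiency of the heating option.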
2816 Reliability Analysis of Press Unit using Vague Set
Authors: S. P. Sharma, Monica Rani
Abstract:
In conventional reliability assessment, the reliability data of system components are treated as crisp values. The collected data have some uncertainties due to errors by human beings/machines or other sources. These uncertainty factors limit the understanding of system component failure because of incomplete data. In these situations, we need to generalize classical methods to a fuzzy environment for studying and analyzing the systems of interest. Fuzzy set theory has been proposed to handle such vagueness by generalizing the notion of membership in a set. Essentially, in a Fuzzy Set (FS) each element is associated with a point value selected from the unit interval [0, 1], which is termed the grade of membership in the set. A Vague Set (VS), as well as an Intuitionistic Fuzzy Set (IFS), is a further generalization of an FS. Instead of using point-based membership as in an FS, interval-based membership is used in a VS. The interval-based membership in a VS is more expressive in capturing the vagueness of data. In the present paper, vague set theory coupled with the conventional Lambda-Tau method is presented for the reliability analysis of repairable systems. The methodology uses Petri nets (PN) to model the system instead of a fault tree because they allow efficient simultaneous generation of minimal cut and path sets. The presented method is illustrated with the press unit of a paper mill.
Keywords: Lambda-Tau methodology, Petri nets, repairable system, vague fuzzy set.
2815 Analysis of Thermal Damping in Si Based Torsional Micromirrors
Authors: R. Resmi, M. R. Baiju
Abstract:
The thermal damping of a dynamically vibrating micromirror is an important factor affecting the design of MEMS-based actuator systems. In the development process of new micromirror systems, accurately assessing the extent of energy loss due to thermal damping and predicting the performance of the system are essential. In this paper, the depth of the thermal penetration layer at different eigenfrequencies and the temperature variation distributions surrounding a vibrating micromirror are analyzed. The thermal penetration depth, which corresponds to the thermal boundary layer in which energy is lost and which is therefore a measure of the thermal damping, is determined. The energy is mainly dissipated in the thermal boundary layer, and the thickness of this layer is an important parameter. Detailed thermoacoustics is used to model the air domain surrounding the micromirror. The thickness of the boundary layer, the temperature variations and the thermal power dissipation are analyzed for a Si-based torsional-mode micromirror. It is found that the thermal penetration depth decreases with eigenfrequency, and hence operating the micromirror at higher frequencies is essential for reducing thermal damping. The temperature variations and thermal power dissipation at different eigenfrequencies are also analyzed. Both frequency-response and eigenfrequency analyses are done using the COMSOL Multiphysics software.
Keywords: Eigen frequency analysis, micromirrors, thermal damping, thermoacoustic interactions.
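The frequency dependence noted above follows from the standard expression for the thermal penetration (boundary-layer) depth of a gas oscillating at angular frequency \(\omega\) (the usual thermoacoustic definition, assumed to match the one used in the paper):

\[ \delta_{\mathrm{th}} = \sqrt{\frac{2\alpha}{\omega}} = \sqrt{\frac{2k}{\rho c_p \omega}}, \]

where \(k\), \(\rho\) and \(c_p\) are the thermal conductivity, density and specific heat of the surrounding air; since \(\delta_{\mathrm{th}} \propto \omega^{-1/2}\), the dissipative layer thins, and thermal damping falls, as the eigenfrequency increases.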
2814 Discrete Polyphase Matched Filtering-based Soft Timing Estimation for Mobile Wireless Systems
Authors: Thomas O. Olwal, Michael A. van Wyk, Barend J. van Wyk
Abstract:
In this paper, we present a soft timing phase estimation (STPE) method for wireless mobile receivers operating at low signal-to-noise ratios (SNRs). Discrete Polyphase Matched (DPM) filters, a Log-maximum a posteriori probability (MAP) algorithm and/or a Soft-Output Viterbi Algorithm (SOVA) are combined to derive a new timing recovery (TR) scheme. We apply this scheme to a wireless cellular communication system model that comprises a raised cosine filter (RCF) and a bit-interleaved turbo-coded multi-level modulation (BITMM) scheme, and the channel is assumed to be memoryless. Furthermore, no clock signals are transmitted to the receiver, contrary to the classical data-aided (DA) models. This new model ensures that both the bandwidth and the power of the communication system are conserved. However, the computational complexity of ideal turbo synchronization is increased by 50%. Several simulation tests on bit error rate (BER) and block error rate (BLER) versus low SNR reveal that the proposed iterative soft timing recovery (ISTR) scheme outperforms the conventional schemes.
Keywords: discrete polyphase matched filters, maximum likelihood estimators, soft timing phase estimation, wireless mobile systems.
2813 Analysis of Event-related Response in Human Visual Cortex with fMRI
Authors: Ayesha Zaman, Tanvir Atahary, Shahida Rafiq
Abstract:
Functional Magnetic Resonance Imaging (fMRI) is a non-invasive imaging technique that measures the hemodynamic response related to neural activity in the human brain. Event-related functional magnetic resonance imaging (efMRI) is a form of fMRI in which a series of fMRI images are time-locked to a stimulus presentation and averaged together over many trials. An event-related potential (ERP), in turn, is a measured brain response that is directly the result of a thought or perception. Here, the neuronal response of the human visual cortex in normal healthy patients has been studied. The patients were asked to perform a visual three-choice reaction task; from the relative response of each patient, the corresponding neuronal activity in the visual cortex was imaged. The average number of neurons in the adult human primary visual cortex, in each hemisphere, has been estimated at around 140 million. Statistical analysis of this experiment was done with the SPM5 (Statistical Parametric Mapping version 5) software. The results show a robust design for imaging the neuronal activity of the human visual cortex.
Keywords: Echo planar imaging, event-related response, general linear model, visual neuronal response.
2812 Curriculum Development of Successful Intelligence Promoting for Nursing Students
Authors: Saranya Chularee, Tawa Chularee
Abstract:
Successful intelligence (SI) is the integrated set of abilities needed to attain success in life, within an individual's sociocultural context. People are successfully intelligent by recognizing their strengths and weaknesses; they find ways to strengthen their weaknesses and to maintain or even improve their strengths. SI people can shape, select, and adapt to environments by using a balance of higher-order thinking abilities, including critical, creative, and applicative thinking. Aims: The purposes of this study were to: 1) develop a curriculum that promotes SI for nursing students, and 2) study the effectiveness of the developed curriculum. Method: Research and Development was the method used for this study. The design was divided into two phases: 1) curriculum development, which was composed of three steps (needs assessment, curriculum development and curriculum field trial), and 2) curriculum implementation. In this phase, a pre-experimental research design (one-group pretest-posttest design) was conducted. The sample was composed of 49 sophomore nursing students of Boromarajonani College of Nursing, Surin, Thailand, who enrolled in the Nursing Care of Health Problem I course in the 2011 academic year. Data were carefully collected using 4 instruments: 1) a modified essay questions test (MEQ), 2) a nursing care plan evaluation form, 3) a group processing observation form (α = 0.74), and 4) a satisfaction evaluation form of learning (α = 0.82). Data were analyzed using descriptive statistics and content analysis. Results: The results revealed that the sample had a post-test average SI score higher than the pre-test average score (the mean difference was 5.03, S.D. = 2.84). Fifty-seven percent of the sample passed the MEQ post-test at the criterion of 60 percent. Students demonstrated strategies for how to develop a nursing care plan. Overall, students' satisfaction with the teaching performance was at a high level (mean = 4.35, S.D. = 0.46). Conclusion: This curriculum can promote the characteristics of a successfully intelligent person, and its continuation was highly required.
Keywords: Curriculum development, nursing education, successful intelligence, thinking ability.
2811 Highly Accurate Target Motion Compensation Using Entropy Function Minimization
Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani
Abstract:
One of the defects of stepped-frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range cell shift, false peaks, Signal-to-Noise Ratio (SNR) reduction and range profile spreading, because the power spectrum of each range cell interferes with adjacent range cells, which induces distortion in the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Thus, compensation for Target Motion Parameter (TMP) effects should be employed. In this paper, such a method for estimating the TMPs (velocity and acceleration) and consequently eliminating or suppressing their unwanted effects on the HRRP, based on entropy minimization, is proposed. This method is carried out in two major steps: in the first step, a discrete search over the whole acceleration-velocity lattice, within a specific interval, seeks a less accurate minimum point of the entropy function. In the second step, a 1-D search over velocity is done in the locus of the minimum for several constant-acceleration lines, in order to enhance the accuracy of the minimum point found in the first step. The provided simulation results demonstrate the effectiveness of the proposed method.
Keywords: ATR, HRRP, motion compensation, SFW, TMP.
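The two-step search described above can be sketched as follows; entropy_of_profile stands in for the HRRP entropy evaluated after compensating the stepped-frequency returns with a candidate (velocity, acceleration) pair, and the grid bounds and step counts are assumptions:

import numpy as np

def coarse_then_fine_search(entropy_of_profile, v_range, a_range, coarse_steps=40, fine_steps=200):
    """Step 1: coarse grid over the acceleration-velocity lattice.
       Step 2: 1-D refinement over velocity along constant-acceleration lines near the coarse minimum."""
    v_grid = np.linspace(*v_range, coarse_steps)
    a_grid = np.linspace(*a_range, coarse_steps)
    E = np.array([[entropy_of_profile(v, a) for v in v_grid] for a in a_grid])
    ia, iv = np.unravel_index(np.argmin(E), E.shape)
    v0 = v_grid[iv]
    best = (np.inf, v0, a_grid[ia])
    v_fine = np.linspace(v0 - (v_grid[1] - v_grid[0]), v0 + (v_grid[1] - v_grid[0]), fine_steps)
    for a in a_grid[max(ia - 1, 0):ia + 2]:   # a few constant-acceleration lines around the coarse minimum
        for v in v_fine:
            e = entropy_of_profile(v, a)
            if e < best[0]:
                best = (e, v, a)
    return best[1], best[2]                   # estimated TMPs: velocity, acceleration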
2810 Using Business Intelligence Capabilities to Improve the Quality of Decision-Making: A Case Study of Mellat Bank
Authors: Jalal Haghighat Monfared, Zahra Akbari
Abstract:
Today, business executives need useful information to make better decisions. Banks have also been using information tools so that they can direct the decision-making process and achieve their desired goals by rapidly extracting information from sources with the help of business intelligence. This research investigates whether there is a relationship between the quality of decision making and the business intelligence capabilities of Mellat Bank. Each of the factors studied is divided into several components, and these and their relationships are measured by a questionnaire. The statistical population of this study consists of all managers and experts of Mellat Bank's general departments (190 people) who use business intelligence reports. The sample size of this study was 123, determined randomly by statistical methods. In this research, relevant statistical inference has been used for data analysis and hypothesis testing. In the first stage, using the Kolmogorov-Smirnov test, the normality of the data was investigated, and in the next stage, the construct validity of both variables and their resulting indexes was verified using confirmatory factor analysis. Finally, using structural equation modeling and Pearson's correlation coefficient, the research hypotheses were tested. The results confirmed the existence of a positive relationship between decision quality and business intelligence capabilities in Mellat Bank. Among the various capabilities, including data quality, correlation with other systems, user access, flexibility and risk management support, the flexibility of the business intelligence system was most strongly correlated with the dependent variable of the present research. This shows that it is necessary for Mellat Bank to pay more attention to choosing business intelligence systems with high flexibility in terms of the ability to produce custom-formatted reports. After flexibility, the quality of data in business intelligence systems showed the strongest relationship with the quality of decision making. Therefore, improving the quality of data, including the source of the data (internal or external), the type of data (quantitative or qualitative), the credibility of the data, and the perceptions of those who use the business intelligence system, improves the quality of decision making in Mellat Bank.
Keywords: Business intelligence, business intelligence capability, decision making, decision quality.
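The first statistical steps mentioned above, a Kolmogorov-Smirnov normality check followed by a Pearson correlation between the two constructs, can be sketched with standard library calls; the composite scores below are synthetic placeholders for the questionnaire data:

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
bi_capability = rng.normal(3.8, 0.6, size=123)                   # synthetic composite scores (123 respondents)
decision_quality = 0.6 * bi_capability + rng.normal(0, 0.5, 123)

# Kolmogorov-Smirnov test of the standardized scores against the standard normal (normality check)
z = (bi_capability - bi_capability.mean()) / bi_capability.std(ddof=1)
print(stats.kstest(z, "norm"))

# Pearson correlation between BI capability and decision quality
print(stats.pearsonr(bi_capability, decision_quality))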
2809 Fundamental Theory of the Evolution Force: Gene Engineering utilizing Synthetic Evolution Artificial Intelligence
Authors: L. K. Davis
Abstract:
The effects of the evolution force are observable in nature at all structural levels ranging from small molecular systems to conversely enormous biospheric systems. However, the evolution force and work associated with formation of biological structures has yet to be described mathematically or theoretically. In addressing the conundrum, we consider evolution from a unique perspective and in doing so we introduce the “Fundamental Theory of the Evolution Force: FTEF”. We utilized synthetic evolution artificial intelligence (SYN-AI) to identify genomic building blocks and to engineer 14-3-3 ζ docking proteins by transforming gene sequences into time-based DNA codes derived from protein hierarchical structural levels. The aforementioned served as templates for random DNA hybridizations and genetic assembly. The application of hierarchical DNA codes allowed us to fast forward evolution, while dampening the effect of point mutations. Natural selection was performed at each hierarchical structural level and mutations screened using Blosum 80 mutation frequency-based algorithms. Notably, SYN-AI engineered a set of three architecturally conserved docking proteins that retained motion and vibrational dynamics of native Bos taurus 14-3-3 ζ.
Keywords: 14-3-3 docking genes, synthetic protein design, time based DNA codes, writing DNA code from scratch.
2808 A General Framework for Knowledge Discovery Using High Performance Machine Learning Algorithms
Authors: S. Nandagopalan, N. Pradeep
Abstract:
The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as the active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks like clustering and classification with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than the results reported previously.
Keywords: Active contour, Bayesian, echocardiographic image, feature vector.
2807 MPSO based Model Order Formulation Technique for SISO Continuous Systems
Authors: S. N. Deepa, G. Sugumaran
Abstract:
This paper proposes a new version of Particle Swarm Optimization (PSO), namely Modified PSO (MPSO), for model order formulation of Single Input Single Output (SISO) linear time-invariant continuous systems. In the general PSO, the movement of a particle is governed by three behaviors, namely inertia, cognitive and social. The cognitive behavior helps the particle to remember its previously visited best position. The Modified PSO technique splits the cognitive behavior into two parts: the previously visited best position and the previously visited worst position. This modification helps the particle to search for the target very effectively. The MPSO approach is proposed to formulate the reduced-order model from the higher-order model. The method is based on the minimization of the error between the transient responses of the original higher-order model and the reduced-order model for a unit step input. The results obtained are compared with earlier techniques to validate its ease of computation. The proposed method is illustrated through a numerical example from the literature.
Keywords: Continuous system, model order formulation, modified particle swarm optimization, single input single output, transfer function approach
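A sketch of the velocity update implied by this modification is given below; the exact sign and weighting of the "worst position" term are assumptions, since the abstract does not reproduce the formula:

import numpy as np

def mpso_velocity(v, x, p_best, p_worst, g_best, w=0.7, c1=1.5, c1w=1.5, c2=1.5, rng=None):
    """Modified PSO update: inertia + attraction to personal best + repulsion from personal worst + social term."""
    if rng is None:
        rng = np.random.default_rng()
    r1, r1w, r2 = rng.random(3)
    return (w * v
            + c1 * r1 * (p_best - x)      # remember the best position visited
            + c1w * r1w * (x - p_worst)   # move away from the worst position visited (assumed form)
            + c2 * r2 * (g_best - x))     # social attraction toward the swarm best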
2806 Testing Loaded Programs Using Fault Injection Technique
Authors: S. Manaseer, F. A. Masooud, A. A. Sharieh
Abstract:
Fault tolerance is critical in many of today's large computer systems. This paper focuses on improving fault tolerance through testing. Moreover, it concentrates on memory faults: how to access the editable part of a process memory space and how this part is affected. A special Software Fault Injection Technique (SFIT) is proposed for this purpose. This is done by sequentially scanning the memory of the target process and trying to edit the maximum number of bytes inside that memory. The technique was implemented and tested on a group of programs in software packages such as jet-audio, Notepad, Microsoft Word, Microsoft Excel, and Microsoft Outlook. The results from the test sample processes indicate that the size of the scanned area depends on several factors. These factors are: process size, process type, and the virtual memory size of the machine under test. The results show that increasing the process size will increase the scanned memory space. They also show that input-output processes have a larger scanned area than other processes. Increasing the virtual memory size will also affect the size of the scanned area, but only up to a certain limit.
Keywords: Complex software systems, error detection, fault tolerance, injection and testing methodology, memory faults, process and virtual memory.
2805 The Hall Coefficient and Magnetoresistance in Rectangular Quantum Wires with Infinitely High Potential under the Influence of a Laser Radiation
Authors: Nguyen Thu Huong, Nguyen Quang Bau
Abstract:
The Hall Coefficient (HC) and the Magnetoresistance (MR) have been studied in two-dimensional systems. In this work, the HC and the MR in a Rectangular Quantum Wire (RQW) subjected to a crossed DC electric field and magnetic field, in the presence of a Strong Electromagnetic Wave (EMW) characterized by its electric field, are studied. Using the quantum kinetic equation for electrons interacting with optical phonons, we obtain analytic expressions for the HC and the MR with a dependence on the magnetic field, the EMW frequency, the temperature of the system and the characteristic length parameters of the RQW. These expressions are different from those obtained for bulk semiconductors and cylindrical quantum wires. The analytical results are applied to GaAs/GaAs/Al. For this material, the MR depends on the ratio of the EMW frequency to the cyclotron frequency. Indeed, the MR reaches a minimum at the ratio 5/4, and when this ratio increases, it tends towards a saturation value. The HC can take negative or positive values. Each curve has one maximum and one minimum. When the magnetic field increases, the HC is negative, reaches a minimum value and then increases suddenly to a maximum with a positive value. This phenomenon differs from the one observed in cylindrical quantum wires, which do not exhibit such maximum and minimum values.
Keywords: Hall coefficient, rectangular quantum wires, electron-optical phonon interaction, quantum kinetic equation.
2804 Screening Post-Menopausal Women for Osteoporosis by Complex Impedance Measurements of the Dominant Arm
Authors: Fırat Matur, Yekta Ülgen
Abstract:
Cole-Cole parameters of 40 post-menopausal women are compared with their DEXA bone mineral density measurements. The impedance characteristics of the four extremities are compared; the left and right extremities are statistically the same, but the lower extremities are statistically different from the upper ones due to their different fat content. The correlation of the Cole-Cole impedance parameters with bone mineral density (BMD) is observed to be higher for the dominant arm. For the post-menopausal population, ANOVA tests of the dominant-arm characteristic frequency, as a predictor for the DEXA-classified osteopenic and osteoporotic population around the lumbar spine, are statistically very significant. When used for total lumbar spine osteoporosis diagnosis, the area under the Receiver Operating Characteristic (ROC) curve of the characteristic frequency is 0.830, suggesting that the Cole-Cole plot characteristic frequency could be a useful diagnostic parameter when integrated into standard screening methods for osteoporosis. Moreover, the characteristic frequency can be measured directly by monitoring the frequency-driven angular behavior of the dominant arm without performing any complex calculation.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 24422803 Design and Implementation of Medium Access Control Based Routing on Real Wireless Sensor Networks Testbed
Authors: Smriti Agarwal, Ashish Payal, B. V. R. Reddy
Abstract:
IEEE 802.15.4 is a Low Rate Wireless Personal Area Network (LR-WPAN) standard which, combined with ZigBee, is going to enable new applications in the Wireless Sensor Networks (WSNs) and Internet of Things (IoT) domains. In recent years, it has become a popular standard for WSNs. Wireless communication among sensor motes, enabled by the IEEE 802.15.4 standard, is extensively replacing the existing wired technology in a wide range of monitoring and control applications. Researchers have proposed routing frameworks and mechanisms that interact with the IEEE 802.15.4 standard using software platforms. In this paper, we have designed and implemented MAC-based routing (MBR) based on the IEEE 802.15.4 standard using the “SENSEnuts” hardware platform. The experimental results include data from light and temperature sensors obtained from communication between the PAN coordinator and a source node through a coordinator, the MAC addresses of some modules used in the experimental setup, the topology of the network created for simulation, and the remaining battery power of the source node. Our experimental effort on a WSN testbed has helped us in bridging the gap between the theoretical and practical aspects of implementing IEEE 802.15.4 for WSN applications.
Keywords: IEEE 802.15.4, routing, wireless sensor networks, ZigBee.
2802 Effective Scheduling of Semiconductor Manufacturing using Simulation
Authors: Ingy A. El-Khouly, Khaled S. El-Kilany, Aziz E. El-Sayed
Abstract:
The process of wafer fabrication is arguably the most technologically complex and capital-intensive stage in semiconductor manufacturing. This large-scale discrete-event process is highly re-entrant and involves hundreds of machines, restrictions, and processing steps. Therefore, production control of wafer fabrication facilities (fabs), specifically scheduling, is one of the most challenging problems that this industry faces. Dispatching rules have been extensively applied to scheduling problems in semiconductor manufacturing. Moreover, lot release policies are commonly used in this manufacturing setting to further improve the performance of such systems and reduce their inherent variability. In this work, simulation is used in the scheduling of re-entrant flow shop manufacturing systems, with an application in semiconductor wafer fabrication; a simulation model has been developed for the Intel Five-Machine Six-Step Mini-Fab using the Extend™ simulation environment. The Mini-Fab has been selected as it captures the challenges involved in scheduling highly re-entrant semiconductor manufacturing lines. A number of scenarios have been developed and used to evaluate the effect of different dispatching rules and lot release policies on the selected performance measures. The simulation results show that the performance of the Mini-Fab can be drastically improved using a combination of dispatching rules and a lot release policy.
Keywords: Dispatching rules, lot release policy, re-entrant flow shop, semiconductor manufacturing.
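As a minimal illustration of what a dispatching rule does at a workstation queue (not the Extend™ Mini-Fab model itself; the lot attributes are hypothetical), the sketch below selects the next lot under FIFO, shortest processing time, or earliest due date:

from dataclasses import dataclass

@dataclass
class Lot:
    lot_id: str
    arrival_time: float       # time the lot joined this station's queue
    processing_time: float    # time required at this station
    due_date: float

def next_lot(queue, rule="SPT"):
    """Pick the next lot to load on a machine according to a dispatching rule."""
    if rule == "FIFO":
        return min(queue, key=lambda lot: lot.arrival_time)
    if rule == "SPT":                      # shortest processing time first
        return min(queue, key=lambda lot: lot.processing_time)
    if rule == "EDD":                      # earliest due date first
        return min(queue, key=lambda lot: lot.due_date)
    raise ValueError(rule)

queue = [Lot("A", 0.0, 5.0, 40.0), Lot("B", 1.0, 2.0, 30.0), Lot("C", 2.0, 4.0, 25.0)]
print(next_lot(queue, "SPT").lot_id)   # -> "B"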