Search results for: simultaneous measurement
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3322

1972 Direct Approach in Modeling Particle Breakage Using Discrete Element Method

Authors: Ebrahim Ghasemi Ardi, Ai Bing Yu, Run Yu Yang

Abstract:

The current study aims to develop an in-house discrete element method (DEM) code and link it with a direct breakage event, so that particle breakage and the resulting fragment size distribution can be determined simultaneously with the DEM simulation. The method applies particle breakage directly inside the DEM computation algorithm: whenever breakage occurs, the original particle is replaced with its daughters, and the calculation then proceeds with an updated particle list, much as in a real grinding environment. To validate the developed model, a grinding ball impacting an unconfined particle bed was simulated. Since considering an entire ball mill would be too computationally demanding, this configuration provided a simplified environment in which to test the model. Accordingly, a representative volume of the ball mill was simulated inside a box, which could emulate media (ball)–powder bed impacts in a ball mill and during particle bed impact tests. Mono-, binary, and ternary particle beds were simulated to determine the effects of granular composition on breakage kinetics. The DEM simulations showed a reduction in the specific breakage rate for coarse particles in binary mixtures. The origin of this phenomenon, commonly known as cushioning or decelerated breakage in dry milling processes, was explained by the simulations: fine particles in a particle bed increase mechanical energy loss and reduce and distribute interparticle forces, thereby inhibiting the breakage of the coarse component. Conversely, the specific breakage rate of fine particles increased due to contacts with coarse particles. This phenomenon, known as acceleration, was shown to be less significant, but should be considered in future attempts to accurately quantify non-linear breakage kinetics in the modeling of dry milling processes.
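The replacement scheme described above, a parent particle swapped for daughter fragments once a breakage criterion is met, can be sketched as follows. The energy threshold, fragment count, and equal-size daughters are illustrative assumptions, not the authors' calibrated breakage model:

```python
def maybe_break(particles, idx, impact_energy, e_crit=1.0e-3, n_frag=4):
    """Replace particle idx with equal daughter fragments when the impact
    energy exceeds a breakage threshold (illustrative criterion).
    Each particle is a dict with 'radius' in metres; volume ~ radius**3."""
    p = particles[idx]
    if impact_energy < e_crit:
        return particles                      # no breakage: list unchanged
    # Conserve total volume: n fragments of radius r_d with n * r_d**3 == r**3
    r_d = p['radius'] / n_frag ** (1.0 / 3.0)
    daughters = [{'radius': r_d} for _ in range(n_frag)]
    # The DEM time loop then continues with the updated particle list
    return particles[:idx] + daughters + particles[idx + 1:]

bed = [{'radius': 2e-3}, {'radius': 1e-3}]
bed = maybe_break(bed, 0, impact_energy=5e-3)  # first particle breaks into 4
```

Replacing a particle in place, rather than restarting the simulation, is what lets the calculation follow the updated particle list as the abstract describes.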

Keywords: particle bed, breakage models, breakage kinetic, discrete element method

Procedia PDF Downloads 202
1971 Quantifying Stakeholders’ Values of Technical and Vocational Education and Training Provision in Nigeria

Authors: Lidimma Benjamin, Nimmyel Gwakzing, Wuyep Nanyi

Abstract:

Technical and Vocational Education and Training (TVET) has many stakeholders, each with their own values and interests, and the quality of TVET therefore depends on how well it aligns with those values and interests. This study focuses on the diversity of values and interests within and across stakeholder groups by quantifying the value stakeholders attach to several quality attributes of TVET, and by determining to what extent TVET stakeholders differ in their values. The five stakeholder groups are parents, students, teachers, policy makers, and workplace training supervisors. The nine attributes are employer appreciation of students, graduation rate, students' acquired computer skills, mentoring hours in workplace learning/the Students Industrial Work Experience Scheme (SIWES), challenge, structure, students' appreciation of teachers, schooling hours, and attention to civic education. 346 respondents (parents, students, teachers, policy makers, and workplace training supervisors) were repeatedly asked to rank a set of four programs, each with a specific value on the nine quality indicators. Conjoint analysis was used to obtain the values that stakeholders assigned to the nine attributes when evaluating the quality of TVET programs, and rank-ordered logistic regression was the statistical tool used to analyse the respondents' rankings. The similarities and diversity in values and interests of the different stakeholders can be used by both the Nigerian government and TVET colleges to improve the overall quality of education and the match between vocational programs and their stakeholders. Conjoint analysis allows the simultaneous evaluation and combination of information on product attributes; such an approach models the decision environment by confronting a respondent with choices that are close to real-life choices, and is therefore more realistic than traditional survey methods.
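The conjoint ranking task described above can be sketched as follows: each program profile receives a utility that is the sum of the part-worths of its attribute levels, and respondents are assumed to rank profiles by utility. The part-worth values and attribute subset below are invented for illustration; the study estimates such values from real rankings via rank-ordered logistic regression:

```python
# Part-worths (utilities) per attribute level -- illustrative values only.
part_worths = {
    'graduation_rate': {'high': 1.2, 'low': 0.0},
    'computer_skills': {'yes': 0.8, 'no': 0.0},
    'siwes_mentoring': {'many_hours': 0.5, 'few_hours': 0.0},
}

def utility(profile):
    """Total utility of a TVET program profile = sum of its part-worths."""
    return sum(part_worths[attr][level] for attr, level in profile.items())

programs = {
    'A': {'graduation_rate': 'high', 'computer_skills': 'no',  'siwes_mentoring': 'few_hours'},
    'B': {'graduation_rate': 'low',  'computer_skills': 'yes', 'siwes_mentoring': 'many_hours'},
    'C': {'graduation_rate': 'high', 'computer_skills': 'yes', 'siwes_mentoring': 'few_hours'},
}

# Predicted ranking: highest utility first.
ranking = sorted(programs, key=lambda k: utility(programs[k]), reverse=True)
```

Fitting the model runs this logic in reverse: from observed rankings, estimate the part-worths that best explain them.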

Keywords: TVET, vignette study, conjoint analysis, quality perception, educational stakeholders

Procedia PDF Downloads 87
1970 Enzyme Treatment of Sorghum Dough: Modifications of Rheological Properties and Product Characteristics

Authors: G. K. Sruthi, Sila Bhattacharya

Abstract:

Sorghum is an important food crop in the dry tropical areas of the world and possesses significant levels of phytochemicals and dietary fiber that offer health benefits. However, the absence of gluten is a limitation for converting sorghum dough into sheeted/flattened/rolled products. Chapathi/roti (flat unleavened bread conventionally prepared from whole wheat flour dough) was attempted from sorghum, since wheat gluten causes allergic reactions leading to celiac disease. The dynamic oscillatory rheology of sorghum flour dough (control sample) and enzyme-treated sorghum doughs was studied and linked to the attributes of the finished ready-to-eat product. Treatment with amylase, xylanase, and a mix of amylase and xylanase drastically affected the rheological behaviour, lowering the dough consistency. For amylase-treated dough, a marked decrease of the storage modulus (G') from 85513 Pa to 23041 Pa and of the loss modulus (G") from 8304 Pa to 7370 Pa was noticed, while the phase angle (δ) increased from 5.6° to 10.1°. There was a two- and three-fold increase in total sugar content after α-amylase and xylanase treatment, respectively, with simultaneous changes in the structure of the dough and the finished product. Scanning electron microscopy showed an enhanced extent of changes in the starch granules. Amylase and mixed-enzyme treatment produced a sticky dough that was difficult to roll/flatten, whereas xylanase improved the dough handling properties and the quality attributes of the chapathi/roti. It is concluded that enzyme treatment can improve the rheological status of gluten-free doughs and products.
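The three rheological quantities quoted above are linked: at a given oscillation frequency, tan δ = G"/G', so a rising phase angle signals a shift from elastic (solid-like) toward viscous (liquid-like) behaviour. A minimal computation with generic moduli (the reported values come from frequency sweeps, so no single pair need reproduce the quoted angles):

```python
import math

def phase_angle_deg(g_storage, g_loss):
    """Phase angle delta (degrees) from dynamic moduli: tan(delta) = G''/G'."""
    return math.degrees(math.atan2(g_loss, g_storage))

# A predominantly elastic dough: G' >> G'' gives a small phase angle.
delta = phase_angle_deg(10000.0, 1000.0)
```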

Keywords: sorghum dough, amylase, xylanase, dynamic oscillatory rheology, sensory assessment

Procedia PDF Downloads 405
1969 Thickness Measurement and Void Detection in Concrete Elements through Ultrasonic Pulse

Authors: Leonel Lipa Cusi, Enrique Nestor Pasquel Carbajal, Laura Marina Navarro Alvarado, José Del Álamo Carazas

Abstract:

This research analyses the accuracy of the ultrasonic pulse-echo technique for finding voids and measuring the thickness of concrete elements. The air voids are simulated with expanded polystyrene and thin-walled hollow containers of plastic or cardboard of different sizes and shapes, distributed strategically inside the concrete at different depths. For this research, a 50 kHz shear-wave pulse-echo ultrasonic device is used to scan the concrete elements. Because of the small dimensions of the concrete elements, and because the void sizes are close to half the wavelength, pre- and post-processing steps such as voltage and gain adjustment, synthetic aperture focusing (SAFT), envelope detection, and time compensation were applied to improve the imaging results.
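Pulse-echo thickness gauging rests on a simple time-of-flight relation: the pulse crosses the element, reflects off the back wall (or a void), and returns, so thickness d = v·t/2 for wave speed v and round-trip time t. A minimal sketch; the shear-wave speed below is a typical value for concrete, not a measured one:

```python
def thickness_m(wave_speed_m_s, round_trip_time_s):
    """Pulse-echo thickness: the pulse travels the element twice,
    so divide the echo time-of-flight by two."""
    return wave_speed_m_s * round_trip_time_s / 2.0

# e.g. shear-wave speed ~2500 m/s in concrete, back-wall echo after 160 us
d = thickness_m(2500.0, 160e-6)   # a 0.2 m thick element
```

An echo arriving earlier than the back-wall time indicates a reflector (void) at the correspondingly shallower depth.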

Keywords: ultrasonic, concrete, thickness, pulse echo, void

Procedia PDF Downloads 340
1968 Pharmacy Practice Research's Future

Authors: Ragy Raafat Gaber Attaalla

Abstract:

Background: This research begins with a summary of the state of pharmacy practice research, both now and looking ahead. It then covers the concerns relevant to practice research in order to set the stage; these include shifts in population demography, technological advancements, the institutional function of pharmacies, consumer behavior, and the pharmacy profession itself. It also describes significant changes in pharmacy practice research, such as interprofessional collaboration and patient teaming, the description and measurement of intervention results, and the cultural diversity of patients. Methods: The approaches expected to be most frequently employed in future pharmacy practice research are highlighted in the conclusion. They cover the cultural diversity of patients, documenting and assessing the results of interventions, and interdisciplinary communication and partnership with patients. Results: The rise of large and complicated data sets, the handling of electronic health records, and the use of a wide range of mixed techniques by pharmacy practice researchers are a few potential future methodological obstacles.

Keywords: pharmacy, practice, research, significant changes

Procedia PDF Downloads 17
1967 Information Retrieval for Kafficho Language

Authors: Mareye Zeleke Mekonen

Abstract:

The Kafficho language poses distinct challenges for information retrieval because of its restricted resources and dearth of standardized methods. In this work, with the cooperation and support of linguists and native speakers, we investigate the creation of information retrieval systems specifically designed for the Kafficho language, allowing Kafficho speakers to access information in an efficient and effective way. Our objective was to conduct an information retrieval experiment using 220 Kafficho text files and fifteen sample queries. Tokenization, normalization, stop-word removal, stemming, and other data pre-processing tasks, together with term weighting, were prerequisites for the vector space model used to represent each document and query. The three well-known evaluation metrics used in our work were precision, recall, and F-measure, with values of 87%, 28%, and 35%, respectively. This demonstrates how well the Kafficho information retrieval system performed while utilizing the vector space paradigm.
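Precision, recall, and F-measure are the standard IR evaluation metrics; per query, the F-measure is the harmonic mean of precision and recall (the figures above are presumably averaged over the fifteen queries). A minimal sketch with an invented query result:

```python
def precision_recall_f1(retrieved, relevant):
    """Standard IR metrics for one query, given sets of document ids."""
    tp = len(retrieved & relevant)                       # relevant docs retrieved
    p = tp / len(retrieved) if retrieved else 0.0        # precision
    r = tp / len(relevant) if relevant else 0.0          # recall
    f1 = 2 * p * r / (p + r) if (p + r) else 0.0         # harmonic mean
    return p, r, f1

# 4 documents retrieved, 6 actually relevant, 2 in common (invented ids)
p, r, f1 = precision_recall_f1({1, 2, 3, 4}, {2, 3, 5, 6, 7, 8})
```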

Keywords: Kafficho, information retrieval, stemming, vector space

Procedia PDF Downloads 60
1966 1G2A IMU/GPS Integration Algorithm for Land Vehicle Navigation

Authors: O. Maklouf, Ahmed Abdulla

Abstract:

A general decline in the cost, size, and power requirements of electronics is accelerating the adoption of integrated GPS/INS technologies in consumer applications such as land vehicle navigation. Researchers are looking for ways to eliminate additional components from product designs. One possibility is to drop one or more of the relatively expensive gyroscopes from microelectromechanical system (MEMS) versions of inertial measurement units (IMUs). For land vehicle use, the most important sensors are the vertical gyro, which senses the heading of the vehicle, and two horizontal accelerometers, which determine its velocity. This paper presents a simplified integration algorithm for a strapdown partial IMU (ParIMU)/GPS combination, with data post-processing for the determination of the 2-D components of position (trajectory), velocity, and heading. In the present approach, we neglect earth rotation and gravity variations because of the poor gyroscope sensitivities of the low-cost IMU and the relatively small area of the trajectory.
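A loosely coupled IMU/GPS scheme of this kind dead-reckons with the inertial data and corrects the drift with GPS fixes through a Kalman filter. A deliberately simplified one-dimensional predict/update cycle (the noise values are illustrative; the paper's 2-D filter also estimates heading):

```python
def predict_update(pos, vel, var, accel, gps_pos, dt=0.1, q=0.01, r=4.0):
    """One cycle of a toy loosely coupled filter: dead-reckon with the
    IMU acceleration, then correct the position with a GPS fix.
    q: process noise, r: GPS measurement noise (illustrative variances)."""
    # Predict (1-D strapdown mechanisation): integrate acceleration
    vel += accel * dt
    pos += vel * dt
    var += q                      # process noise inflates position uncertainty
    # Update: blend in the GPS position via the Kalman gain
    k = var / (var + r)
    pos += k * (gps_pos - pos)
    var *= (1.0 - k)
    return pos, vel, var

pos, vel, var = predict_update(pos=10.0, vel=1.0, var=1.0, accel=0.0, gps_pos=12.0)
```

Between GPS fixes only the predict half runs, which is why gyro quality (here dropped for the partial IMU) governs how fast the estimate drifts.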

Keywords: GPS, ParIMU, INS, Kalman filter

Procedia PDF Downloads 519
1965 Theoretical Paradigms for Total Quality Environmental Management (TQEM)

Authors: Mohammad Hossein Khasmafkan Nezam, Nader Chavoshi Boroujeni, Mohamad Reza Veshaghi

Abstract:

Quality management is dominated by rational paradigms for the measurement and management of quality, but these paradigms start to 'break down' when faced with the inherent complexity of managing quality in intensely competitive, changing environments. In this article, the various theoretical paradigms employed to manage quality are reviewed, and their advantages and limitations are highlighted. A major implication of this review is that, when faced with complexity, an ideological commitment to any single strategy paradigm for total quality environmental management is ineffective. We suggest that as complexity increases and environments become intensely competitive and changing, there will be a greater need to consider a multi-paradigm, integrationist view of strategy for TQEM.

Keywords: total quality management (TQM), total quality environmental management (TQEM), ideologies (philosophy), theoretical paradigms

Procedia PDF Downloads 324
1964 A Systematic Review of Situational Awareness and Cognitive Load Measurement in Driving

Authors: Aly Elshafei, Daniela Romano

Abstract:

With the development of autonomous vehicles, a human-machine interaction (HMI) system is needed for a safe transition of control when a takeover request (TOR) is required. An important part of the HMI system is the ability to monitor the level of situational awareness (SA) of any driver in real time, in different scenarios, and without any pre-calibration. The purpose of this systematic review is to present state-of-the-art machine learning models used to measure SA, investigating the limitations of each type of sensor, the gaps in the literature, and the sensors and computational models best suited to driving applications. To the authors' best knowledge, this is the first literature review identifying online and offline classification methods used to measure SA, explaining which measurements are subject- or session-specific, and how many classifications can be done with each classification model. This information can be very useful for researchers seeking the model best suited to measuring SA in different applications.

Keywords: situational awareness, autonomous driving, gaze metrics, EEG, ECG

Procedia PDF Downloads 122
1963 Simulation and Analytical Investigation of Different Combination of Single Phase Power Transformers

Authors: M. Salih Taci, N. Tayebi, I. Bozkır

Abstract:

In this paper, the equivalent circuit of the ideal single-phase power transformer is presented together with the appropriate voltage and current measurements. The calculated voltages and currents of different connections of a single-phase transformer are compared with the results of the simulation process and, as can be seen, the calculated results match the simulated ones. The paper covers eight possible transformer connections; depending on the desired voltage level, both step-down and step-up applications are considered. Modelling and analysis of a system consisting of an equivalent source, a transformer (primary and secondary), and loads are performed to investigate the combinations. The obtained values are simulated in the PSpice environment, and the distribution of currents, voltages, and phase angles among them is then explained on the basis of the calculations.
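For the ideal transformer underlying the equivalent circuit, voltages scale with the turns ratio a = N1/N2 and currents scale inversely, so apparent power is conserved. A minimal check with illustrative ratings (not those of any unit in the paper):

```python
def ideal_transformer(v1, i1, n1, n2):
    """Ideal single-phase transformer: a = N1/N2, V2 = V1/a, I2 = a*I1."""
    a = n1 / n2
    v2 = v1 / a          # voltage steps down for a > 1
    i2 = i1 * a          # current steps up by the same ratio
    return v2, i2

v2, i2 = ideal_transformer(v1=240.0, i1=2.0, n1=200, n2=20)   # a = 10
# apparent power is conserved: 240 * 2 VA on the primary = v2 * i2 on the secondary
```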

Keywords: transformer, simulation, equivalent model, parallel series combinations

Procedia PDF Downloads 368
1962 Assessment of Noise Pollution in the City of Biskra, Algeria

Authors: Tallal Abdel Karim Bouzir, Nourdinne Zemmouri, Djihed Berkouk

Abstract:

In this research, a quantitative assessment of the urban sound environment of the city of Biskra, Algeria, was conducted. The quality of the soundscape was determined from in-situ measurements taken with a Landtek SL5868P sound level meter at 47 points identified as representative of the whole city. The results show that the urban noise level varies from 55.3 dB to 75.8 dB during weekdays and from 51.7 dB to 74.3 dB during the weekend. Moreover, 70.20% of the weekday measurements and 55.30% of the weekend measurements have sound intensity levels exceeding those allowed by Algerian law and recommended by the World Health Organization. These very high urban noise levels affect quality of life and acoustic comfort, and may even pose multiple risks to people's health.
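Sound levels in dB cannot be averaged arithmetically; an equivalent level over measurement points is obtained by averaging acoustic energy on a logarithmic scale. A sketch that also counts the share of points exceeding a limit (the sample levels and the 70 dB threshold below are illustrative, not the study's data or the statutory limit):

```python
import math

def energy_average_db(levels_db):
    """Energy (logarithmic) average of sound pressure levels in dB."""
    mean_energy = sum(10 ** (l / 10.0) for l in levels_db) / len(levels_db)
    return 10.0 * math.log10(mean_energy)

def pct_exceeding(levels_db, limit_db):
    """Percentage of measurement points above a given limit."""
    return 100.0 * sum(l > limit_db for l in levels_db) / len(levels_db)

levels = [55.3, 62.0, 68.4, 71.2, 75.8]
l_eq = energy_average_db(levels)       # dominated by the loudest points
share = pct_exceeding(levels, 70.0)
```

Because the energy average is dominated by the loudest points, l_eq here comes out several dB above the arithmetic mean of the same list.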

Keywords: road traffic, noise pollution, sound intensity, public health

Procedia PDF Downloads 272
1961 Experimental Investigation of On-Body Channel Modelling at 2.45 GHz

Authors: Hasliza A. Rahim, Fareq Malek, Nur A. M. Affendi, Azuwa Ali, Norshafinash Saudin, Latifah Mohamed

Abstract:

This paper presents an experimental investigation of on-body channel fading at 2.45 GHz considering two states of user body movement: stationary and mobile. A pair of body-worn antennas was utilized in the measurement campaign. A statistical analysis was performed by comparing the measured on-body path loss to five well-known distributions: log-normal, normal, Nakagami, Weibull, and Rayleigh. The results showed that the average path loss with a moving arm was up to 3.5 dB higher than the path loss in the sitting position for the upper-arm-to-left-chest link. The analysis also concluded that the Nakagami distribution provided the best fit for most static on-body link path loss in the standing-still and sitting positions, while arm movement was best described by the log-normal distribution.
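Fitting a log-normal distribution to measured channel samples reduces to estimating the mean and standard deviation of the log-data; the maximum-likelihood sketch below uses invented linear-scale samples, not the paper's measurements:

```python
import math

def lognormal_fit(samples):
    """ML estimates (mu, sigma) of a log-normal from positive samples:
    fit a normal distribution to the natural log of the data."""
    logs = [math.log(x) for x in samples]
    mu = sum(logs) / len(logs)
    var = sum((l - mu) ** 2 for l in logs) / len(logs)
    return mu, math.sqrt(var)

# Invented linear-scale channel gain samples
mu, sigma = lognormal_fit([1.0, 2.0, 4.0, 8.0])
```

Goodness-of-fit across the five candidate distributions is then compared with a criterion such as Kolmogorov-Smirnov distance or log-likelihood on the same data.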

Keywords: on-body channel communications, fading characteristics, statistical model, body movement

Procedia PDF Downloads 359
1960 Relationship of Workplace Stress and Mental Wellbeing among Health Professionals

Authors: Rabia Mushtaq, Uroosa Javaid

Abstract:

It has been observed that health professionals are at higher risk of stress because their work is physically and emotionally demanding. This study aimed to investigate the relationship between workplace stress and mental wellbeing among health professionals. A sample of 120 male and female health professionals belonging to two age groups, i.e., early adulthood and middle adulthood, was recruited through a purposive sampling technique. The Job Stress Scale, the Mindful Attention Awareness Scale, and the Warwick-Edinburgh Mental Wellbeing Scale were used to measure the study variables. The results indicated that job stress has a significant negative relationship with mental wellbeing among health professionals. The current study opens the door for more exploratory work on mindfulness among health professionals, and its findings can help in developing coping strategies among workers to improve their mental wellbeing and lessen job stress.

Keywords: health professionals, job stress, mental wellbeing, mindfulness

Procedia PDF Downloads 178
1959 Possible Exposure of Persons with Cardiac Pacemakers to Extremely Low Frequency (ELF) Electric and Magnetic Fields

Authors: Leena Korpinen, Rauno Pääkkönen, Fabriziomaria Gobba, Vesa Virtanen

Abstract:

The number of persons with implanted cardiac pacemakers (PMs) has increased in Western countries. The aim of this paper is to investigate the possible situations in which persons with a PM may be exposed to extremely low frequency (ELF) electric fields (EFs) and magnetic fields (MFs) that could disturb their PM. Based on our earlier studies, public exposure to EFs high enough to disturb a PM in unipolar mode can be found only in some places near 400 kV power lines; such EFs cannot be found near 110 kV power lines. Disturbing MFs can occur near welding machines, although we do not have measurement data from welding. Based on the literature and earlier studies at Tampere University of Technology, it is difficult to find public EF or MF exposure high enough to interfere with PMs.

Keywords: cardiac pacemaker, electric field, magnetic field, electrical engineering

Procedia PDF Downloads 436
1958 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate the inaccuracies, weaknesses, and biases of any one individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall model accuracy of 92%. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy.
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
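The model-averaging step described above, combining the three top-performing classifiers, amounts to soft voting over predicted probabilities. In this sketch the three "models" are stand-in probability functions, since the actual logistic regression, random forest, and neural network are trained on the weather and EPA data:

```python
def ensemble_predict(models, features, threshold=0.5):
    """Average the predicted probabilities of several models and
    threshold the mean -- a simple soft-voting ensemble."""
    probs = [m(features) for m in models]
    mean_p = sum(probs) / len(probs)
    return ('unhealthy' if mean_p >= threshold else 'good'), mean_p

# Stand-ins for the three trained models' probability outputs.
logreg  = lambda x: 0.70
forest  = lambda x: 0.55
network = lambda x: 0.40

label, p = ensemble_predict([logreg, forest, network], features={})
```

Averaging dampens any one model's bias: a single over-confident model cannot flip the prediction on its own.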

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 135
1957 Visual Odometry and Trajectory Reconstruction for UAVs

Authors: Sandro Bartolini, Alessandro Mecocci, Alessio Medaglini

Abstract:

The growing popularity of systems based on unmanned aerial vehicles (UAVs) is highlighting their vulnerability, particularly in relation to the positioning system used. Typically, UAV architectures use civilian GPS, which is exposed to a number of attacks, such as jamming or spoofing. It is therefore important to develop alternative methodologies that accurately estimate the actual UAV position without relying solely on GPS measurements. In this paper, we propose a position estimation method for UAVs based on monocular visual odometry. We have developed a flight control system capable of keeping track of the entire trajectory travelled, with a reduced dependency on the availability of GPS signals. Moreover, the simplicity of the developed solution makes it applicable to a wide range of commercial drones. The final goal is to allow for safer flights in all conditions, even under cyber-attacks trying to deceive the drone.
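Once visual odometry yields a relative motion estimate per frame (a heading change and a travelled distance), the trajectory is reconstructed by dead-reckoning integration. A 2-D sketch of that accumulation step, with invented per-frame increments; a real monocular pipeline also needs an external scale reference:

```python
import math

def integrate_trajectory(increments, x=0.0, y=0.0, heading=0.0):
    """Accumulate (d_heading, distance) visual-odometry increments
    into a 2-D trajectory of (x, y) positions."""
    path = [(x, y)]
    for d_heading, dist in increments:
        heading += d_heading                 # rotate by the estimated turn
        x += dist * math.cos(heading)        # then step along the new heading
        y += dist * math.sin(heading)
        path.append((x, y))
    return path

# Two frames straight ahead, then a 90-degree left turn (invented increments)
path = integrate_trajectory([(0.0, 1.0), (0.0, 1.0), (math.pi / 2, 1.0)])
```

Because each increment's error is accumulated, the drift of such a pipeline grows with distance, which is why occasional GPS fixes (when trustworthy) remain useful.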

Keywords: visual odometry, autonomous UAV, position measurement, autonomous outdoor flight

Procedia PDF Downloads 224
1956 The Impact of Online Advertising on Consumer Purchase Behaviour Based on Malaysian Organizations

Authors: Naser Zourikalatehsamad, Seyed Abdorreza Payambarpour, Ibrahim Alwashali, Zahra Abdolkarimi

Abstract:

This paper aims to evaluate the effect of online advertising on consumer purchase behavior in Malaysian organizations, and has the potential to extend and refine theory. A survey was distributed among students of UTM during winter 2014, and 160 responses were collected. Regression analysis was used to test the hypothesized relationships of the model. The results show that the predictors (cost-saving factor, convenience factor, and customized products or services) have a positive impact on the intention to continue seeking online advertising.

Keywords: consumer purchase, convenience, customized product, cost saving, customization, flow theory, mass communication, online advertising ads, online advertising measurement, online advertising mechanism, online intelligence system, self-confidence, willingness to purchase

Procedia PDF Downloads 485
1955 A Comparison Study: Infant and Children’s Clothing Size Charts in South Korea and UK

Authors: Hye-Won Lim, Tom Cassidy, Tracy Cassidy

Abstract:

Infant and children's body shapes change constantly as they grow into adults and also differ physically between countries. For this reason, optimum size charts that can represent the body sizes and shapes of infants and children are required. In this study, investigations of current size charts in South Korea and the UK (n=50 each) were conducted to understand the sizing perspectives of clothing manufacturers. The size charts of the two countries were collected randomly from online shopping websites, and their average measurements were compared with the respective national sizing surveys (SizeKorea and Shape GB). The size charts were also classified by age, gender, clothing type, fit, and other factors. In addition, the key body measurements of the size charts of each country were determined; these will be suggested for the development of new size charts and sizing systems.

Keywords: infant clothing, children’s clothing, body shapes, size charts

Procedia PDF Downloads 320
1954 Simultaneous Adsorption and Characterization of NOx and SOx Emissions from Power Generation Plant on Sliced Porous Activated Carbon Prepared by Physical Activation

Authors: Muhammad Shoaib, Hassan M. Al-Swaidan

Abstract:

Air pollution is a major challenge for scientists today due to the release of toxic emissions from sources such as power plants, desalination plants, industrial processes, and transportation vehicles. Harmful emissions into the air represent an environmental pressure that reflects negatively on human health and productivity, leading to real losses in the national economy. A variety of air pollutants, in the form of carbon oxides, hydrocarbons, nitrogen oxides, sulfur oxides, suspended particulate matter, etc., are present in air due to the combustion of different fuels such as crude oil, diesel oil, and natural gas. Among these pollutants, NOx and SOx emissions are considered highly toxic due to their carcinogenicity and their relation to various health disorders. In the Kingdom of Saudi Arabia, electricity is generated by burning crude oil, diesel, or natural gas in the turbines of power stations; of these three, crude oil is used most extensively. Burning crude oil discharges heavy contents of gaseous pollutants such as sulfur oxides (SOx) and nitrogen oxides (NOx) into the environment, which is a serious environmental threat. In laboratory studies using 1 g of sliced activated carbon (SAC) adsorbent, the breakthrough point comes after 20 and 30 minutes for NOx and SOx, respectively, whereas in the PP8 plant the breakthrough point comes within seconds. The saturation point comes after 100 and 120 minutes in the laboratory studies, and after 60 and 90 minutes in the actual PP8 plant, for NOx and SOx adsorption, respectively. Surface characterization of NOx and SOx adsorption on the SAC confirms the presence of the corresponding peaks in the FT-IR spectrum. A CHNS study verifies that the SAC is suitable for adsorbing NOx and SOx, along with some other C- and H-containing compounds, from the stack emission stream of the power plant turbines.

Keywords: activated carbon, flue gases, NOx and SOx adsorption, physical activation, power plants

Procedia PDF Downloads 348
1953 Measurement of VIP Edge Conduction Using Vacuum Guarded Hot Plate

Authors: Bongsu Choi, Tae-Ho Song

Abstract:

Vacuum insulation panel (VIP) is a promising thermal insulator for buildings, refrigerators, LNG carriers and so on. In general, it has a thermal conductivity of 2~4 mW/m•K. However, this is the conductivity measured at the center of the VIP; the total effective thermal conductivity of a VIP is larger than this value due to edge conduction through the envelope. In this paper, the edge conduction of VIPs is examined theoretically, numerically and experimentally. To confirm the existence of edge conduction, numerical analysis is performed on a simple two-dimensional VIP model, and a theoretical model is proposed to calculate the edge conductivity. The edge conductivity is then measured using a vacuum guarded hot plate, and the experiment is validated against the numerical analysis. The results show that the edge conductivity depends on the width of the panel and the thickness of the Al-foil. To reduce edge conduction, it is recommended that VIPs be made as large as possible or with a thin Al-film envelope.
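One common way to express the effect described above is to add an edge term, scaled by the panel's perimeter-to-area ratio, to the centre-of-panel conductivity. This is a generic sketch of that formulation, not the paper's own model, and the linear edge transmittance ψ below is an illustrative value:

```python
def effective_conductivity(k_cop, psi, width, length, thickness):
    """Effective VIP conductivity including an edge-conduction term.

    k_cop     : centre-of-panel conductivity (W/m.K)
    psi       : linear edge transmittance (W/m.K), illustrative
    width, length, thickness : panel dimensions (m)
    """
    perimeter = 2.0 * (width + length)
    area = width * length
    return k_cop + psi * perimeter * thickness / area

# Illustrative values: 4 mW/m.K centre-of-panel, psi = 0.01 W/m.K,
# a 0.5 m x 0.5 m x 0.02 m panel -> 5.6 mW/m.K effective
k_eff = effective_conductivity(0.004, 0.01, 0.5, 0.5, 0.02)
```

For a fixed ψ, doubling both panel dimensions doubles the perimeter but quadruples the area, halving the edge contribution, which is the quantitative content of the recommendation to make panels as large as possible.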

Keywords: envelope, edge conduction, thermal conductivity, vacuum insulation panel

Procedia PDF Downloads 408
1952 Analysis of Transformer by Gas and Moisture Sensor during Laboratory Time Monitoring

Authors: Miroslav Gutten, Daniel Korenciak, Milan Simko, Milan Chupac

Abstract:

Ensuring the reliable and correct function of transformers is the main purpose of on-line non-destructive diagnostic tools, which allow status parameters to be tracked accurately. Devices for on-line diagnostics are generally very costly. However, there are devices whose price is relatively low and which, when used correctly, can perform complex diagnostics. One of these is the HYDRAN M2 sensor, which is used to detect the moisture and gas content of the insulating oil. Using the HYDRAN M2 sensor in combination with temperature and load measurements and physicochemical analysis, an economically inexpensive diagnostic system can be built whose use is not restricted to distribution transformers. This system was tested in an educational laboratory environment on a 22/0.4 kV oil transformer. From the conclusions presented in the article, it is possible to determine which kind of fault occurred in the transformer and what its impact was on the temperature, the evolution of gases and the water content.
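As a sketch of how readings from such a low-cost gas-and-moisture sensor can be screened automatically: flag samples where the reading exceeds an absolute limit or rises too quickly between consecutive samples. The thresholds below are hypothetical; real alarm limits come from guides such as IEEE C57.104:

```python
def flag_anomalies(readings_ppm, limit_ppm=100.0, max_rise_ppm=25.0):
    """Return indices of samples whose gas-in-oil reading exceeds an
    absolute limit or whose rise since the previous sample is too fast.
    Both thresholds are hypothetical placeholders."""
    flags = []
    for i, r in enumerate(readings_ppm):
        if r > limit_ppm:
            flags.append(i)                       # absolute limit exceeded
        elif i > 0 and r - readings_ppm[i - 1] > max_rise_ppm:
            flags.append(i)                       # gassing rate too high
    return flags

# Hypothetical hourly readings: gradual rise, then a gassing event
flags = flag_anomalies([20, 22, 25, 60, 130])     # -> [3, 4]
```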

Keywords: transformer, diagnostics, gas and moisture sensor, monitoring

Procedia PDF Downloads 391
1951 Guests’ Satisfaction and Intention to Revisit Smart Hotels: Qualitative Interviews Approach

Authors: Raymond Chi Fai Si Tou, Jacey Ja Young Choe, Amy Siu Ian So

Abstract:

Smart hotels can be defined as hotels with an intelligent system that, through digitalization and networking, achieves hotel management and service information. In addition, smart hotels feature high-end designs that integrate information and communication technology with hotel management, fulfilling guests’ needs and improving the quality, efficiency and satisfaction of hotel management. The purpose of this study is to identify factors that may influence guests’ satisfaction and intention to revisit smart hotels, based on the Lodging Quality Index measurement of service quality and the extended UTAUT theory. The Unified Theory of Acceptance and Use of Technology (UTAUT) is adopted as a framework to explain technology acceptance and use. Since smart hotels are technology-based infrastructure hotels, UTAUT theory provides a theoretical background for examining guests’ acceptance and use after staying in smart hotels. The UTAUT identifies four key drivers of the adoption of information systems: performance expectancy, effort expectancy, social influence, and facilitating conditions. The extended UTAUT expands these into seven constructs: the four constructs of the original UTAUT model together with three additional constructs, namely hedonic motivation, price value and habit. Thus, the seven constructs from the extended UTAUT theory are adopted to understand guests’ intention to revisit smart hotels. A service quality model is also adopted and integrated into the framework to understand guests’ intention to revisit. Few studies have examined the effect of service quality on guests’ satisfaction and intention to revisit smart hotels. In this study, the Lodging Quality Index (LQI) is adopted to measure service quality in smart hotels.
The UTAUT theory and the service quality model are integrated because technological applications and services require more than one model to understand the complicated situation of customers’ acceptance of new technology. Moreover, an integrated model can provide more insightful perspectives on the relationships among the constructs than any single model could. For this research, ten in-depth interviews are planned. To confirm the applicability of the proposed framework and to gain an overview of the guest experience of smart hotels in the hospitality industry, in-depth interviews with hotel guests and industry practitioners will be conducted. In terms of theoretical contribution, it is expected that integrating the UTAUT theory and the service quality model will provide new insights into the factors that influence guests’ satisfaction and intention to revisit smart hotels. Once influential factors are identified, smart hotel practitioners can understand which factors significantly influence their guests’ satisfaction and intention to revisit. In addition, practitioners can deliver an outstanding guest experience by improving service quality along the dimensions identified by the service quality measurement. Thus, the study will benefit the sustainability of the smart hotel business.

Keywords: intention to revisit, guest satisfaction, qualitative interviews, smart hotels

Procedia PDF Downloads 213
1950 Identifying Unknown Dynamic Forces Applied on Two Dimensional Frames

Authors: H. Katkhuda

Abstract:

A time-domain approach is used in this paper to identify unknown dynamic forces applied to two-dimensional frames, using the measured dynamic structural responses of a sub-structure within the frame. A sub-structure finite element model with a short measurement record from only three or four accelerometers is required, and an iterative least-squares algorithm is used to identify the unknown dynamic force applied to the structure. The validity of the method is demonstrated with numerical examples using noise-free and noise-contaminated structural responses. Both harmonic and impulsive forces are studied. The results show that the proposed approach can identify unknown dynamic forces within very few iterations with high accuracy, and it remains robust even when noise-polluted dynamic response measurements are used.
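The core of such an identification can be viewed as an over-determined least-squares problem: a transfer matrix maps the unknown force history to the stacked sensor samples, and the force follows from the measurements. This is only a minimal single-shot sketch with a randomly generated matrix standing in for the sub-structure model, not the paper's iterative algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical transfer matrix mapping the force history to stacked
# accelerometer samples (in practice it would be assembled from the
# sub-structure finite element model, not drawn at random).
n_steps, n_meas = 50, 120
H = rng.standard_normal((n_meas, n_steps))

f_true = np.sin(2 * np.pi * np.arange(n_steps) / 25)  # harmonic force
y = H @ f_true + 0.01 * rng.standard_normal(n_meas)   # noisy responses

# Least-squares estimate of the force history from the measurements
f_est, *_ = np.linalg.lstsq(H, y, rcond=None)
```

Because there are more measurement samples than force unknowns, the system is over-determined, which is why a short record from a few accelerometers can suffice.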

Keywords: dynamic force identification, dynamic responses, sub-structure, time domain

Procedia PDF Downloads 364
1949 Performance Prediction Methodology of Slow Aging Assets

Authors: M. Ben Slimene, M.-S. Ouali

Abstract:

Asset management of urban infrastructures faces a multitude of challenges that must be overcome to obtain reliable measurements of performance. Predicting the performance of slowly aging systems is one of those challenges; addressing it helps the asset manager investigate specific failure modes and undertake the appropriate maintenance and rehabilitation interventions to avoid catastrophic failures, as well as to optimize maintenance costs. This article presents a methodology for modeling the deterioration of slowly degrading assets based on their operating history. It consists of extracting degradation profiles by grouping together assets that exhibit similar degradation sequences, using an unsupervised classification technique derived from artificial intelligence. The obtained clusters are used to build the performance prediction models. The methodology is applied to a sample stormwater drainage culvert dataset.
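The grouping step can be illustrated with a minimal k-means clustering of condition-score sequences. This is a generic sketch (the abstract does not specify which unsupervised technique is used), with invented inspection scores:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Minimal k-means: cluster assets by their condition-score sequences."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each sequence to its nearest centroid
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        # move each centroid to the mean of its members
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels

# Invented condition scores (higher = better) over four inspections
X = np.array([[5.0, 4.8, 4.7, 4.6],   # slow degraders
              [5.0, 4.9, 4.7, 4.5],
              [5.0, 4.0, 3.1, 2.2],   # fast degraders
              [5.0, 3.9, 3.0, 2.0]])
labels = kmeans(X, k=2)                # separates slow from fast profiles
```

A regression model fitted per cluster then serves as the performance predictor for new assets assigned to that profile.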

Keywords: artificial intelligence, clustering, culvert, regression model, slow degradation

Procedia PDF Downloads 115
1948 Internet, Fake News, and Democracy: The Case of Kosovo

Authors: Agrinë Baraku

Abstract:

This paper focuses on the convergence of the internet, fake news, and democracy. It examines this convergence, the tenets of democracy affected by ever-increasing exposure to fake news, and whether the impact strengthens or further weakens countries with fragile democracies. To demonstrate the convergence and its impact, and to further the discussion of this topic, the case of Kosovo is explored. Kosovo's position in the Western Balkans makes it especially susceptible to pressure stemming from geopolitical interests, which intersect with the generation of fake news by different international actors. Domestically, data generated by the Kantar (Index) Kosova longitudinal Media Measurement Survey (MMS), which focuses on media viewership, are used to trace the trend among Kosovar citizens and to place it in a bigger landscape compounded by the tenuous circumstances and challenges that Kosovo faces. Attention is paid to what this reveals about where Kosovo currently stands and what can be done about the phenomenon taking place.

Keywords: democracy, disinformation, internet, social media, fake news

Procedia PDF Downloads 94
1947 Low-Density Lipoproteins Mediated Delivery of Paclitaxel and MRI Imaging Probes for Personalized Medicine Applications

Authors: Sahar Rakhshan, Simonetta Geninatti Crich, Diego Alberti, Rachele Stefania

Abstract:

The combination of imaging and therapeutic agents in the same smart nanoparticle is a promising option for minimally invasive, imaging-guided therapy. In this study, low-density lipoproteins (LDL), among the most attractive biodegradable and biocompatible nanoparticles, were used for the simultaneous delivery of Paclitaxel (PTX), a hydrophobic antitumour drug, and an amphiphilic contrast agent, Gd-AAZTA-C17, to the B16-F10 melanoma cell line. These cells overexpress LDL receptors, as assessed by flow cytometry analysis. PTX- and Gd-AAZTA-C17-loaded LDLs (LDL-PTX-Gd) were prepared and characterized, and their stability was assessed over 72 h of incubation at 37 °C and compared to LDLs loaded with Gd-AAZTA-C17 alone (LDL-Gd) and with PTX alone (LDL-PTX). The cytotoxic effect of LDL-PTX-Gd was evaluated by MTT assay. The antitumour drug loaded into LDLs showed significantly higher toxicity against B16-F10 cells than the commercially available formulation Paclitaxel Kabi (PTX Kabi) used in clinical applications. A high uptake of LDL-Gd by B16-F10 cells was demonstrated. As a consequence of this high cell uptake, melanoma cells showed a significantly higher cytotoxic effect when incubated in the presence of PTX (LDL-PTX-Gd). Furthermore, B16-F10 cells were used to perform magnetic resonance imaging. From the analysis of the image signal intensity, the amount of internalized PTX could be estimated indirectly from the decrease of the relaxation times caused by Gd, which is proportional to its concentration. Finally, treatment of B16-F10 tumour-bearing mice with PTX-loaded LDLs resulted in a marked reduction of tumour growth compared to the administration of PTX Kabi alone. In conclusion, LDLs are selectively taken up by tumour cells and can be successfully exploited for the selective delivery of paclitaxel and imaging agents.
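The indirect quantification step rests on the linear relationship between the longitudinal relaxation rate and the Gd concentration, R1,obs = R1,0 + r1·[Gd]. A minimal sketch with an assumed relaxivity (r1 ≈ 5 s⁻¹ mM⁻¹ is typical of small Gd complexes; the actual value for Gd-AAZTA-C17 loaded in LDLs would have to be measured):

```python
def gd_concentration(T1_obs_ms, T1_0_ms, r1_per_s_mM=5.0):
    """Estimate the Gd concentration (mM) from the observed and
    diamagnetic T1 times via R1_obs = R1_0 + r1 * [Gd].
    The default relaxivity r1 is an assumed, illustrative value."""
    R1_obs = 1000.0 / T1_obs_ms   # ms -> s^-1
    R1_0 = 1000.0 / T1_0_ms
    return (R1_obs - R1_0) / r1_per_s_mM

# Invented example: T1 drops from 2000 ms to 500 ms -> 0.3 mM Gd
conc_mM = gd_concentration(500.0, 2000.0)
```

With the Gd concentration known and the PTX/Gd loading ratio of the particles characterized beforehand, the internalized PTX follows proportionally.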

Keywords: low density lipoprotein, melanoma cell lines, MRI, paclitaxel, personalized medicine application, theragnostic system

Procedia PDF Downloads 131
1946 Alternating Expectation-Maximization Algorithm for a Bilinear Model in Isoform Quantification from RNA-Seq Data

Authors: Wenjiang Deng, Tian Mou, Yudi Pawitan, Trung Nghia Vu

Abstract:

Estimation of isoform-level gene expression from RNA-seq data depends on simplifying assumptions, such as a uniform read distribution, that are easily violated in real data. Such violations typically lead to biased estimates. Most existing methods provide bias correction steps based on biological considerations, such as GC content, applied to single samples separately. The main problem is that not all biases are known. For example, new technologies such as single-cell RNA-seq (scRNA-seq) may introduce new sources of bias not seen in bulk-cell data. This study introduces a method called XAEM based on a more flexible and robust statistical model. Existing methods are essentially based on a linear model Xβ, where the design matrix X is known and derived from the simplifying assumptions. In contrast, XAEM considers Xβ as a bilinear model with both X and β unknown. Joint estimation of X and β is made possible by simultaneous analysis of multi-sample RNA-seq data. Compared to existing methods, XAEM automatically performs empirical correction of potentially unknown biases. XAEM implements an alternating expectation-maximization (AEM) algorithm, alternating between estimation of X and β. For speed, XAEM utilizes quasi-mapping for read alignment, leading to a fast algorithm. Overall, XAEM performs favorably compared to other recent advanced methods. For simulated datasets, XAEM obtains higher accuracy for multiple-isoform genes, particularly for paralogs. In a differential-expression analysis of a real scRNA-seq dataset, XAEM achieves substantially greater rediscovery rates in an independent validation set.
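The bilinear idea can be sketched generically with alternating least squares: hold X fixed to solve for β, then hold β fixed to solve for X, and repeat. This is only an illustration of the alternating principle on simulated noiseless data, not the XAEM algorithm itself (which embeds the updates in a probabilistic EM framework):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated multi-sample data: Y (bins x samples) = X (bins x isoforms)
# @ B (isoforms x samples), with BOTH factors treated as unknown.
n_bins, n_iso, n_samp = 30, 2, 10
X_true = np.abs(rng.standard_normal((n_bins, n_iso)))
B_true = np.abs(rng.standard_normal((n_iso, n_samp)))
Y = X_true @ B_true

X = np.abs(rng.standard_normal((n_bins, n_iso)))     # random start
for _ in range(20):                                   # alternating LS updates
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)         # fix X, solve for B
    Xt, *_ = np.linalg.lstsq(B.T, Y.T, rcond=None)    # fix B, solve for X
    X = Xt.T

rel_err = np.linalg.norm(Y - X @ B) / np.linalg.norm(Y)
```

Note that X and B are only identified up to an invertible mixing matrix; what the alternation drives to zero is the reconstruction error of Y, which is why multi-sample data (many columns of Y) are essential.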

Keywords: alternating EM algorithm, bias correction, bilinear model, gene expression, RNA-seq

Procedia PDF Downloads 146
1945 Unconventional Dating of Old Peepal Tree of Chandigarh (India) Using Optically Stimulated Luminescence

Authors: Rita Rani, Ramesh Kumar

Abstract:

The intent of the current study is to date an old grand Peepal tree that is still alive. The tree is situated in Kalibard village, Sector 9, Chandigarh (India). Due to its huge structure, it has been given the status of ‘Heritage tree.’ Optically stimulated luminescence (OSL) of the sediments beneath the roots is used to determine the age of the tree. Optical dating is preferred over conventional dating methods due to its greater precision. The methodology comprises OSL of quartz grains using the SAR protocol for accumulated dose measurement. The age determined for the living tree using sedimentary quartz is in close agreement with the approximate age provided by the related agency. This is the first attempt at using optically stimulated luminescence for the age determination of living trees in this region. The study concludes that luminescence dating of living trees is a non-destructive and more precise method.
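The age equation behind optical dating is simple: the burial age equals the accumulated (equivalent) dose measured by the SAR protocol divided by the environmental dose rate. A minimal sketch with invented values (the abstract does not report its equivalent dose or dose rate):

```python
def osl_age(equivalent_dose_gy, dose_rate_gy_per_ka):
    """OSL age in thousands of years (ka):
    age = equivalent dose (Gy) / environmental dose rate (Gy/ka)."""
    return equivalent_dose_gy / dose_rate_gy_per_ka

# Invented values: De = 1.2 Gy, dose rate = 3.0 Gy/ka -> 0.4 ka (~400 years)
age_ka = osl_age(1.2, 3.0)
```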

Keywords: luminescence, dose rate, optical dating, sediments

Procedia PDF Downloads 180
1944 Features for Measuring Credibility on Facebook Information

Authors: Kanda Runapongsa Saikaew, Chaluemwut Noyunsan

Abstract:

Nowadays, social media information such as news, links, images, or videos is shared extensively. However, information disseminated through social media often lacks quality: there is less fact checking, more bias, and many rumors. Many researchers have investigated credibility on Twitter, but there has been no research report on the credibility of information on Facebook. This paper proposes features for measuring the credibility of Facebook information, and we developed a system based on them. First, we developed an FB credibility evaluator for measuring the credibility of each post through manual human labelling, and we collected this training data to create a model using a Support Vector Machine (SVM). Second, we developed a Chrome extension of FB credibility that allows Facebook users to evaluate the credibility of each post. Based on the usage analysis of our FB credibility Chrome extension, about 81% of users’ responses agree with the credibility suggested automatically by the proposed system.
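To illustrate the classification step: the paper trains an SVM on manually labelled posts. The sketch below implements a bare-bones linear SVM by sub-gradient descent on the hinge loss, using hypothetical, normalized post features; a real system would use a library implementation and far richer features:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Bare-bones linear SVM: sub-gradient descent on the hinge loss
    with L2 regularization; labels y must be in {-1, +1}."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1:          # margin violated -> push
                w += lr * (yi * xi - lam * w)
                b += lr * yi
            else:                              # margin ok -> only shrink w
                w -= lr * lam * w
    return w, b

# Hypothetical, normalized post features: [likes, shares, has_link]
X = np.array([[0.90, 0.80, 1.0],
              [0.80, 0.70, 1.0],
              [0.10, 0.00, 0.0],
              [0.05, 0.10, 0.0]])
y = np.array([1, 1, -1, -1])                   # 1 = credible (manual label)
w, b = train_linear_svm(X, y)
pred = np.sign(X @ w + b)                      # recovers the labels
```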

Keywords: Facebook, social media, credibility measurement, internet

Procedia PDF Downloads 357
1943 Numerical Study of Fiber Bragg Grating Sensor: Longitudinal and Transverse Detection of Temperature and Strain

Authors: K. Khelil, H. Ammar, K. Saouchi

Abstract:

A Fiber Bragg Grating (FBG) is an optical fiber whose refractive index is periodically modulated. It acts as a wavelength-selective filter whose reflected peak, called the Bragg wavelength, depends on the grating period and the effective refractive index. The simulation of the FBG is based on solving the coupled-mode theory equations with the transfer matrix method, carried out in MATLAB. It is found that the spectral reflectivity is shifted when the change of temperature or strain is uniform. Under non-uniform temperature or strain perturbation, the spectrum is both shifted and distorted. In the case of transverse loading, the reflectivity spectrum splits into two peaks, the first specific to the X axis and the second belonging to the Y axis. FBGs are used in civil engineering to detect perturbations applied to buildings.
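For orientation, the uniform-perturbation case reduces to two closed-form relations: the Bragg condition and its first-order shift under strain and temperature. A small sketch with typical silica-fibre coefficients (illustrative values, not the paper's parameters):

```python
def bragg_wavelength(n_eff, period_nm):
    """Bragg condition: lambda_B = 2 * n_eff * grating period."""
    return 2.0 * n_eff * period_nm

def bragg_shift(lam_nm, strain=0.0, dT=0.0,
                p_e=0.22, alpha=0.55e-6, xi=6.7e-6):
    """First-order Bragg shift under uniform strain and temperature:
    d(lambda)/lambda = (1 - p_e)*strain + (alpha + xi)*dT.
    p_e (photo-elastic), alpha (thermal expansion) and xi
    (thermo-optic) are typical silica-fibre values, for illustration."""
    return lam_nm * ((1.0 - p_e) * strain + (alpha + xi) * dT)

lam = bragg_wavelength(1.447, 535.0)   # ~1548.3 nm
shift = bragg_shift(lam, dT=10.0)      # ~0.11 nm for a +10 K change
```

A transverse load breaks the fibre's symmetry, giving different effective indices along the two principal axes and hence two Bragg conditions, which is the origin of the split peak described above.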

Keywords: Bragg wavelength, coupled mode theory, optical fiber, temperature measurement

Procedia PDF Downloads 498