Search results for: time to surgery
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18572

10772 The Data Quality Model for the IoT based Real-time Water Quality Monitoring Sensors

Authors: Rabbia Idrees, Ananda Maiti, Saurabh Garg, Muhammad Bilal Amin

Abstract:

IoT devices are the basic building blocks of an IoT network and generate enormous volumes of real-time, high-speed data that help organizations and companies take intelligent decisions. Integrating this enormous data from multiple sources and transferring it to the appropriate client is fundamental to IoT development. Handling this huge number of devices along with the huge volume of data is very challenging. IoT devices are battery-powered and resource-constrained; to provide energy-efficient communication, they go to sleep and wake up periodically or aperiodically depending on traffic loads, in order to reduce energy consumption. Sometimes these devices get disconnected due to battery depletion. If a node is not available in the network, the IoT network provides incomplete, missing, and inaccurate data. Moreover, many IoT applications, like vehicle tracking and patient tracking, require the IoT devices to be mobile. Due to this mobility, if the distance of the device from the sink node becomes greater than required, the connection is lost. After such a disconnection, other devices join the network to replace the broken-down and departed devices. This makes IoT devices dynamic in nature, which brings uncertainty and unreliability into the IoT network and hence produces poor-quality data. Due to this dynamic nature of IoT devices, we do not know the actual reason for abnormal data. If data are of poor quality, decisions are likely to be unsound. It is therefore highly important to process data and estimate data quality before putting it to use in IoT applications. In the past, many researchers tried to estimate data quality and provided several machine learning (ML), stochastic, and statistical methods to analyse stored data in the data processing layer, without focusing on the challenges and issues arising from the dynamic nature of IoT devices and how they impact data quality.
This research presents a comprehensive review of the impact of the dynamic nature of IoT devices on data quality and proposes a data quality model that can deal with this challenge and produce good-quality data. The model targets sensors monitoring water quality; DBSCAN clustering and weather sensors are used to build it. An extensive study has been conducted on the relationship between the data of weather sensors and of sensors monitoring the water quality of lakes and beaches, and a detailed theoretical analysis is presented of the correlation between the independent data streams of the two sets of sensors. With the help of this analysis and DBSCAN, a data quality model is prepared. The model encompasses five dimensions of data quality: outlier detection and removal, completeness, patterns of missing values, accuracy (checked with the help of cluster positions), and consistency. Finally, a statistical analysis is performed on the clusters formed by DBSCAN, and consistency is evaluated through the Coefficient of Variation (CoV).
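The core of such a model (density-based outlier flagging with DBSCAN, plus a CoV consistency check on each cluster) can be sketched as follows; this is an illustrative minimal implementation with hypothetical parameter values, not the authors' code:

```python
import math

def dbscan(points, eps, min_pts):
    """Label points: cluster id >= 0, or -1 for noise (candidate outlier)."""
    labels = [None] * len(points)

    def neighbors(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1  # sparse region: flag as abnormal reading
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point reached from a core point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            more = neighbors(j)
            if len(more) >= min_pts:
                queue.extend(more)   # core point: keep expanding the cluster
    return labels

def cov(values):
    """Coefficient of Variation: consistency measure for a cluster's readings."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return math.sqrt(var) / mean
```

A low CoV within a cluster would indicate consistent sensor readings, while points labelled -1 are candidates for the outlier-removal dimension.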

Keywords: clustering, data quality, DBSCAN, and Internet of things (IoT)

Procedia PDF Downloads 125
10771 Detecting the Blood of Femoral and Carotid Artery of Swine Using Photoacoustic Tomography in-vivo

Authors: M. Y. Lee, S. H. Park, S. M. Yu, H. S. Jo, C. G. Song

Abstract:

Photoacoustic imaging is an imaging technology that combines optical imaging with ultrasound. It provides the high contrast of optical imaging together with the high resolution of ultrasound. For these reasons, many studies have experimented with applying this method to various diagnoses. We developed a real-time photoacoustic tomography (PAT) system using a linear ultrasound transducer. In this study, we conducted an experiment on swine to detect the blood of the carotid and femoral arteries. We measured the blood of the femoral and carotid arteries of swine and reconstructed the image using a 950 nm wavelength, chosen for the HbO₂ absorption coefficient. The photoacoustic image was overlaid with the ultrasound image in order to match the positions. Since arterial blood is composed mainly of HbO₂, this result shows that arterial blood can be measured with the proposed system.

Keywords: photoacoustic tomography, swine artery, carotid artery, femoral artery

Procedia PDF Downloads 238
10770 Multi-Level Priority Based Task Scheduling Algorithm for Workflows in Cloud Environment

Authors: Anju Bala, Inderveer Chana

Abstract:

Task scheduling is the key concern for the execution of performance-driven workflow applications. As efficient scheduling can have a major impact on system performance, task scheduling is often chosen for assigning requests to resources in an efficient way based on cloud resource characteristics. In this paper, a priority-based task scheduling algorithm is proposed that prioritizes tasks based on the length of their instructions. The proposed approach prioritizes the tasks of cloud applications according to limits set by six sigma control charts based on dynamic threshold values. The proposed algorithm has been validated through the CloudSim toolkit. The experimental results demonstrate that it is effective for handling multiple task lists from workflows and considerably reduces makespan and execution time.
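Length-based priority banding with control-chart limits might look like the following sketch; the three-band scheme and the ±1σ thresholds are illustrative assumptions, as the abstract does not give the exact banding:

```python
import statistics

def priority_levels(task_lengths):
    """Assign multi-level priorities from control-chart style limits.

    Hypothetical scheme: tasks below the lower band (mean - sigma) get the
    highest priority (0), tasks within the band get priority 1, and tasks
    above the upper band (mean + sigma) get the lowest priority (2).
    """
    mean = statistics.fmean(task_lengths)
    sigma = statistics.pstdev(task_lengths)

    def level(length):
        if length <= mean - sigma:
            return 0   # short task: highest priority
        if length <= mean + sigma:
            return 1   # typical task
        return 2       # long task: lowest priority

    return [level(t) for t in task_lengths]
```

Recomputing the mean and sigma as new tasks arrive would give the dynamic thresholds the abstract mentions.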

Keywords: cloud computing, priority based scheduling, task scheduling, VM allocation

Procedia PDF Downloads 503
10769 Comprehensive Studio Tables: Improving Performance and Quality of Student's Work in Architecture Studio

Authors: Maryam Kalkatechi

Abstract:

Architecture students spend most of their qualitative time in studios during their years of study. The studio table, as a piece of furniture, is important because it elevates the quality of projects and positively influences student productivity. This paper first describes the aspects considered in designing a comprehensive studio table and later details each aspect. Comprehensive studio tables are meant to transform the studio space into an efficient yet immersive place of learning, collaboration, and participation. One aspect of these tables is a surface that transforms into a place accommodating design conversations; another is an efficient interactive platform for tools. The discussion factors of the comprehensive studio include: the setting of workspaces, the arrangement of the tables, the collaboration aspects of the studio, and the studio display and lighting shaped by the tables.

Keywords: studio tables, student performance, productivity, hologram, 3D printer

Procedia PDF Downloads 175
10768 Formulation Policy of Criminal Sanction in Indonesian Criminal Justice System

Authors: Dini Dewi Heniarti

Abstract:

One of the criminal sanctions most often imposed by judges is imprisonment. The imposition of imprisonment has been the subject of contentious debate and criticism among various groups for a long time. In practice, imprisonment leads to complicated problems. The impacts of its reckless imposition include overcapacity of correctional institutions and increasing crime within correctional facilities. Therefore, there is a need for renewal of the existing condemnation paradigm, considering the developing phenomena associated with penal imposition. Imprisonment, as one element of the Indonesian penal system, is an important and integral part of the other elements. The philosophy of the current penal system, which still refers to the Criminal Code, still carries values of retaliation and fault-finding toward the offender. Therefore, it is important to construct a new line of thought in order to realize a penal system that is represented in the formulation of a more humanistic criminal sanction.

Keywords: criminal code, criminal sanction, Indonesian legal system, reconstruction of thought

Procedia PDF Downloads 217
10767 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques that combine several individual forecasts to produce an overall, presumably better, forecast. Some simple ensemble modeling techniques exist in the literature: for instance, Model Output Statistics (MOS) and running mean-bias removal are widely used in the storm surge prediction domain. However, these methods have drawbacks; for instance, MOS is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced ones. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that creates a better forecast of sea level using several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether ensemble models perform better than any single forecast; to do so, we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights when combining different forecast models. Third, we use these ensembles to forecast storm surge levels and compare them with several existing models in the literature. We then investigate whether developing a complex ensemble model is indeed needed; to achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark.
Predicting the peak surge level during a storm, as well as the precise time at which this peak takes place, is crucial; thus, we develop a statistical platform to compare the performance of various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its timing. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, we consider them a single contiguous hurricane event. The data set used for this study was generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
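A correlation-weighted combination of member forecasts can be sketched as follows; this is an illustrative in-sample version with hypothetical data, not the authors' implementation (which would fit weights on a training period and handle the weighting differently):

```python
import statistics

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def correlation_weighted_ensemble(forecasts, observed):
    """Combine member forecasts, weighting each by its correlation with
    the observations (negatively correlated members get zero weight).
    Assumes at least one member is positively correlated."""
    weights = [max(pearson(f, observed), 0.0) for f in forecasts]
    total = sum(weights)
    return [sum(w * f[i] for w, f in zip(weights, forecasts)) / total
            for i in range(len(observed))]
```

A simple-average benchmark, as used in the paper, corresponds to setting all weights equal.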

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 298
10766 Development of New Technology Evaluation Model by Using Patent Information and Customers' Review Data

Authors: Kisik Song, Kyuwoong Kim, Sungjoo Lee

Abstract:

Many global firms and corporations derive new technologies and opportunities by identifying vacant technology areas through patent analysis. However, previous studies failed to focus on technologies that promised continuous growth in industrial fields, and most studies that derive new technology opportunities do not test practical effectiveness. Since previous studies depended on expert judgment, evaluating new technologies based on patent analysis has been costly and time-consuming. Therefore, this research suggests a quantitative and systematic approach to technology evaluation indicators using patent data and review data from customer communities. The first step involves collecting these two types of data, which are then used to construct evaluation indicators and to apply these indicators to the evaluation of new technologies. This type of data mining allows a new method of technology evaluation and a better predictor of how new technologies are adopted.

Keywords: data mining, evaluating new technology, technology opportunity, patent analysis

Procedia PDF Downloads 360
10765 Eu³⁺ Ions Doped-SnO₂ for Effective Degradation of Malachite Green Dye

Authors: Ritu Malik, Vijay K. Tomer, Satya P. Nehra, Anshu Nehra

Abstract:

Visible-light-sensitive Eu³⁺-doped SnO₂ nanoparticles were successfully synthesized via the hydrothermal method and extensively characterized by a combination of X-ray diffraction (XRD), field emission scanning electron microscopy (FESEM), and N₂ adsorption-desorption isotherms (BET). Their photocatalytic activities were evaluated using Malachite Green (MG) as the decomposition target while varying the concentration of Eu³⁺ in SnO₂. The XRD analysis showed that no lanthanide phase was observed at lower loadings of Eu³⁺ in SnO₂. Eu³⁺ ions can enhance the photocatalytic activity of SnO₂ to some extent compared with pure SnO₂, and 3 wt% Eu³⁺-doped SnO₂ was found to be the most effective photocatalyst due to its lowest band gap and crystallite size and its highest surface area. The photocatalytic tests indicate that at the optimum conditions, an illumination time of 40 min, pH 6.5, 0.3 g/L photocatalyst loading, and 50 ppm dye concentration, the dye removal efficiency was 98%.
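The reported removal efficiency follows directly from the initial and residual dye concentrations; the pseudo-first-order rate constant below is a common companion calculation in photocatalysis studies and is an assumption here, not something the abstract states:

```python
import math

def removal_efficiency(c0, ct):
    """Percent dye removed, from initial (c0) and residual (ct)
    concentrations in the same units (e.g. ppm)."""
    return 100.0 * (c0 - ct) / c0

def pseudo_first_order_k(c0, ct, minutes):
    """Apparent rate constant k (1/min), assuming ln(C0/C) = k*t
    Langmuir-Hinshelwood-type kinetics (an assumption)."""
    return math.log(c0 / ct) / minutes
```

For the reported conditions, 98% removal of a 50 ppm solution corresponds to a residual concentration of 1 ppm.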

Keywords: photocatalyst, visible light, lanthanide, SnO₂

Procedia PDF Downloads 270
10764 The Association of Cone-Shaped Epiphysis and Poland Syndrome: A Case Report

Authors: Mohammad Alqattan, Tala Alkhunani, Reema Al, Aldawish, Felwa Almurshard, Abdullah Alzahrani

Abstract:

Poland's syndrome is a congenital anomaly with two clinical features: unilateral agenesis of the pectoralis major and ipsilateral hand symbrachydactyly. Case presentation: We report a rare case of bilateral Poland's syndrome with several unique features. Discussion: Poland's syndrome is thought to be due to a vascular insult to the subclavian axis around the 6th week of gestation. Our patient has multiple rare and unique features of Poland's syndrome. Conclusion: To the best of our knowledge, this is the first report in the literature to associate Poland's syndrome with cone-shaped epiphyses of the metacarpals of all fingers. Bilaterality, cleft hand deformity, and dextrocardia were also rare features in our patient.

Keywords: Poland's syndrome, cleft hand deformity, bilaterality, dextrocardia, cone-shaped epiphysis

Procedia PDF Downloads 111
10763 Design of Membership Ranges for Fuzzy Logic Control of Refrigeration Cycle Driven by a Variable Speed Compressor

Authors: Changho Han, Jaemin Lee, Li Hua, Seokkwon Jeong

Abstract:

The design of membership function ranges in fuzzy logic control (FLC) is presented for robust control of a variable speed refrigeration system (VSRS). The criterion values for the membership function ranges can be derived from static experimental data, and two different sets of values are offered to compare control performance. Simulations and real experiments on the VSRS were conducted to verify the validity of the designed membership functions. The experimental results showed good agreement with the simulation results, and the error change rate and its sampling time strongly affected the control performance in the transient state of the VSRS.
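One way such membership function ranges might be parameterized is sketched below; the triangular shape, the three-partition layout, and the symmetric criterion-scaled ranges are illustrative assumptions, since the abstract does not specify the membership shapes:

```python
def triangular(a, b, c):
    """Return a triangular membership function with feet a, c and peak b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

def scaled_partitions(criterion, n=3):
    """Partition [-criterion, +criterion] into n overlapping triangles,
    e.g. for the error and error-change-rate inputs of an FLC. The
    criterion value would come from static experimental data."""
    step = 2 * criterion / (n - 1)
    centers = [-criterion + i * step for i in range(n)]
    return [triangular(c - step, c, c + step) for c in centers]
```

Comparing two criterion values then amounts to calling `scaled_partitions` with each and evaluating the resulting control performance.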

Keywords: variable speed refrigeration system, fuzzy logic control, membership function range, control performance

Procedia PDF Downloads 251
10762 Automatic Number Plate Recognition System Based on Deep Learning

Authors: T. Damak, O. Kriaa, A. Baccar, M. A. Ben Ayed, N. Masmoudi

Abstract:

In the last few years, Automatic Number Plate Recognition (ANPR) systems have become widely used in safety, security, and commercial applications. Accordingly, several methods and techniques have been developed to achieve better levels of accuracy and real-time execution. This paper proposes a computer vision algorithm for Number Plate Localization (NPL) and Character Segmentation (CS). In addition, it proposes an improved method for Optical Character Recognition (OCR) based on deep learning (DL) techniques. To recognize the characters of the detected plate after the NPL and CS steps, a Convolutional Neural Network (CNN) algorithm is proposed. A DL model is developed using four convolution layers, two max-pooling layers, and six fully connected layers. The model was trained on a digit image database on the Jetson TX2 NVIDIA target. The achieved accuracy is 95.84%.
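The two building blocks of the convolutional part of such a model, convolution and max pooling, can be illustrated in plain Python; this is a didactic sketch of the operations themselves, not the trained four-layer model described above:

```python
def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation, as used in CNN layers)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)] for i in range(out_h)]

def maxpool2d(fmap, size=2):
    """Non-overlapping max pooling with stride == size, as used to
    downsample feature maps between convolution layers."""
    return [[max(fmap[i + di][j + dj]
                 for di in range(size) for dj in range(size))
             for j in range(0, len(fmap[0]) - size + 1, size)]
            for i in range(0, len(fmap) - size + 1, size)]
```

In the described architecture, four such convolution stages and two pooling stages would precede the six fully connected layers that produce the character class scores.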

Keywords: ANPR, CS, CNN, deep learning, NPL

Procedia PDF Downloads 290
10761 Using Google Distance Matrix Application Programming Interface to Reveal and Handle Urban Road Congestion Hot Spots: A Case Study from Budapest

Authors: Peter Baji

Abstract:

In recent years, a growing body of literature has emphasized the increasingly negative impacts of urban road congestion on the everyday life of citizens. Although there are different responses from the public sector to decrease traffic congestion in urban regions, the most effective public intervention is congestion charging. Because travel is an economic asset, its consumption can be controlled effectively by extra taxes or prices, but this demand-side intervention is often unpopular. Measuring traffic flows with different methods has a long history in transport science, but until recently there was not sufficient data for evaluating road traffic flow patterns at the scale of the entire road system of a larger urban area. European cities (e.g., London, Stockholm, Milan) in which congestion charges have already been introduced designated a particular downtown zone for charging, but this protects only the users and inhabitants of the CBD (Central Business District) area. Using Google Maps data as a resource for revealing urban road traffic flow patterns, this paper aims to provide a solution for a fairer and smarter congestion pricing method in cities. The case study area of the research contains three bordering districts of Budapest which are linked by one main road. The first district (5th) is the original downtown that is affected by the congestion charge plans of the city. The second district (13th) lies in the transition zone and has recently been transformed into a new CBD containing the biggest office zone in Budapest. The third district (4th) is a mainly residential area on the outskirts of the city. The raw data of the research were collected with the help of Google's Distance Matrix API (Application Programming Interface), which provides estimated future traffic data via travel times between freely fixed coordinate pairs.
From the difference between free-flow and congested travel time data, the daily congestion patterns and hot spots are detectable on all measured roads within the area. The results suggest that the distribution of congestion peak times and hot spots is uneven in the examined area; however, there are frequently congested areas which lie outside the downtown, and their inhabitants also need some protection. The conclusion of this case study is that cities can develop a real-time, place-based congestion charge system that encourages car users to avoid frequently congested roads by changing their routes or travel modes. This would be a fairer solution for decreasing the negative environmental effects of urban road transportation than protecting a very limited downtown area.
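The free-flow versus congested comparison maps naturally onto the `duration` and `duration_in_traffic` fields the Distance Matrix API returns per element; the congestion-index threshold and the segment data below are illustrative assumptions, not values from the study:

```python
def congestion_index(free_flow_s, in_traffic_s):
    """Ratio of congested to free-flow travel time; 1.0 means no delay."""
    return in_traffic_s / free_flow_s

def hot_spots(segments, threshold=1.5):
    """Pick road segments whose congestion index exceeds a threshold.

    `segments` maps a segment name to (duration, duration_in_traffic)
    in seconds, as returned per element by the Distance Matrix API.
    The 1.5 threshold is a hypothetical cut-off for illustration.
    """
    return sorted(name for name, (free, traffic) in segments.items()
                  if congestion_index(free, traffic) >= threshold)
```

Repeating this for every time-of-day sample of each segment would yield the daily congestion pattern the paper describes.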

Keywords: Budapest, congestion charge, distance matrix API, application programming interface, pilot study

Procedia PDF Downloads 181
10760 Sensorless Controller of Induction Motor Using Backstepping Approach and Fuzzy MRAS

Authors: Ahmed Abbou

Abstract:

This paper presents a sensorless controller designed via the backstepping approach for the speed control of an induction motor. In this control strategy, we also combine a fuzzy MRAS method to estimate the rotor speed with a Luenberger-type observer to observe the rotor flux. The control model involves a division by the flux variable, which may lead to unbounded solutions. Such a risk is avoided by basing the controller design on a Lyapunov function that accounts for the model singularity. Moreover, this mixed method gives better results in sensorless operation, especially at low speed. The response time at 5% of the flux is 20 ms, while the error between the measured speed and the estimated speed remains in the range of ±0.8 rad/s for rated operation and ±1.5 rad/s at low speed.

Keywords: backstepping approach, fuzzy logic, induction motor, Luenberger observer, sensorless MRAS

Procedia PDF Downloads 360
10759 Study on the Factors That Cause Malaysian Oil and Gas Services Equipment (OGSE) Companies to Be Underdeveloped

Authors: Low Khee Wai

Abstract:

Loss of opportunity by Malaysian Oil and Gas Services Equipment (OGSE) companies can be a major issue in developing and sustaining Malaysia's own oil and gas industry. Despite the rapid growth of the oil and gas industry in Malaysia over the past 40 years, Malaysia has still not developed sufficient OGSE companies to support its own industry. In examining this scenario, this study aims to identify the factors causing the under-development of OGSE companies in Malaysia. A conceptual review method was used to analyse these factors. The four factors identified were time, cost, human resources, and stakeholder management. This survey explains the phenomena and challenges of the industry and translates them into the factors that cause the under-development of OGSE companies in Malaysia. Finally, it should bring awareness to the government, authorities, and stakeholders in order to improve the ecology of the oil and gas industry in Malaysia.

Keywords: oil & gas in Malaysia, Malaysia local oil & gas services equipment (OGSE), oil & gas project management, project performance

Procedia PDF Downloads 121
10758 Field Performance of Cement Treated Bases as a Reflective Crack Mitigation Technique for Flexible Pavements

Authors: Mohammad R. Bhuyan, Mohammad J. Khattak

Abstract:

Deterioration of flexible pavements due to crack reflection from the soil-cement base layer is a major concern around the globe. The service life of a flexible pavement diminishes significantly because of reflective cracks, and highway agencies have struggled for decades to prevent or mitigate these cracks in order to increase pavement service lives. The root cause of reflective cracks is the shrinkage cracking that occurs in soil-cement bases during the cement hydration process. The primary factor causing the shrinkage is the cement content of the soil-cement mixture. With increasing cement content, the soil-cement base gains the strength and durability necessary to withstand traffic loads, but at the same time, higher cement content creates more shrinkage, resulting in more reflective cracks in pavements. Historically, various US states have used soil-cement bases for constructing flexible pavements. The state of Louisiana (USA) has used 8 to 10 percent cement content to manufacture soil-cement bases. Such traditional soil-cement bases yield a 2.0 MPa (300 psi) 7-day compressive strength and are termed cement stabilized design (CSD). As these CSD bases generate significant reflective cracks, another soil-cement base design, termed cement treated design (CTD), has been utilized with 4 to 6 percent cement content, yielding a 1.0 MPa (150 psi) 7-day compressive strength. The reduced cement content of CTD bases is expected to minimize shrinkage cracks, thus increasing pavement service lives. Hence, this research study evaluates the long-term field performance of CTD bases with respect to CSD bases used in flexible pavements. The Pavement Management System of the state of Louisiana was utilized to select flexible pavement projects with CSD and CTD bases that had good historical records and time-series distress performance data.
It should be noted that the state collects roughness and distress data for each 1/10th-mile section every two years. In total, 120 CSD and CTD projects were analyzed in this research, in which more than 145 miles (CTD) and 175 miles (CSD) of roadway data were accepted for performance evaluation and benefit-cost analyses. Here, the service life extension and the area under the distress performance curves were considered as benefits. It was found that CTD bases added 1 to 5 years of pavement service life based on transverse cracking as compared to CSD bases, while service lives based on longitudinal and alligator cracking, rutting, and roughness index remained the same. Hence, CTD bases provide some service life extension (2.6 years, on average) for the controlling distress, transverse cracking, while being less expensive due to their lower cement content. Consequently, CTD bases are 20% more cost-effective than traditional CSD bases when both are compared by the net benefit-cost ratio obtained from all distress types.
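The cost-effectiveness comparison reduces to benefit-cost ratios; the helper below is a generic sketch, and the numbers used in the example are illustrative placeholders, not the study's data:

```python
def benefit_cost_ratio(benefit, cost):
    """Plain benefit-cost ratio for one base design, where the benefit
    aggregates service-life extension and performance-curve area."""
    return benefit / cost

def relative_cost_effectiveness(bcr_a, bcr_b):
    """Percent by which design A is more cost-effective than design B."""
    return 100.0 * (bcr_a - bcr_b) / bcr_b
```

With hypothetical ratios of 1.2 for CTD and 1.0 for CSD, CTD would come out 20% more cost-effective, matching the shape of the reported comparison.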

Keywords: cement treated base, cement stabilized base, reflective cracking, service life, flexible pavement

Procedia PDF Downloads 156
10757 Integrating Multiple Types of Value in Natural Capital Accounting Systems: Environmental Value Functions

Authors: Pirta Palola, Richard Bailey, Lisa Wedding

Abstract:

Societies and economies worldwide fundamentally depend on natural capital. Alarmingly, natural capital assets are quickly depreciating, posing an existential challenge for humanity. The development of robust natural capital accounting systems is essential for transitioning towards sustainable economic systems and ensuring sound management of capital assets. However, the accurate, equitable and comprehensive estimation of natural capital asset stocks and their accounting values still faces multiple challenges. In particular, the representation of socio-cultural values held by groups or communities has arguably been limited, as to date, the valuation of natural capital assets has primarily been based on monetary valuation methods and assumptions of individual rationality. People relate to and value the natural environment in multiple ways, and no single valuation method can provide a sufficiently comprehensive image of the range of values associated with the environment. Indeed, calls have been made to improve the representation of multiple types of value (instrumental, intrinsic, and relational) and diverse ontological and epistemological perspectives in environmental valuation. This study addresses this need by establishing a novel valuation framework, Environmental Value Functions (EVF), that allows for the integration of multiple types of value in natural capital accounting systems. The EVF framework is based on the estimation and application of value functions, each of which describes the relationship between the value and quantity (or quality) of an ecosystem component of interest. In this framework, values are estimated in terms of change relative to the current level instead of calculating absolute values. Furthermore, EVF was developed to also support non-marginalist conceptualizations of value: it is likely that some environmental values cannot be conceptualized in terms of marginal changes. 
For example, ecological resilience value may, in some cases, be best understood as a binary: it either exists (1) or is lost (0). In such cases, a logistic value function may be used as the discriminator. Uncertainty in the value function parameterization can be considered through, for example, Monte Carlo sampling analysis. The use of EVF is illustrated with two conceptual examples. For the first time, EVF offers a clear framework and concrete methodology for the representation of multiple types of value in natural capital accounting systems, simultaneously enabling 1) the complementary use and integration of multiple valuation methods (monetary and non-monetary); 2) the synthesis of information from diverse knowledge systems; 3) the recognition of value incommensurability; 4) marginalist and non-marginalist value analysis. Furthermore, with this advancement, the coupling of EVF and ecosystem modeling can offer novel insights to the study of spatial-temporal dynamics in natural capital asset values. For example, value time series can be produced, allowing for the prediction and analysis of volatility, long-term trends, and temporal trade-offs. This approach can provide essential information to help guide the transition to a sustainable economy.
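The logistic discriminator and the Monte Carlo treatment of parameter uncertainty mentioned above can be sketched as follows; the parameter names and values are illustrative assumptions, not the paper's calibration:

```python
import math
import random

def logistic_value(q, midpoint, steepness):
    """Logistic EVF: value approaches 1 above the midpoint quantity and
    0 below it, approximating a binary (exists / lost) value such as
    ecological resilience."""
    return 1.0 / (1.0 + math.exp(-steepness * (q - midpoint)))

def monte_carlo_value(q, midpoint_mu, midpoint_sd, steepness,
                      n=10000, seed=1):
    """Mean value under Gaussian uncertainty in the midpoint parameter,
    via Monte Carlo sampling."""
    rng = random.Random(seed)
    draws = (logistic_value(q, rng.gauss(midpoint_mu, midpoint_sd),
                            steepness)
             for _ in range(n))
    return sum(draws) / n
```

Sharper steepness values push the function toward the strict binary case, while the Monte Carlo mean quantifies how uncertainty in the threshold blurs it.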

Keywords: economics of biodiversity, environmental valuation, natural capital, value function

Procedia PDF Downloads 185
10756 A Simple Light-Outcoupling Enhancement Method for Organic Light-Emitting Diodes

Authors: Ho-Nyeon Lee

Abstract:

We propose using a gradual-refractive-index dielectric (GRID) as a simple and efficient light-outcoupling method for organic light-emitting diodes (OLEDs). Using simple GRIDs, we could improve the light-outcoupling efficiency of OLEDs without relying on difficult nano-patterning processes. Through numerical simulations using a finite-difference time-domain (FDTD) method, the feasibility of the GRID structure was examined and the design parameters were extracted. The outcoupling enhancement due to the GRIDs was verified through extensive experimental work. The GRIDs were applied to both bottom-emission and top-emission OLEDs: for bottom-emission OLEDs, the efficiency was improved by more than 20%, and for top-emission OLEDs, by more than 40%. The detailed numerical and experimental results will be presented at the conference.

Keywords: efficiency, GRID, light outcoupling, OLED

Procedia PDF Downloads 411
10755 New Desiccant Solar Unit for Air Conditioning and Desalination: Study of the Compartments of Desalination and Water Condensation

Authors: Zied Guidara, Alexander Morgenstern, Aref Maalej

Abstract:

In this paper, a new desiccant solar unit for air conditioning and desalination is presented first. Secondly, a dynamic modelling study of the desiccant wheel is developed. After that, a simulation study and an experimental investigation of the behaviour of the desiccant wheel are presented. The experimental investigation was carried out at the Chamber of Commerce in Freiburg, Germany. The variations of the calculated and measured temperatures and specific humidities of the dehumidified and rejected air are presented, and good agreement is found when comparing the model predictions with experimental data over the considered range of operating conditions. Finally, the study of the desalination and water condensation compartments shows that the unit can produce an acceptable quantity of water at the same time as the air-conditioning operation.

Keywords: air conditioning, desalination, condensation, design, desiccant wheel, modelling, experimental investigation

Procedia PDF Downloads 372
10754 Dynamic Exergy Analysis for the Built Environment: Fixed or Variable Reference State

Authors: Valentina Bonetti

Abstract:

Exergy analysis successfully helps optimize processes in various sectors. In the built environment, a second-law approach can enhance potential interactions between constructions and their surrounding environment and minimise fossil fuel requirements. Despite the research done in this field in recent decades, practical applications are rare, and few integrated exergy simulators are available for building designers. Undoubtedly, one obstacle to the diffusion of exergy methods is the strong dependency of results on the definition of the 'reference state', a highly controversial issue. Since exergy is the combination of energy and entropy by means of a reference state (also called 'reference environment' or 'dead state'), the reference choice is crucial. Compared to other classical applications, buildings present two challenging elements: they operate very near the reference state, which means that small variations have relevant impacts, and their behaviour is dynamical in nature. Not surprisingly, then, the reference state definition for the built environment is still debated, especially in the case of dynamic assessments. Among the several characteristics that need to be defined, a crucial decision for a dynamic analysis is between a fixed reference environment (constant in time) and a variable state whose fluctuations follow the local climate. Even if the latter selection prevails in research and is recommended by recent and widely diffused guidelines, the fixed reference has been analytically demonstrated to be the only choice that defines exergy as a proper function of state in a fluctuating environment. This study investigates the impact of that crucial choice: fixed or variable reference. The basic element of the building energy chain, the envelope, is chosen as the object of investigation, as it is common to any building analysis.
Exergy fluctuations in the building envelope of a case study (a typical house located in a Mediterranean climate) are compared at each time-step of a significant summer day, when the building behaviour is highly dynamical. Since exergy efficiencies and fluxes are not familiar numbers, the easier-to-visualize concept of exergy storage is used to summarize the results. Trends obtained with a fixed and a variable reference (outside air) are compared, and their meaning is discussed in light of the underpinning dynamical energy analysis. In conclusion, a fixed reference state is considered the best choice for dynamic exergy analysis. Even though the fixed reference is generally contemplated only as the simpler selection, and the variable state is often claimed to be more accurate without explicit justification, the analytical considerations supporting the adoption of a fixed reference are confirmed by the usefulness and clarity of interpretation of its results. Further discussion is needed to address the conflict between the evidence supporting a fixed reference state and the wide adoption of a fluctuating one. A more robust theoretical framework, including selection criteria for the reference state in dynamical simulations, could push the development of integrated dynamic tools and thus spread exergy analysis for the built environment across common practice.
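Since exergy combines energy and entropy through the reference state, the sensitivity to that choice can be seen in the textbook expression for physical exergy (a standard form, not taken from this paper):

```latex
% Physical exergy relative to the reference (dead) state (T_0, p_0).
% With a fixed reference, T_0, H_0, S_0 are constants in time;
% with a variable reference, T_0 = T_0(t) and the dead-state
% properties H_0, S_0 themselves fluctuate with the local climate.
Ex = (H - H_0) - T_0\,(S - S_0)
```

With a fixed reference the dead-state terms are constants and Ex behaves as a proper state function; with a variable reference every term on the right fluctuates, which is precisely the crux of the debate described above.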

Keywords: exergy, reference state, dynamic, building

Procedia PDF Downloads 210
10753 Application of Genetic Programming for Evolution of Glass-Forming Ability Parameter

Authors: Manwendra Kumar Tripathi, Subhas Ganguly

Abstract:

A few glass-forming ability expressions in terms of characteristic temperatures have been proposed in the literature. Attempts have been made to correlate these expressions with the critical diameter of the bulk metallic glass composition. However, with the advent of new alloys, many exceptions have been noted and reported. In the present approach, a genetic programming based code is developed which generates an expression in terms of the input variables, i.e., three characteristic temperatures, viz. the glass transition temperature (Tg), the onset crystallization temperature (Tx), and the offset temperature of melting (Tl), with maximum correlation with the critical diameter (Dmax). The evolved expression shows improved correlation with the critical diameter. In addition, the expression can be explained on the basis of the time-temperature-transformation curve.
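The idea of searching expression space over (Tg, Tx, Tl) for maximum correlation with Dmax can be sketched with a stdlib-only toy. Note the hedges: the dataset values below are illustrative (not measured data from the paper), and plain random search over expression trees stands in for the full genetic programming loop, whose crossover and mutation operators are not shown.

```python
import math
import random

# Illustrative (Tg, Tx, Tl, Dmax) tuples -- hypothetical numbers only,
# standing in for measured bulk-metallic-glass data.
DATA = [
    (625, 705, 1026, 10.0),
    (572, 623, 1057,  3.0),
    (683, 736, 1090, 12.0),
    (593, 652, 1003,  6.0),
    (655, 720, 1085,  9.0),
    (610, 660, 1130,  2.0),
]

OPS = [("+", lambda a, b: a + b), ("-", lambda a, b: a - b),
       ("*", lambda a, b: a * b), ("/", lambda a, b: a / b if b else 1e9)]
VARS = ["Tg", "Tx", "Tl"]

def random_tree(depth=3):
    """Grow a random expression tree over the characteristic temperatures."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(VARS)
    return (random.choice(OPS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, env):
    if isinstance(tree, str):          # leaf: a variable name
        return env[tree]
    (name, fn), left, right = tree     # internal node: an operator
    return fn(evaluate(left, env), evaluate(right, env))

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:             # constant expression: no correlation
        return 0.0
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def fitness(tree):
    """|Pearson correlation| between the expression value and Dmax."""
    xs = [evaluate(tree, {"Tg": g, "Tx": x, "Tl": l}) for g, x, l, _ in DATA]
    return abs(pearson(xs, [d for *_, d in DATA]))

random.seed(1)
best = max((random_tree() for _ in range(2000)), key=fitness)
print(f"best |correlation| with Dmax: {fitness(best):.3f}")
```

A real GP run would evolve a population with tournament selection, subtree crossover, and mutation instead of sampling trees independently, but the fitness definition (correlation with Dmax) is the same.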

Keywords: glass forming ability, genetic programming, bulk metallic glass, critical diameter

Procedia PDF Downloads 320
10752 Quantum Mechanics as a Branch of Black Hole Cosmology

Authors: U. V. S. Seshavatharam, S. Lakshminarayana

Abstract:

In a unified approach, the observed cosmic redshift can be re-interpreted as an index of a cosmological galactic atomic light emission phenomenon. By increasing the applications of the Hubble volume in cosmology as well as in quantum physics, the concepts of 'Black Hole Cosmology' can be well confirmed. Clearly speaking, 'quantum mechanics' can be shown to be a branch of 'black hole cosmology'. In the Big Bang model, confirmation of all observations depends directly on large-scale galactic distances that are beyond human reach and raise ambiguity in all respects. The subject of modern black hole physics is absolutely theoretical. The advantage of black hole cosmology lies in confirming its validity through ground-based atomic and nuclear experimental results.

Keywords: Hubble volume, black hole cosmology, CMBR energy density, Planck’s constant, fine structure ratio, cosmic time, nuclear charge radius, unification

Procedia PDF Downloads 555
10751 Analysis of Transformer by Gas and Moisture Sensor during Laboratory Time Monitoring

Authors: Miroslav Gutten, Daniel Korenciak, Milan Simko, Milan Chupac

Abstract:

Ensuring the reliable and correct functioning of transformers is the main purpose of on-line non-destructive diagnostic tools, which allow status parameters to be tracked accurately. Devices for on-line diagnostics are very costly. However, there are devices whose price is relatively low and which, when used correctly, can perform complex diagnostics. One of these devices is the HYDRAN M2 sensor, which is used to detect the moisture and gas content in the insulation oil. Using the HYDRAN M2 sensor in combination with temperature and load measurement and physicochemical analysis, an economical diagnostic system can be built whose use is not restricted to distribution transformers. This system was tested in an educational laboratory environment on a measured 22/0.4 kV oil transformer. From the conclusions presented in the article, it is possible to determine which kind of fault occurred in the transformer and what impact it had on the temperature, the evolution of gases, and the water content.

Keywords: transformer, diagnostics, gas and moisture sensor, monitoring

Procedia PDF Downloads 370
10750 An Axisymmetric Finite Element Method for Compressible Swirling Flow

Authors: Raphael Zanella, Todd A. Oliver, Karl W. Schulz

Abstract:

This work deals with the finite element approximation of axisymmetric compressible flows with swirl velocity. We are interested in problems where the flow, while weakly dependent on the azimuthal coordinate, may have a strong azimuthal velocity component. We describe the approximation of the compressible Navier-Stokes equations with H1-conformal spaces of axisymmetric functions. The weak formulation is implemented in a C++ solver with explicit time marching. The code is first verified with a convergence test on a manufactured solution. The verification is completed by comparing the numerical and analytical solutions in a Poiseuille flow case and a Taylor-Couette flow case. The code is finally applied to the problem of a swirling subsonic air flow in a plasma torch geometry.
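The verification step described above (a convergence test on a manufactured solution) is usually summarized by the observed convergence order between successive mesh refinements. A minimal sketch, with illustrative error values assumed rather than taken from the paper's solver:

```python
import math

# Hypothetical L2 errors from a manufactured-solution study at three
# successive mesh refinements (illustrative values, not the paper's data).
h      = [0.100, 0.050, 0.025]      # mesh sizes
errors = [4.1e-3, 1.0e-3, 2.6e-4]   # corresponding L2 errors

def observed_order(h, errors):
    """Observed convergence order p from e ~ C * h^p between refinements."""
    return [math.log(errors[i] / errors[i + 1]) / math.log(h[i] / h[i + 1])
            for i in range(len(h) - 1)]

orders = observed_order(h, errors)
print(", ".join(f"{p:.2f}" for p in orders))  # values near 2 indicate second order
```

If the observed orders match the theoretical order of the finite element space, the implementation of the discretized equations is considered verified.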

Keywords: axisymmetric problem, compressible Navier-Stokes equations, continuous finite elements, swirling flow

Procedia PDF Downloads 161
10749 Regional Dynamics of Innovation and Entrepreneurship in the Optics and Photonics Industry

Authors: Mustafa İlhan Akbaş, Özlem Garibay, Ivan Garibay

Abstract:

The economic entities in innovation ecosystems form various industry clusters, in which they compete and cooperate to survive and grow. Within a successful and stable industry cluster, the entities acquire different roles that complement each other in the system. Universities and research centers have been accepted to play a critical role in these systems for the creation and development of innovations. However, the real effect of research institutions on regional economic growth is difficult to assess. In this paper, we present our approach for identifying the impact of research activities on regional entrepreneurship for a specific high-tech industry: optics and photonics. Optics and photonics has been defined as an enabling industry, which combines high-tech photonics technology with the developing optics industry. The recent literature suggests that the growth of optics and photonics firms depends on three important factors: the embedded regional specializations in the labor market, the research and development infrastructure, and a dynamic small-firm network capable of absorbing new technologies, products, and processes. Therefore, the role of each factor and the dynamics among them must be understood to identify the requirements of entrepreneurship activities in the optics and photonics industry. There are three main contributions of our approach. Recent studies show that innovation in the optics and photonics industry is mostly located around metropolitan areas. There are also studies mentioning the importance of research center locations and universities in the regional development of the optics and photonics industry. These studies are mostly limited to the number of patents received within a short period of time or to limited survey results. Therefore, the first contribution of our approach is a comprehensive analysis of the state and recent history of photonics and optics research in the US. 
For this purpose, both the research centers specialized in optics and photonics and the related research groups in various departments of institutions (e.g., Electrical Engineering, Materials Science) are identified, and a geographical study of their locations is presented. The second contribution of the paper is an analysis of regional entrepreneurship activities in optics and photonics in recent years. We use the membership data of the International Society for Optics and Photonics (SPIE) and the regional photonics clusters to identify the optics and photonics companies in the US. Then the profiles and activities of these companies are gathered by extracting and integrating the related data from the National Establishment Time Series (NETS) database, the ES-202 database, and the data sets from the regional photonics clusters. The number of start-ups, their employee numbers, and their sales are some examples of the extracted data for the industry. Our third contribution is the utilization of the collected data to investigate the impact of research institutions on regional optics and photonics industry growth and entrepreneurship. In this analysis, the regional and periodical conditions of the overall market are taken into consideration while discovering and quantifying the statistical correlations.
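The correlation step at the core of the third contribution can be sketched with a Pearson coefficient over per-region counts. The numbers below are hypothetical placeholders; real inputs would come from the SPIE membership, NETS, and ES-202 extracts described above.

```python
import math

# Hypothetical per-region counts (illustrative only).
research_centers = [12, 3, 7, 1, 9, 4]    # optics/photonics research groups
startups         = [30, 8, 18, 2, 25, 9]  # optics/photonics start-ups

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

r = pearson(research_centers, startups)
print(f"research centers vs. start-ups: r = {r:.3f}")
```

The paper's analysis additionally controls for regional and periodical market conditions; a raw correlation like this is only the starting point.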

Keywords: entrepreneurship, industrial clusters, optics, photonics, emerging industries, research centers

Procedia PDF Downloads 394
10748 Load Balancing Algorithms for SIP Server Clusters in Cloud Computing

Authors: Tanmay Raj, Vedika Gupta

Abstract:

Cloud computing is today’s most popular breakthrough for its groundbreaking and substantial power. It is a sort of Internet-based computing that allows users to request and receive numerous services in a cost-effective manner. Virtualization, grid computing, and utility computing are the most widely employed emerging technologies in cloud computing, making it the most powerful. However, cloud computing still has a number of key challenges, such as security, load balancing, and non-critical failure adaptation, to name a few. The massive growth of cloud computing will put an undue strain on servers, and as a result, network performance will deteriorate. A good load balancing adjustment can make cloud computing more productive and increase client satisfaction. Load balancing is an important part of cloud computing because it prevents certain nodes from being overwhelmed while others are idle or have little work to perform. Response time, cost, throughput, performance, and resource usage are all parameters that may be improved using load balancing.
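A classic balancing policy for stateful workloads such as SIP sessions is least-connections, sketched minimally below. The server names and counter-based session tracking are hypothetical; a production SIP balancer would track live dialogs and handle failover, neither of which is shown here.

```python
# Minimal least-connections balancer sketch for a SIP server cluster.
class LeastConnectionsBalancer:
    def __init__(self, servers):
        # Active session count per server, all starting at zero.
        self.active = {s: 0 for s in servers}

    def route(self):
        """Send the new call to the server with the fewest active sessions."""
        server = min(self.active, key=self.active.get)
        self.active[server] += 1
        return server

    def release(self, server):
        """A call on this server ended; free one session slot."""
        self.active[server] -= 1

lb = LeastConnectionsBalancer(["sip-1", "sip-2", "sip-3"])
first = [lb.route() for _ in range(3)]  # three new calls spread across servers
lb.release("sip-2")                     # a call on sip-2 ends
nxt = lb.route()                        # sip-2 now has the fewest sessions
print(first, "next:", nxt)
```

Round-robin is simpler but ignores session length; least-connections adapts when some calls run much longer than others, which directly improves the response-time and resource-usage parameters mentioned above.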

Keywords: cloud computing, load balancing, computing, SIP server clusters

Procedia PDF Downloads 105
10747 Data about Loggerhead Sea Turtle (Caretta caretta) and Green Turtle (Chelonia mydas) in Vlora Bay, Albania

Authors: Enerit Sacdanaku, Idriz Haxhiu

Abstract:

This study was conducted in the area of Vlora Bay, Albania. Data on the sea turtles Caretta caretta and Chelonia mydas, belonging to two periods of time (1984–1991; 2008–2014), are given. All data gathered were analyzed using recent methodologies. For all turtles captured (as bycatch), the Curved Carapace Length (CCL) and Curved Carapace Width (CCW) were measured. These data were statistically analyzed; the mean was 67.11 cm for CCL and 57.57 cm for CCW over all individuals studied (n=13). All untagged individuals of marine turtles were tagged using metallic tags (Stockbrand’s titanium tag) with an Albanian address. Sex was determined, and it resulted that 45.4% of individuals were females, 27.3% males, and 27.3% juveniles. All turtles were examined for the presence of epibionts. The area of Vlora Bay is used by marine turtles (Caretta caretta) as a migratory corridor to pass from the Mediterranean to the northern part of the Adriatic Sea.

Keywords: Caretta caretta, Chelonia mydas, CCL, CCW, tagging, Vlora Bay

Procedia PDF Downloads 168
10746 A Hybrid MAC Protocol for Delay Constrained Mobile Wireless Sensor Networks

Authors: Hanefi Cinar, Musa Cibuk, Ismail Erturk, Fikri Aggun, Munip Geylani

Abstract:

Mobile Wireless Sensor Networks (MWSNs) carry heterogeneous data traffic with different urgency and quality of service (QoS) requirements. There are many studies in the literature on energy efficiency, bandwidth, and communication methods, but delay, throughput, and utility parameters are not well considered. The increasing demand for real-time data transfer makes these parameters more important. In this paper, we design a new delay-constrained MAC protocol that targets improved delay, utility, and throughput performance for the network and addresses collision and interference problems. The protocol improves on the QoS requirements by combining TDMA, FDM, and OFDMA communication methods with multi-channel communication.
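The TDMA side of such a hybrid MAC can be illustrated with a priority-weighted slot scheduler. This is a sketch under stated assumptions: the node names, priorities, and frame length are hypothetical, and the FDM/OFDMA channel-assignment dimension of the protocol is not shown.

```python
# Sketch: allocate TDMA slots within a frame proportionally to traffic
# priority, so delay-constrained nodes transmit more often per frame.
FRAME_SLOTS = 8

def schedule(nodes):
    """Return a frame (list of slot owners) weighted by node priority."""
    total = sum(prio for _, prio in nodes)
    frame = []
    for name, prio in sorted(nodes, key=lambda n: -n[1]):  # urgent first
        share = max(1, round(FRAME_SLOTS * prio / total))  # >= 1 slot each
        frame += [name] * share
    return frame[:FRAME_SLOTS]  # truncate if rounding overshoots

# (node, priority): higher priority = tighter delay constraint.
nodes = [("sensor-A", 3), ("sensor-B", 1), ("mobile-C", 4)]
frame = schedule(nodes)
print(frame)
```

Giving every node at least one slot per frame bounds worst-case access delay, while the proportional shares raise throughput for the urgent traffic classes.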

Keywords: MWSN, delay, hybrid MAC, TDMA, FDM, OFDMA

Procedia PDF Downloads 464
10745 Operator Efficiency Study for Assembly Line Optimization at Semiconductor Assembly and Test

Authors: Rohana Abdullah, Md Nizam Abd Rahman, Seri Rahayu Kamat

Abstract:

The operator efficiency aspect is gaining importance in ensuring optimized usage of resources, especially in the semi-automated manufacturing environment. This paper addresses a case study done to solve operator efficiency and line balancing issues at a semiconductor assembly and test manufacturer. A Man-to-Machine (M2M) work study technique is used to study current operator utilization and determine the optimum allocation of operators to machines. Critical factors such as operator activity, activity frequency, and operator competency level are considered to gain insight into the parameters that affect operator utilization. Equipment standard time and overall equipment efficiency (OEE) information are also gathered and analyzed to achieve a balanced and optimized production.
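The OEE figure mentioned above is conventionally the product of availability, performance, and quality. A minimal sketch of that standard decomposition, using hypothetical shift numbers rather than data from the case study:

```python
# OEE = Availability x Performance x Quality (standard decomposition).
def oee(planned_min, downtime_min, ideal_cycle_s, units_made, units_good):
    run_time_min = planned_min - downtime_min
    availability = run_time_min / planned_min             # uptime fraction
    performance = (ideal_cycle_s * units_made) / (run_time_min * 60)
    quality = units_good / units_made                     # first-pass yield
    return availability * performance * quality

# Hypothetical 8-hour shift: 60 min downtime, 2 s ideal cycle time.
value = oee(planned_min=480, downtime_min=60, ideal_cycle_s=2.0,
            units_made=11000, units_good=10700)
print(f"OEE = {value:.1%}")
```

Breaking OEE into its three factors is what makes it actionable: a low availability term points at downtime, a low performance term at slow cycles or micro-stops, and a low quality term at scrap and rework.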

Keywords: operator efficiency, optimized production, line balancing, industrial and manufacturing engineering

Procedia PDF Downloads 714
10744 Training for Search and Rescue Teams: Online Training for SAR Teams to Locate Lost Persons with Dementia Using Drones

Authors: Dalia Hanna, Alexander Ferworn

Abstract:

This research provides detailed proposed training modules for public safety teams and, specifically, for the SAR teams responsible for search and rescue operations related to finding lost persons with dementia. Finding a lost person alive is the goal of this training, and time matters if a lost person is to be found alive. Finding lost people living with dementia is quite challenging, as they are unaware they are lost and will not seek help. Even a small contribution to SAR operations could help save a life. SAR operations will always require expert professionals and human volunteers. However, we can reduce their time, save lives, and reduce costs by providing practical training that is based on real-life scenarios. The content of the proposed training is based on the research work done by the researcher in this area. This research has demonstrated that, by utilizing drones, the algorithmic approach can support a successful search outcome. Understanding the behavior of the lost person, learning where they may be found, predicting their survivability, and automating the search are all contributions of this work, founded in theory and demonstrated in practice. In crisis management, human behavior constitutes a vital aspect of responding to the crisis; the speed and efficiency of the response are often affected by the difficulty of the context of the operation. Therefore, training in this area plays a significant role in preparing the crisis manager to manage the emotional aspects that lead to decision-making in these critical situations. Since it is crucial to gain high-level strategic choices and the ability to apply crisis management procedures, simulation exercises become central in training crisis managers to gain the skills needed to respond critically to these events. 
The training will enhance the responders’ ability to make decisions and anticipate the possible consequences of their actions through flexible reasoning in responding to the crisis efficiently and quickly. As adult learners, search and rescue teams will approach training and learning by taking responsibility for the learning process, appreciating flexible learning, and contributing to the teaching and learning happening during that training. These are all characteristics of adult learning theories: the learner self-reflects, gathers information, collaborates with others, and is self-directed. One of the learning strategies associated with adult learning is effective elaboration. It helps learners remember information in the long term and use it in situations where it might be appropriate. It is also a strategy that can be taught easily and used with learners of different ages. Designers must design reflective activities to improve the students’ intrapersonal awareness.

Keywords: training, OER, dementia, drones, search and rescue, adult learning, UDL, instructional design

Procedia PDF Downloads 88
10743 eTransformation Framework for the Cognitive Systems

Authors: Ana Hol

Abstract:

Digital systems are in the cognitive wave of eTransformations and are now extensively aimed at meeting individuals’ demands, both those of customers requiring services and those of service providers. It is also apparent that successful future systems will not simply open doors for the traditional owners/users to offer and receive services, as Uber does today, but will in the future require more customized and cognitively enabled infrastructures that are responsive to the system user’s needs. To identify what is required for such systems, this research reviews the historical and current effects of the eTransformation process by studying: 1. eTransitions of company websites and mobile applications, 2. The emergence of new shared-economy business models such as Uber, and 3. New requirements for demand-driven, cognitive systems capable of learning and just-in-time decision making. Based on the analysis, this study proposes a Cognitive eTransformation Framework capable of guiding implementations of new responsive and user-aware systems.

Keywords: system implementations, AI supported systems, cognitive systems, eTransformation

Procedia PDF Downloads 223