Search results for: information Technology
639 Study of the Energy Efficiency of Buildings under Tropical Climate with a View to Sustainable Development: Choice of Material Adapted to the Protection of the Environment
Authors: Guarry Montrose, Ted Soubdhan
Abstract:
In the context of sustainable development and climate change, adapting buildings to the climatic context in hot climates is a necessity if we want to improve living conditions in housing and reduce the risks to the health and productivity of occupants caused by thermal discomfort in buildings. A wide variety of efficient solutions exists, but at high cost. Developing countries, especially tropical ones, need a technology with a very limited cost that is affordable for everyone, energy efficient and protective of the environment. Biosourced insulation is a product based on plant fibers, animal products or products from recyclable paper or clothing. Its development meets the objectives of maintaining biodiversity, reducing waste and protecting the environment. In tropical or hot countries, the aim is to protect the building from solar thermal radiation, a source of discomfort. This work is in line with the logic of energy control and environmental protection: the approach is to make the occupants of buildings comfortable, reduce their carbon dioxide (CO2) emissions and decrease their energy consumption (energy efficiency). We chose to study the thermo-physical properties of banana leaves and sawdust, especially their thermal conductivities; direct measurements were made using the flash method and the hot plate method. We also measured the heat flow on both sides of each sample by the hot box method. The results from these different experiments show that these materials are very effective when used as insulation. We also conducted a building thermal simulation with banana leaves as one of the materials, using Design Builder software. Air-conditioning load as well as CO2 release were used as performance indicators. When the air-conditioned building cell is protected on the roof by banana leaves, integrated into the walls, and fitted with solar protection of the glazing, it saves up to 64.3% of energy and avoids 57% of CO2 emissions.
Keywords: Plant fibers, tropical climates, sustainable development, waste reduction.
638 Factory Virtual Environment Development for Augmented and Virtual Reality
Authors: M. Gregor, J. Polcar, P. Horejsi, M. Simon
Abstract:
Machine visualization is an area of interest with fast and progressive development. We present a method of machine visualization that is applicable in real industrial conditions according to current needs and demands. Real factory data were obtained in a newly built research plant. The methods described in this paper were validated on a case study. Input data were processed and the virtual environment was created. The environment contains information about dimensions, structure, disposition, and function. The hardware was enhanced with modular machines, prototypes, and accessories. We added functionalities and machines into the virtual environment. The user is able to interact with objects such as testing and cutting machines and can operate and move them. The proposed design consists of an environment with two degrees of freedom of movement. Users are in touch with items in the virtual world, which are embedded into the real surroundings. This paper describes the development of the virtual environment. We compared and tested various options of factory layout virtualization and visualization. We analyzed the possibilities of using a 3D scanner in the layout acquisition process, as well as various virtual reality hardware visualization methods: stereoscopic (CAVE) projection, Head Mounted Display (HMD) and augmented reality (AR) projection provided by see-through glasses.
Keywords: Augmented reality, spatial scanner, virtual environment, virtual reality.
637 Real-time Performance Study of EPA Periodic Data Transmission
Authors: Liu Ning, Zhong Chongquan, Teng Hongfei
Abstract:
EPA (Ethernet for Plant Automation) resolves the non-deterministic problem of standard Ethernet and accomplishes real-time communication by means of a micro-segment topology and a deterministic scheduling mechanism. This paper studies the real-time performance of EPA periodic data transmission from theoretical and experimental perspectives. By analyzing information transmission characteristics and the EPA deterministic scheduling mechanism, five indicators that can be used to specify the real-time performance of EPA periodic data transmission are presented and investigated: delivery time, time synchronization accuracy, data-sending time offset accuracy, utilization percentage of the configured timeslice, and non-RTE bandwidth. On this basis, the test principles and test methods for these indicators are studied and some formulas for the real-time performance of an EPA system are derived. Furthermore, an experiment platform is developed to test the indicators of EPA periodic data transmission in a micro-segment. Based on the analysis and the experiment, methods to improve the real-time performance of EPA periodic data transmission are proposed, including optimizing the network structure, studying a self-adaptive adjustment method for the timeslice, and providing data-sending time offset accuracy for configuration.
Keywords: EPA system, Industrial Ethernet, Periodic data, Real-time performance
636 The Onset of Ironing during Casing Expansion
Authors: W. Assaad, D. Wilmink, H. R. Pasaribu, H. J. M. Geijselaers
Abstract:
Shell has developed a mono-diameter well concept for oil and gas wells as opposed to the traditional telescopic well design. A mono-diameter well design allows a well to have a single inner diameter from the surface all the way down to the reservoir, which increases production capacity, reduces material cost and reduces the environmental footprint. This is achieved by expanding the liners (casing strings) concerned using an expansion tool (e.g., a cone). Since the well is drilled in stages and liners are inserted to support the borehole, overlap sections between consecutive liners exist and must be expanded. At the overlap, the previously inserted casing, which can be expanded or unexpanded, is called the host casing, and the newly inserted casing is called the expandable casing. When the cone enters the overlap section, the expandable casing is expanded against the host casing, a cured cement layer and the formation. In overlap expansion, ironing, or lengthening, may appear in the expandable casing instead of shortening when the pressure exerted by the host casing, cured cement layer and formation exceeds a certain limit. This pressure is related to the cement strength, the thickness of the cement layer, the mechanical properties of the host casing material, the host casing thickness, the formation type and the formation strength. Ironing can cause complications that hinder the deployment of the technology; therefore, understanding ironing becomes essential. A physical model is built in-house to calculate expansion forces, stresses, strains and post-expansion casing dimensions under different conditions. In this study, only free casing expansion and overlap expansion of two casings are addressed, while the cement and formation will be incorporated in a future study. Since the axial strain can be predicted by the physical model, the onset of ironing can be confirmed. In addition, this model helps in understanding ironing and the parameters influencing it. Finally, the physical model is validated with Finite Element (FE) simulations and small-scale experiments. The results of the study confirm that high pressure leads to ironing when the casing is expanded in tension mode.
Keywords: Casing expansion, cement, formation, metal forming, plasticity, well design.
635 Influence of Microstructural Features on Wear Resistance of Biomedical Titanium Materials
Authors: Mohsin T. Mohammed, Zahid A. Khan, Arshad N. Siddiquee
Abstract:
The field of biomedical materials plays an imperative and critical role in manufacturing a variety of artificial biological replacements in the modern world. Recently, titanium (Ti) materials have been used as biomaterials because of their superior corrosion resistance, tremendous specific strength, freedom from allergic problems and the greatest biocompatibility compared to competing biomaterials such as stainless steel, Co-Cr alloys, ceramics, polymers, and composite materials. However, regardless of these excellent properties, implantable Ti materials have poor shear strength and wear resistance, which limits their applications as biomaterials. Even though the wear properties of Ti alloys have shown some improvement, the effective use of biomedical Ti alloys as wear components requires a comprehensive understanding of the causes and mechanisms of wear, and of the techniques that can be used to improve wear behavior. This review examines current information on the effect of thermal and thermomechanical processing of implantable Ti materials on the long-term prosthetic requirements related to wear behavior. The paper focuses mainly on the evolution, evaluation and development of effective microstructural features that can improve the wear properties of bio-grade Ti materials using thermal and thermomechanical treatments.
Keywords: Wear Resistance, Heat Treatment, Thermomechanical Processing, Biomedical Titanium Materials.
634 M2LGP: Mining Multiple Level Gradual Patterns
Authors: Yogi Satrya Aryadinata, Anne Laurent, Michel Sala
Abstract:
Gradual patterns have been studied for many years as they contain precious information. They have been integrated into many expert systems and rule-based systems, for instance to reason on knowledge such as “the greater the number of turns, the greater the number of car crashes”. In many cases, this knowledge has been considered as a rule “the greater the number of turns → the greater the number of car crashes”. Historically, work has thus focused on the representation of such rules, studying how implication could be defined, especially fuzzy implication. These rules were defined by experts who were in charge of describing the systems they were working on in order to make them operate automatically. More recently, approaches have been proposed to mine databases for automatically discovering such knowledge. Several approaches have been studied, the main scientific topics being: how to determine what a relevant gradual pattern is, and how to discover such patterns as efficiently as possible (in terms of both memory and CPU usage). However, in some cases, end-users are not interested in raw low-level knowledge, and are rather interested in trends. Moreover, it may be the case that no relevant pattern can be discovered at a low level of granularity (e.g., city), whereas some can be discovered at a higher level (e.g., county). In this paper, we thus extend gradual pattern approaches in order to consider multiple level gradual patterns. For this purpose, we consider two aggregation policies, namely horizontal and vertical.
Keywords: Gradual Pattern.
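A minimal sketch of the underlying idea, assuming a common concordant-pair definition of gradual-pattern support; the function name, the toy records and the attributes are illustrative placeholders, not the M2LGP algorithm itself.

```python
from itertools import combinations

def gradual_support(rows, attr_x, attr_y):
    """Fraction of object pairs that respect the ordering
    'the greater attr_x, the greater attr_y' (concordant pairs)."""
    pairs = list(combinations(rows, 2))
    if not pairs:
        return 0.0
    concordant = sum(
        1 for a, b in pairs
        if (a[attr_x] - b[attr_x]) * (a[attr_y] - b[attr_y]) > 0
    )
    return concordant / len(pairs)

# Toy records at a low level of granularity (e.g. cities); aggregating the same
# records to a higher level (e.g. counties) may reveal patterns hidden here.
cities = [
    {"turns": 3, "crashes": 5}, {"turns": 7, "crashes": 4},
    {"turns": 10, "crashes": 12}, {"turns": 15, "crashes": 9},
]
print(gradual_support(cities, "turns", "crashes"))  # support of the gradual pattern
```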
633 A Novel Receiver Algorithm for Coherent Underwater Acoustic Communications
Authors: Liang Zhao, Jianhua Ge
Abstract:
In this paper, we propose a novel receiver algorithm for coherent underwater acoustic communications. The proposed receiver is composed of three parts: (1) Doppler tracking and correction, (2) time reversal channel estimation and combining, and (3) joint iterative equalization and decoding (JIED). To reduce computational complexity and optimize the equalization algorithm, time reversal (TR) channel estimation and combining is adopted to simplify the multi-channel adaptive decision feedback equalizer (ADFE) into a single-channel ADFE without reducing system performance. At the same time, turbo theory is adopted to form a joint iterative ADFE and convolutional decoder (JIED). In the JIED scheme, the ADFE and the decoder exchange soft information in an iterative manner, which can enhance equalizer performance through decoding gain. The simulation results show that the proposed algorithm can reduce computational complexity and improve equalizer performance. Therefore, the performance of coherent underwater acoustic communications can be improved greatly.
Keywords: Underwater acoustic communication, Time reversal (TR) combining, joint iterative equalization and decoding (JIED)
632 Assessment of Obesity Parameters in Terms of Metabolic Age above and below Chronological Age in Adults
Authors: Orkide Donma, Mustafa M. Donma
Abstract:
The chronological age (CA) of individuals is closely related to obesity and generally affects the magnitude of obesity parameters. On the other hand, the close association between basal metabolic rate (BMR) and metabolic age (MA) is also a matter of concern. It is suggested that an MA higher than the CA indicates the need to improve the metabolic rate. In this study, the aim was to assess some commonly used obesity parameters, such as obesity degree, visceral adiposity, BMR and BMR-to-weight ratio, in several groups with varying differences between MA and CA values. The study comprises adults whose ages vary between 18 and 79 years. Four groups were constituted. Groups 1, 2, 3 and 4 were composed of 55, 33, 76 and 47 adults, respectively. Individuals exhibiting -1, 0 or +1 for their MA-CA values were included in Group 1, which was considered the control group. Those whose MA-CA values varied between -5 and -10 participated in Group 2. Those whose MAs were above their real ages were divided into two groups [Group 3 (MA-CA from +5 to +10) and Group 4 (MA-CA from +11 to +12)]. Body mass index (BMI) values were calculated. A TANITA body composition monitor using bioelectrical impedance analysis technology was used to obtain values for obesity degree, visceral adiposity, BMR and BMR-to-weight ratio. The compiled data were evaluated statistically using the SPSS statistical package. Mean ± SD values were determined and correlation analyses were performed. The statistical significance level was accepted as p < 0.05. The increase in BMR was positively correlated with obesity degree. The MAs and CAs of the groups were 39.9 ± 16.8 vs 39.9 ± 16.7 years for Group 1, 45.0 ± 15.3 vs 51.4 ± 15.7 years for Group 2, 47.2 ± 12.7 vs 40.0 ± 12.7 years for Group 3, and 53.6 ± 14.8 vs 42 ± 14.8 years for Group 4. BMI values of the groups were 24.3 ± 3.6 kg/m2, 23.2 ± 1.7 kg/m2, 30.3 ± 3.8 kg/m2, and 40.1 ± 5.1 kg/m2 for Groups 1, 2, 3 and 4, respectively. Values obtained for BMR were 1599 ± 328 kcal in Group 1, 1463 ± 198 kcal in Group 2, 1652 ± 350 kcal in Group 3, and 1890 ± 360 kcal in Group 4. A correlation was observed between BMR and MA-CA values in Group 1; no correlation was detected in the other groups. On the other hand, statistically significant correlations between MA-CA values and obesity degree, BMI as well as BMR-to-weight ratio were found in Group 3 and in Group 4. It was concluded that, upon consideration of these findings in terms of MA-CA values, the BMR-to-weight ratio is a much more useful indicator of a severe increase in obesity development than BMR. Also, the lack of association between MA and BMR as well as the BMR-to-weight ratio emphasizes the importance of considering MA-CA values rather than MA alone.
Keywords: Basal metabolic rate, chronologic age, metabolic age, obesity degree.
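A minimal sketch of the statistical workflow described (BMI calculation and Pearson correlation of MA-CA against the obesity parameters); all measurement values below are hypothetical placeholders, not the study data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical measurements for a handful of adults (placeholders, not study data).
weight_kg = np.array([62.0, 81.5, 95.0, 70.2, 110.3])
height_m = np.array([1.65, 1.78, 1.70, 1.60, 1.75])
bmr_kcal = np.array([1450, 1700, 1820, 1500, 2000])
ma_minus_ca = np.array([-1, 4, 7, 0, 11])            # metabolic age minus chronological age

bmi = weight_kg / height_m ** 2                       # kg/m2
bmr_per_weight = bmr_kcal / weight_kg                 # BMR-to-weight ratio

for name, values in [("BMI", bmi), ("BMR", bmr_kcal), ("BMR/weight", bmr_per_weight)]:
    r, p = pearsonr(ma_minus_ca, values)
    print(f"MA-CA vs {name}: r = {r:.2f}, p = {p:.3f}")  # significance accepted at p < 0.05
```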
631 Human Face Detection and Segmentation using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms
Authors: J. Prakash, K. Rajesh
Abstract:
In this paper we propose a novel method for human face segmentation using the elliptical structure of the human head. It makes use of the information present in the edge map of the image. In this approach we use the fact that the eigenvalues of the covariance matrix represent the elliptical structure: the large and small eigenvalues of the covariance matrix are associated with the major and minor axial lengths of an ellipse. The other elliptical parameters are used to identify the centre and orientation of the face. Since an Elliptical Hough Transform requires a 5D Hough space, the Circular Hough Transform (CHT) is used to evaluate the elliptical parameters. A sparse matrix technique is used to perform the CHT, as it squeezes out zero elements and stores only a small number of non-zero elements, thereby requiring less storage space and computational time. A neighborhood suppression scheme is used to identify the valid Hough peaks. The accurate position of the circumference pixels for occluded and distorted ellipses is identified using Bresenham's Raster Scan Algorithm, which uses geometrical symmetry properties. This method does not require the evaluation of tangents to curvature contours, which are very sensitive to noise. The method has been evaluated on several images with different face orientations.
Keywords: Circular Hough Transform, Covariance matrix, Eigenvalues, Elliptical Hough Transform, Face segmentation, Raster Scan Algorithm.
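A minimal sketch of the covariance-eigenvalue step only: the eigenvalues and eigenvectors of the covariance matrix of edge-pixel coordinates give the relative axial lengths, centre and orientation of the fitted ellipse. The CHT, neighborhood suppression and Bresenham raster scan stages are not reproduced here, and the synthetic contour is a placeholder.

```python
import numpy as np

def ellipse_from_edge_pixels(coords):
    """Estimate ellipse centre, axis lengths (up to a scale factor) and
    orientation from the covariance matrix of edge-pixel coordinates [x, y]."""
    centre = coords.mean(axis=0)
    cov = np.cov(coords, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)             # eigenvalues in ascending order
    minor, major = np.sqrt(eigvals)                    # proportional to minor/major axial lengths
    angle = np.arctan2(eigvecs[1, 1], eigvecs[0, 1])   # orientation of the major axis
    return centre, major, minor, angle

# Synthetic elliptical edge contour (head-like shape) for illustration.
t = np.linspace(0, 2 * np.pi, 200)
pts = np.column_stack([50 + 30 * np.cos(t), 80 + 45 * np.sin(t)])
print(ellipse_from_edge_pixels(pts))
```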
630 The Emotional Life of Patients with Chronic Diseases: A Framework for Health Promotion Strategies
Authors: Leslie Beale
Abstract:
Being a patient with a chronic disease is both a physical and emotional experience. The ability to recognize a patient’s emotional health is an important part of a health care provider’s skills. For the purposes of this paper, emotional health is viewed as the way that we feel, and the way that our feelings affect us. Understanding the patient’s emotional health leads to improved provider-patient relationships and health outcomes. For example, when a patient first hears his or her diagnosis from a provider, they might find it difficult to cope with their emotions. Struggling to cope with emotions interferes with the patient’s ability to read, understand, and act on health information and services. As a result, the patient becomes more frustrated and confused, creating barriers to accessing healthcare services. These barriers are challenging for both the patient and their healthcare providers. There are five basic emotions that are part of who we are and are always with us: fear, anger, sadness, joy, and compassion. Living with a chronic disease however can cause a patient to experience and express these emotions in new and unique ways. Within the provider-patient relationship, there needs to be an understanding that each patient experiences these five emotions and, experiences them at different times. In response to this need, the paper highlights a health promotion framework for patients with chronic disease. This framework emphasizes the emotional health of patients.
Keywords: Health promotion, emotional health, patients with chronic disease, patient-centered care.
629 Optical Fish Tracking in Fishways using Neural Networks
Authors: Alvaro Rodriguez, Maria Bermudez, Juan R. Rabuñal, Jeronimo Puertas
Abstract:
One of the main issues in Computer Vision is to extract the movement of one or several points or objects of interest in an image or video sequence in order to conduct any kind of study or control process. Different techniques to solve this problem have been applied in numerous areas such as surveillance systems, traffic analysis, motion capture, image compression, navigation systems and others, where the specific characteristics of each scenario determine the approach to the problem. This paper puts forward a Computer Vision based algorithm to analyze fish trajectories in high turbulence conditions in artificial structures called vertical slot fishways, designed to allow the upstream migration of fish through obstructions in rivers. The suggested algorithm calculates the position of the fish at every instant starting from images recorded with a camera, using neural networks to perform fish detection in the images. Different laboratory tests have been carried out in a full-scale fishway model with live fish, allowing the reconstruction of the fish trajectory and the measurement of the velocities and accelerations of the fish. These data can provide useful information to design more effective vertical slot fishways.
Keywords: Computer Vision, Neural Network, Fishway, Fish Trajectory, Tracking
628 A Smart Monitoring System for Preventing Gas Risks in Indoor
Authors: Gyoutae Park, Geunjun Lyu, Yeonjae Lee, Wooksuk Kim, Jaheon Gu, Sanguk Ahn, Hiesik Kim
Abstract:
In this paper, we propose a system for preventing gas risks through the use of wireless communication modules and intelligent gas safety appliances. Our system configuration consists of an automatic extinguishing system, detectors, a wall-pad, and a microcomputer-controlled (micom) gas meter that monitors gas flow and pressure as well as the occurrence of earthquakes. The automatic fire extinguishing system checks for combustible gas leaks and monitors the environmental temperature, while the detector array measures smoke and CO gas concentrations. Depending on the detected conditions, the micom gas meter shuts off an inner valve and generates a warning, the automatic fire-extinguishing system shuts off an external valve and sprays extinguishing materials, or the sensors generate signals and take further action when smoke or CO is detected. Information on the intelligent measures taken by the gas safety appliances and sensors is transmitted to the wall-pad, which in turn relays this as real-time data to a server that can be monitored via an external network (BcN) connection through a web or mobile application for the management of gas safety. To validate this smart-home gas management system, we field-tested its suitability for use in Korean apartments under several scenarios.
Keywords: Gas sensor, leak, gas safety, gas meter, gas risk, wireless communication.
627 A Portable Cognitive Tool for Engagement Level and Activity Identification
Authors: T. Teo, S. W. Lye, Y. F. Li, Z. Zakaria
Abstract:
Wearable devices such as electroencephalography (EEG) headsets hold immense potential in the monitoring and assessment of a person's task engagement, especially at remote or online sites. Research into their use in measuring an individual's cognitive state while performing task activities is therefore expected to increase. Despite the growing number of EEG studies of a person's brain functioning activities, key challenges remain in adopting EEG for real-time operations. These include limited portability, long preparation time, high channel dimensionality, intrusiveness, and the level of accuracy in acquiring neurological data. This paper proposes an approach using 4-6 EEG channels to determine the cognitive states of a subject undertaking a set of passive and active monitoring tasks. Air traffic controller (ATC) dynamic tasks are used as a proxy. The work found that, using the developed channel reduction and identifier algorithm, good trend adherence of 89.1% can be obtained between a commercially available brain-computer interface (BCI) 14-channel Emotiv EPOC+ EEG headset and a carefully selected reduced set of 4-6 channels. The approach can also identify different levels of engagement activities, ranging from general monitoring to ad hoc and repeated active monitoring activities involving information search, extraction, and memory activities.
Keywords: Neurophysiology, monitoring, EEG, outliers, electroencephalography.
626 Geospatial Assessment of State Lands in the Cape Coast Urban Area
Authors: E. B. Quarcoo, I. Yakubu, K. J. Appau
Abstract:
Current land use and land cover (LULC) dynamics in Ghana have revealed considerable changes in settlement spaces. This study therefore merges cellular automata and Markov chain models, using remotely sensed data and Geographical Information System (GIS) approaches, to monitor, map, and detect the spatio-temporal LULC change on state lands within the Cape Coast Metropolis. Multi-temporal satellite images from 1986-2020 were pre-processed, geo-referenced, and then mapped using supervised maximum likelihood classification to investigate the state lands' land cover history (1986-2020), with an overall mapping accuracy of approximately 85%. The study further observed that the rate of change in the area favored the built-up class, 9.8 (12.58 km2), to the detriment of vegetation, 5.14 (12.68 km2); on average, 0.37 km2 (91.43 acres, or 37.00 ha) of the landscape was transformed yearly. Subsequently, the CA-Markov model was used to anticipate the potential LULC of the study area for 2030. According to the anticipated 2030 LULC map, the pattern of vegetation transitioning into built-up regions will continue over the following ten years as a result of urban growth.
Keywords: LULC, cellular automata, Markov Chain, state lands, urbanisation, public lands, Cape Coast Metropolis.
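A minimal sketch of the Markov-chain half of a CA-Markov projection, assuming two classified maps of the same grid from successive dates: class-to-class transition probabilities are estimated from the pair of maps and class proportions are projected one step ahead. The class coding and the random maps below are placeholders, and the cellular-automata spatial allocation step is not reproduced.

```python
import numpy as np

# Two classified LULC maps on the same grid, coded 0 = vegetation, 1 = built-up, 2 = other.
lulc_t1 = np.random.default_rng(1).integers(0, 3, size=(100, 100))
lulc_t2 = np.where(np.random.default_rng(2).random((100, 100)) < 0.2, 1, lulc_t1)

n_classes = 3
counts = np.zeros((n_classes, n_classes))
for i in range(n_classes):
    for j in range(n_classes):
        counts[i, j] = np.sum((lulc_t1 == i) & (lulc_t2 == j))
P = counts / counts.sum(axis=1, keepdims=True)        # transition probability matrix (t1 -> t2)

proportions_t2 = np.bincount(lulc_t2.ravel(), minlength=n_classes) / lulc_t2.size
proportions_t3 = proportions_t2 @ P                    # project one more time step (e.g. towards 2030)
print(np.round(P, 3))
print(np.round(proportions_t3, 3))
```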
625 Optimal Model Order Selection for Transient Error Autoregressive Moving Average (TERA) MRI Reconstruction Method
Authors: Abiodun M. Aibinu, Athaur Rahman Najeeb, Momoh J. E. Salami, Amir A. Shafie
Abstract:
An alternative approach to the use of the Discrete Fourier Transform (DFT) for Magnetic Resonance Imaging (MRI) reconstruction is the use of parametric modeling techniques. This method is suitable for problems in which the image can be modeled by explicit known source functions with a few adjustable parameters. Despite the success reported in the use of modeling techniques as an alternative MRI reconstruction technique, two important problems constitute challenges to the applicability of this method: the estimation of the model order and the determination of the model coefficients. In this paper, five suggested methods of evaluating the model order have been evaluated: the Final Prediction Error (FPE), the Akaike Information Criterion (AIC), Residual Variance (RV), Minimum Description Length (MDL) and the Hannan and Quinn (HNQ) criterion. These criteria were evaluated on MRI data sets based on the Transient Error Reconstruction Algorithm (TERA) method. The result for each criterion is compared to the result obtained using a fixed-order technique, and three measures of similarity were evaluated. The results obtained show that the use of MDL gives the highest measure of similarity to that obtained by a fixed-order technique.
Keywords: Autoregressive Moving Average (ARMA), Magnetic Resonance Imaging (MRI), Parametric modeling, Transient Error.
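A minimal sketch of how FPE, AIC and MDL can be computed from the residual variance of candidate models. For brevity it fits AR(p) models by least squares on a synthetic signal rather than the full ARMA/TERA machinery, and the standard textbook forms of the criteria are assumed.

```python
import numpy as np

def ar_residual_variance(x, p):
    """Least-squares AR(p) fit; returns the residual variance."""
    N = len(x)
    X = np.column_stack([x[p - k - 1: N - k - 1] for k in range(p)])
    y = x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.mean((y - X @ coeffs) ** 2)

rng = np.random.default_rng(0)
x = np.zeros(512)
for n in range(2, 512):                                # synthetic AR(2) test signal
    x[n] = 1.2 * x[n - 1] - 0.5 * x[n - 2] + rng.normal()

N = len(x)
for p in range(1, 8):
    s2 = ar_residual_variance(x, p)
    fpe = s2 * (N + p) / (N - p)                       # Final Prediction Error
    aic = N * np.log(s2) + 2 * p                       # Akaike Information Criterion
    mdl = N * np.log(s2) + p * np.log(N)               # Minimum Description Length
    print(p, round(fpe, 4), round(aic, 1), round(mdl, 1))  # minimum indicates the chosen order
```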
624 An Efficient and Optimized Multi Constrained Path Computation for Real Time Interactive Applications in Packet Switched Networks
Authors: P.S. Prakash, S. Selvan
Abstract:
Quality of Service (QoS) routing aims to find a path between a source and a destination that satisfies the QoS requirements while efficiently using the network resources and the underlying routing algorithm, i.e., to find low-cost paths that satisfy given QoS constraints. One of the key issues in providing end-to-end QoS guarantees in packet networks is determining a feasible path that satisfies a number of QoS constraints. We present an Optimized Multi-Constrained Routing (OMCR) algorithm for the computation of constrained paths for QoS routing in computer networks. OMCR applies distance vector routing to construct a shortest path for each destination with reference to a given optimization metric, from which a set of feasible paths is derived at each node. OMCR is able to find feasible paths as well as optimize the utilization of network resources. OMCR operates with the hop-by-hop, connectionless routing model of the IP Internet and does not create any loops while finding the feasible paths. Nodes running OMCR do not necessarily maintain a global view of the network state, such as topology and resource information, and routing updates are sent only to neighboring nodes, whereas the counterpart link-state routing method depends on complete network state for constrained path computation, which incurs excessive communication overhead.
Keywords: QoS Routing, Optimization, feasible path, multiple constraints.
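A minimal sketch of constrained path computation in general, assuming a single additive delay constraint on top of a cost metric: a label-correcting search with dominance pruning returns the cheapest feasible path. This is a generic illustration of the problem OMCR addresses, not the OMCR algorithm itself, and the toy topology is a placeholder.

```python
import heapq

def least_cost_path_with_delay_bound(graph, src, dst, max_delay):
    """graph[u] = list of (v, cost, delay). Returns (cost, delay, path) of the
    cheapest path meeting the delay bound, or None if no feasible path exists."""
    heap = [(0, 0, src, [src])]          # labels ordered by cost
    best_delay = {}                      # least delay among labels already expanded per node
    while heap:
        cost, delay, u, path = heapq.heappop(heap)
        if u == dst:
            return cost, delay, path
        if delay >= best_delay.get(u, float("inf")):
            continue                     # dominated by a cheaper label with smaller delay
        best_delay[u] = delay
        for v, c, d in graph.get(u, []):
            if delay + d <= max_delay:   # prune labels violating the QoS constraint
                heapq.heappush(heap, (cost + c, delay + d, v, path + [v]))
    return None

net = {
    "A": [("B", 1, 5), ("C", 4, 1)],
    "B": [("D", 1, 5)],
    "C": [("D", 1, 2)],
}
print(least_cost_path_with_delay_bound(net, "A", "D", max_delay=6))  # (5, 3, ['A', 'C', 'D'])
```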
623 Interoperability and Performance Analysis of IEC61850 Based Substation Protection System
Authors: Ming-Ta Yang, Jyh-Cherng Gu, Po-Chun Lin, Yen-Lin Huang, Chun-Wei Huang, Jin-Lung Guan
Abstract:
Since the IEC61850 substation communication standard represents the trend in developing new generations of Substation Automation Systems (SAS), many IED manufacturers pursue this technology and apply for KEMA certification. In order to bring products to market and meet customer demand as fast as possible, manufacturers often have their products certified only against basic environmental standards but claim conformance to IEC61850. Since verification institutes generally perform verification tests only on specific IEDs of the manufacturers, the interoperability between all certified IEDs cannot be guaranteed. Therefore, the interoperability between IEDs from different manufacturers needs to be tested. Based upon the above reasons, this study applies the definitions of the information models, communication services, GOOSE functionality and Substation Configuration Language (SCL) of IEC61850 to build the concept of the communication protocols and to build the test environment. The procedures for testing data collection and exchange in the P2P communication mode and the Client/Server communication mode of IEC61850 are outlined as follows. First, test the GOOSE message communication capability of IEDs from different manufacturers. Second, collect data from each IED with a SCADA system and use an HMI to display the SCADA platform. Finally, problems generally encountered in the test procedure are summarized.
Keywords: GOOSE, IEC61850, IED, SCADA.
622 The Management in Large Emergency Situations – A Best Practise Case Study based on GIS for Management of Evacuation
Authors: Ion Baş, Claudiu Zoicaş, Angela Ioniţâ
Abstract:
In most cases, natural disasters lead to the necessity of evacuating people. The quality of evacuation management is dramatically improved by the use of information provided by decision support systems, which become indispensable in the case of large-scale evacuation operations. This paper presents a best-practice case study. In November 2007, officers from the Emergency Situations Inspectorate “Crisana” of Bihor County, Romania, participated in a cross-border evacuation exercise in which 700 people were evacuated from the Netherlands to Belgium. One of the main objectives of the exercise was the testing of four different decision support systems. Afterwards, based on that experience, a software system called TEVAC (Trans Border Evacuation) was developed “in house” by the experts of this institution. This original software system was successfully tested in September 2008 during the international exercise EU-HUROMEX 2008, whose scenario involved the real evacuation of 200 persons from Hungary to Romania. Based on the lessons learned and the results, the TEVAC software has been used by all Emergency Situations Inspectorates across Romania since April 2009.
Keywords: Emergency evacuation, Searching Features, TEVAC (Trans Border Evacuation) software system, User Interface Design.
621 Visitors’ Attitude towards the Service Marketing Mix and Frequency of Visits to Bangpu Recreation Centre, Thailand
Authors: Siri-Orn Champatong
Abstract:
This research paper aimed to examine the relationship between visitors’ attitude towards the service marketing mix and visitors’ frequency of visits to the Bangpu Recreation Centre. Based on a large population of unknown size, the number of samples was calculated according to the formula to obtain a total of 385 samples. Systematic random sampling was applied in collecting the samples, and a five-point Likert-scale questionnaire was used over a total of 21 days to collect the needed information. Mean, standard deviation, and Pearson correlation were utilized in analyzing the data. This study found a high level of visitors’ attitude towards the product and service of the Bangpu Recreation Centre, its price, place, promotional activities, the people who provided service, and the physical evidence of the centre. The attitude towards the process of service was found to be at a medium level. Additionally, the examination of the relationship between visitors’ attitude towards the service marketing mix and visitors’ frequency of visits to the Bangpu Recreation Centre showed that product and service, people, physical evidence and the process of service provision were related to the visitors’ frequency of visits to the centre per year.
Keywords: Frequency of Visit, Visitor, Service Marketing Mix, Bangpu Recreation Centre.
620 Graph Codes-2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval
Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje
Abstract:
Multimedia Indexing and Retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based algorithms for similarity are quite inefficient and computation intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a graph-projection into a 2D space (Graph Code) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph-traversals due to a simpler processing model and a high level of parallelisation. In consequence, we prove that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for Multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.
Keywords: indexing, retrieval, multimedia, graph code, graph algorithm
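A minimal sketch of the general idea of projecting a feature graph onto a fixed 2D matrix and comparing codes element-wise instead of by graph traversal. The encoding (presence on the diagonal, edges off-diagonal) and the Jaccard-style similarity below are simplified assumptions, not the exact Graph Code definition from the paper, and the vocabulary and graphs are placeholders.

```python
import numpy as np

def graph_code(nodes, edges, vocabulary):
    """Project a feature graph onto a fixed node vocabulary: diagonal cells mark
    node presence, off-diagonal cells mark edges between features."""
    index = {feature: i for i, feature in enumerate(vocabulary)}
    code = np.zeros((len(vocabulary), len(vocabulary)), dtype=np.int8)
    for n in nodes:
        code[index[n], index[n]] = 1
    for a, b in edges:
        code[index[a], index[b]] = 1
    return code

vocab = ["person", "dog", "ball", "park", "holds", "near"]
g1 = graph_code(["person", "dog", "park"], [("person", "dog"), ("dog", "park")], vocab)
g2 = graph_code(["person", "ball", "park"], [("person", "ball"), ("person", "park")], vocab)

# Element-wise similarity of the two 2D codes; no graph traversal and easily parallelised.
similarity = np.sum(g1 & g2) / np.sum(g1 | g2)         # Jaccard overlap of code cells
print(round(float(similarity), 3))
```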
619 Multi-Hazard Risk Assessment and Management in Tourism Industry- A Case Study from the Island of Taiwan
Authors: Chung-Hung Tsai
Abstract:
Global environmental changes lead to an increased frequency and scale of natural disasters, and Taiwan is under the influence of global warming and extreme weather. Therefore, vulnerability has increased and the variability and complexity of disasters have been enhanced. The purpose of this study is to consider the source and magnitude of hazard characteristics affecting the tourism industry. Using modern risk management concepts and integrating related domestic and international basic research, the study goes beyond the existing Taiwan typhoon disaster risk assessment model and loss evaluation. The loss evaluation index system considers the impact of extreme weather, in particular heavy rain, on the tourism industry in Taiwan. Considering the compound impact of extreme climate disasters on the tourism industry, we develop a multi-hazard risk assessment model together with strategies and suggestions. The related risk analysis results are expected to provide government departments, tourism industry asset owners, insurance companies and banks with the information on tourism disaster risk necessary for effective natural disaster risk management.
Keywords: Tourism industry, extreme weather, multi-hazard, vulnerability analysis, loss exceeding probability, risk management.
618 The Robust Clustering with Reduction Dimension
Authors: Dyah E. Herwindiati
Abstract:
Clustering is a process to identify homogeneous groups of objects, called clusters, and is an interesting topic in data mining. Objects in a group or class share similar characteristics. This paper discusses a robust clustering process for image data with two dimension-reduction approaches, i.e., two-dimensional principal component analysis (2DPCA) and principal component analysis (PCA). A standard approach to overcoming the dimensionality problem is dimension reduction, which transforms high-dimensional data into a lower-dimensional space with limited loss of information. One of the most common forms of dimensionality reduction is principal component analysis (PCA). 2DPCA is often called a variant of PCA: the image matrices are treated directly as 2D matrices and do not need to be transformed into vectors, so that the image covariance matrix can be constructed directly from the original image matrices. The classical covariance matrix that is decomposed is very sensitive to outlying observations. The objective of this paper is to compare the performance of the robust minimizing vector variance (MVV) approach in the two-dimensional projection (2DPCA) and in PCA for clustering arbitrary image data when outliers are hidden in the data set. The simulation aspects of robustness and the illustration of clustering images are discussed at the end of the paper.
Keywords: Breakdown point, Consistency, 2DPCA, PCA, Outlier, Vector Variance.
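A minimal sketch of the 2DPCA projection step the abstract describes: the image covariance matrix is built directly from the 2D image matrices and the leading eigenvectors define the projection. The robust MVV estimator and the clustering stage are not reproduced, and the random images are placeholders.

```python
import numpy as np

def two_d_pca(images, n_components=2):
    """2DPCA: images has shape (N, h, w). The image covariance matrix G (w x w)
    is built directly from the 2D matrices, with no vectorisation."""
    centered = images - images.mean(axis=0)
    G = np.einsum('nij,nik->jk', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)
    W = eigvecs[:, ::-1][:, :n_components]             # leading eigenvectors of G
    return np.array([img @ W for img in images])       # projected feature matrices (h x d)

rng = np.random.default_rng(0)
imgs = rng.random((20, 32, 24))                         # 20 toy "images"
features = two_d_pca(imgs, n_components=3)
print(features.shape)                                   # (20, 32, 3)
```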
617 Analysis of Factors Used by Farmers to Manage Risk: A Case Study on Italian Farms
Authors: A. Pontrandolfi, G. Enjolras, F. Capitanio
Abstract:
The study analyses the strategies Italian farmers use to cope with the risks that face their production. We specifically explore the potential and the limitations of the economic tools for climatic risk management in agriculture of the Common Agricultural Policy 2014-2020, which foresees contributions for economic risk management tools, in relation to farms' needs and the exposure and vulnerability of agricultural areas to climatic risk. We consider farm-level approaches to hedging risks in terms of the use of technical tools (agricultural practices, pesticides, fertilizers, irrigation) and economic/financial instruments (insurance, etc.). We develop cross-sectional and longitudinal analyses as well as correlation analyses that underline the main differences in the way farms adapt their structure and management towards risk. The results show a preference for technical tools, despite the presence of important public aid for economic tools such as insurance. Therefore, there is a strong need for a more effective and integrated risk management policy scheme. Synergies between economic tools and risk reduction actions of a more technical, structural and management nature (production diversification, irrigation infrastructures, technological and management innovations and formation-information-consultancy, etc.) are emphasized.
Keywords: Agriculture and climate change, climatic risk management, insurance schemes.
616 Design of Multiple Clouds Based Global Performance Evaluation Service Broker System
Authors: Dong-Jae Kang, Nam-Woo Kim, Duk-Joo Son, Sung-In Jung
Abstract:
With the dramatic growth of internet services, easy and prompt service deployment has become important for internet service providers to successfully maintain time-to-market. Before global service deployment, they have to pay a large cost for service evaluation in order to decide on the proper system location, system scale, service delay and so on. However, intra-lab evaluation tends to show big gaps between the measured data and the realistic situation, because it is very difficult to accurately predict the local service environment, network congestion, service delay, network bandwidth and other factors. Therefore, to resolve or ease these problems, we propose a multiple-cloud-based GPES Broker system and a use case that helps internet service providers alleviate the above problems in the beta release phase and make a prompt decision on their service launch. By providing more realistic and reliable evaluation information, the proposed GPES Broker system saves service release cost and enables internet service providers to make a prompt decision about launching their service in various remote regions.
Keywords: GPES Broker system, Cloud Service Broker, Multiple Cloud, Global performance evaluation service (GPES), Service provisioning
615 Satellite Sensing for Evaluation of an Irrigation System in Cotton - Wheat Zone
Authors: Sadia Iqbal, Faheem Iqbal, Furqan Iqbal
Abstract:
Efficient utilization of existing water is a pressing need for Pakistan due to its rising population, the reduction in present storage capacity and a poor canal delivery efficiency of 30 to 40%. A study was conducted to evaluate an irrigation system in the cotton-wheat zone of Pakistan after watercourse lining. The evaluation is made on the basis of cropping pattern and salinity. The study employed an index-based approach using a Geographic Information System with field data. Satellite images from different years were used to examine the effective area. Several combinations of the ratios of signals received in different spectral bands were used for the development of this index. The Near Infrared and Thermal IR spectral bands proved to be the most effective, as this combination helped in the easy detection of the salt-affected area and the cropping pattern of the study area. Results showed that 9.97% of the area was under salinity in 1992 and 9.17% in 2000, leaving 2.29% in 2005. Similarly, in 1992, 45% of the area was under vegetation, improving to 56% and 65% in 2000 and 2005, respectively. On the basis of these results, the evaluation shows a 30% increase in performance after the watercourse improvement.
Keywords: Salinity, remote sensing index, salinity index, cropping pattern.
614 A Study on the Application of Machine Learning and Deep Learning Techniques for Skin Cancer Detection
Authors: Hritwik Ghosh, Irfan Sadiq Rahat, Sachi Nandan Mohanty, J. V. R. Ravindra, Abdus Sobur
Abstract:
In the rapidly evolving landscape of medical diagnostics, the early detection and accurate classification of skin cancer remain paramount for effective treatment outcomes. This research delves into the transformative potential of artificial intelligence (AI), specifically deep learning (DL), as a tool for discerning and categorizing various skin conditions. Utilizing a diverse dataset of 3,000 images, representing nine distinct skin conditions, we confront the inherent challenge of class imbalance. This imbalance, where conditions like melanomas are over-represented, is addressed by incorporating class weights during the model training phase, ensuring an equitable representation of all conditions in the learning process. Our approach presents a hybrid model, amalgamating the strengths of two renowned convolutional neural networks (CNNs), VGG16 and ResNet50. These networks, pre-trained on the ImageNet dataset, are adept at extracting intricate features from images. By synergizing these models, our research aims to capture a holistic set of features, thereby bolstering classification performance. Preliminary findings underscore the hybrid model's superiority over individual models, showcasing its prowess in feature extraction and classification. Moreover, the research emphasizes the significance of rigorous data pre-processing, including image resizing, color normalization, and segmentation, in ensuring data quality and model reliability. In essence, this study illuminates the promising role of AI and DL in revolutionizing skin cancer diagnostics, offering insights into its potential applications in broader medical domains.
Keywords: Artificial intelligence, machine learning, deep learning, skin cancer, dermatology, convolutional neural networks, image classification, computer vision, healthcare technology, cancer detection, medical imaging.
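A minimal sketch of the hybrid feature extractor the abstract describes: ImageNet-pretrained VGG16 and ResNet50 backbones whose pooled features are concatenated, with class weights supplied during training to counter class imbalance. The class count follows the abstract; the dataset objects, preprocessing, weight values and hyperparameters are placeholders, not the study's configuration.

```python
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import VGG16, ResNet50

NUM_CLASSES = 9                                        # nine skin conditions (per the abstract)
inputs = layers.Input(shape=(224, 224, 3))             # images assumed already resized/normalized

# Two ImageNet-pretrained backbones used as frozen feature extractors.
vgg = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
res = ResNet50(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
vgg.trainable = False
res.trainable = False

# Concatenate pooled features from both backbones (the "hybrid" feature set).
merged = layers.Concatenate()([
    layers.GlobalAveragePooling2D()(vgg(inputs)),
    layers.GlobalAveragePooling2D()(res(inputs)),
])
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(layers.Dropout(0.3)(merged))
model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# class_weight counters imbalance (e.g. over-represented melanomas); values are placeholders.
class_weights = {i: 1.0 for i in range(NUM_CLASSES)}
# model.fit(train_ds, validation_data=val_ds, epochs=20, class_weight=class_weights)
```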
613 An Application for Risk of Crime Prediction Using Machine Learning
Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento
Abstract:
The increase of the world population, especially in large urban centers, has resulted in new challenges, particularly in the control and optimization of public safety. Thus, in the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical incident data and demographic information. The entire research and implementation are presented, starting with the data collection from its original source, the treatment and transformations applied to the data, and the choice, evaluation and implementation of the Machine Learning model, up to the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine Learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformations used. The results show that the use of Machine Learning techniques helps to anticipate criminal occurrences, which contributes to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API to enable other entities to make requests for predictions in real time. An application is also presented where it is possible to show the criminal predictions visually.
Keywords: Crime prediction, machine learning, public safety, smart city.
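A minimal sketch comparing the classifiers named in the abstract on a generic feature matrix; the synthetic features (grid cell, time window, demographic proxies) and labels are placeholders, not the study's data or pipeline.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import f1_score

# Placeholder features: [grid_x, grid_y, hour_of_day, weekday, population_density,
# past_incident_count]; label = an incident occurred in the time window and location.
rng = np.random.default_rng(42)
X = rng.random((2000, 6))
y = (X[:, 5] + 0.3 * X[:, 2] + rng.normal(0, 0.2, 2000) > 0.8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
models = {
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "K-Nearest Neighbors": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=15)),
    "Logistic Regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "Neural Network": make_pipeline(StandardScaler(),
                                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, round(f1_score(y_test, model.predict(X_test)), 3))  # compare model performance
```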
612 Leveraging Li-Fi to Enhance Security and Performance of Medical Devices
Authors: Trevor Kroeger, Hayden Williams, Edward Holzinger, David Coleman, Brian Haberman
Abstract:
The network connectivity of medical devices is increasing at a rapid rate. Many medical devices, such as vital sign monitors, share information via wireless or wired connections. However, these connectivity options suffer from a variety of well-known limitations. Wireless connectivity, especially in the unlicensed radio frequency bands, can be disrupted. Such disruption could be due to benign reasons, such as a crowded spectrum, or to malicious intent. While wired connections are less susceptible to interference, they inhibit the mobility of the medical devices, which could be critical in a variety of scenarios. This work explores the application of Light Fidelity (Li-Fi) communication to enhance the security, performance, and mobility of medical devices in connected healthcare scenarios. A simple bridge for connected devices serves as an avenue to connect traditional medical devices to the Li-Fi network. This bridge was utilized to conduct bandwidth tests on a small Li-Fi network installed in a mock-ICU setting with a backend enterprise network similar to that of a hospital. Mobile and stationary tests were conducted to replicate various situations that might occur within a hospital setting. Results show that in-room Li-Fi connectivity provides reasonable bandwidth and latency within a hospital-like setting.
Keywords: Hospital, light fidelity, Li-Fi, medical devices, security.
611 A Novel Approach for Protein Classification Using Fourier Transform
Authors: A. F. Ali, D. M. Shawky
Abstract:
Discovering new biological knowledge from high-throughput biological data is a major challenge for bioinformatics today. To address this challenge, we developed a new approach to protein classification. Proteins that are evolutionarily, and thereby functionally, related are said to belong to the same classification. Identifying protein classification is of fundamental importance for documenting the diversity of the known protein universe. It also provides a means to determine the functional roles of newly discovered protein sequences. Our goal is to predict the functional classification of novel protein sequences based on a set of features extracted from each protein sequence. The proposed technique uses datasets extracted from the Structural Classification of Proteins (SCOP) database. A set of spectral-domain features based on the Fast Fourier Transform (FFT) is used. The proposed classifier uses a multilayer back-propagation (MLBP) neural network for protein classification. The maximum classification accuracy is about 91% when applying the classifier to the full four levels of the SCOP database. However, it reaches a maximum of 96% when limiting the classification to the family level. The classification results reveal that the spectral domain contains information that can be used for classification with high accuracy. In addition, the results emphasize that sequence similarity measures are of great importance, especially at the family level.
Keywords: Bioinformatics, Artificial Neural Networks, Protein Sequence Analysis, Feature Extraction.
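A minimal sketch of the spectral-feature idea: a protein sequence is mapped to a numeric signal, the FFT magnitude spectrum is taken, and a fixed-length slice of it feeds a neural-network classifier. The per-residue numeric scale, the toy sequences/labels and the network size are illustrative placeholders, not the exact features or network of the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical per-residue numeric mapping (placeholder values on a hydropathy-like scale).
SCALE = {aa: v for aa, v in zip("ACDEFGHIKLMNPQRSTVWY", np.linspace(-4.5, 4.5, 20))}

def fft_features(sequence, n_coeffs=32):
    """Map a protein sequence to a numeric signal and return the magnitudes of
    the first n_coeffs FFT coefficients, zero-padded to a fixed length."""
    signal = np.array([SCALE.get(aa, 0.0) for aa in sequence])
    spectrum = np.abs(np.fft.rfft(signal, n=256))       # fixed-length magnitude spectrum
    return spectrum[:n_coeffs]

# Toy training set: two made-up "families" (placeholder sequences and labels).
seqs = ["MKTAYIAKQR", "MKTAYLAKQK", "GGGSLLLVVA", "GGASLLIVVA"]
labels = [0, 0, 1, 1]
X = np.array([fft_features(s) for s in seqs])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, labels)
print(clf.predict([fft_features("MKTAYIAKQK")]))        # query close to the first toy "family"
```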
610 Landslide Susceptibility Mapping: A Comparison between Logistic Regression and Multivariate Adaptive Regression Spline Models in the Municipality of Oudka, Northern of Morocco
Authors: S. Benchelha, H. C. Aoudjehane, M. Hakdaoui, R. El Hamdouni, H. Mansouri, T. Benchelha, M. Layelmam, M. Alaoui
Abstract:
Logistic regression (LR) and the multivariate adaptive regression spline (MarSpline) are applied and verified for the analysis of landslide susceptibility mapping in Oudka, Morocco, using a geographical information system. From a spatial database containing data such as landslide mapping, topography, soil, hydrology and lithology, eight factors related to landslides were calculated or extracted: elevation, slope, aspect, distance to streams, distance to roads, distance to faults, lithology, and the Normalized Difference Vegetation Index (NDVI). Using these factors, landslide susceptibility indexes were calculated by the two mentioned methods. Before the calculation, the database was divided into two parts, the first for the formation (training) of the model and the second for validation. The results of the landslide susceptibility analysis were verified using success and prediction rates to evaluate the quality of these probabilistic models. This verification showed that the MarSpline model is the better model, with a success rate (AUC = 0.963) and a prediction rate (AUC = 0.951) higher than those of the LR model (success rate AUC = 0.918, prediction rate AUC = 0.901).
Keywords: Landslide susceptibility mapping, regression logistic, multivariate adaptive regression spline, Oudka, Taounate, Morocco.
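A minimal sketch of the logistic-regression half of the comparison, with the success rate computed as AUC on the formation (training) part and the prediction rate as AUC on the validation part. The conditioning-factor table is a placeholder, and the MarSpline counterpart would need a separate MARS implementation (e.g. a package such as py-earth), which is not shown here.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

FACTORS = ["elevation", "slope", "aspect", "dist_streams",
           "dist_road", "dist_faults", "lithology", "ndvi"]

# Placeholder conditioning-factor table (one row per mapped cell) and landslide labels.
rng = np.random.default_rng(7)
X = rng.random((1500, len(FACTORS)))
y = (0.8 * X[:, 1] - 0.4 * X[:, 7] + rng.normal(0, 0.3, 1500) > 0.3).astype(int)

# Split into a formation (training) part and a validation part, as in the study.
X_form, X_valid, y_form, y_valid = train_test_split(X, y, test_size=0.3, random_state=0)
lr = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)).fit(X_form, y_form)

success_rate = roc_auc_score(y_form, lr.predict_proba(X_form)[:, 1])       # AUC on training data
prediction_rate = roc_auc_score(y_valid, lr.predict_proba(X_valid)[:, 1])  # AUC on validation data
print(round(success_rate, 3), round(prediction_rate, 3))
```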