Search results for: data safety.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8058

6858 Parallel Computation of Data Summation for Multiple Problem Spaces on Partitioned Optical Passive Stars Network

Authors: Khin Thida Latt, Mineo Kaneko, Yoichi Shinoda

Abstract:

In a Partitioned Optical Passive Stars (POPS) network, nodes and couplers become free from slot to slot during some computations. These free couplers and nodes must be utilized efficiently for the system to be cost effective. By improving parallelism, we present a fast data summation algorithm for multiple problem spaces on POPS(g, g) with a smaller number of nodes for the case d = n = g. For the case d > n > g, we simulate the calculation of a large number of data items, dedicated to a larger system with many nodes, on a smaller system with a smaller number of nodes. The algorithm is faster than the best known algorithm, and using fewer nodes and groups makes the system low cost and practical.
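
For illustration, a minimal sketch of two-phase parallel summation in Python (the group structure and slot counting below are a simplified, hypothetical model, not the authors' POPS(g, g) algorithm):

```python
import math

def parallel_sum(data, g):
    """Illustrative two-phase summation: g groups compute partial sums
    'in parallel', then the partial sums are combined pairwise, taking
    roughly log2(g) combining slots (hypothetical slot model)."""
    # Phase 1: each of the g groups sums its own share of the data.
    chunk = math.ceil(len(data) / g)
    partials = [sum(data[i * chunk:(i + 1) * chunk]) for i in range(g)]

    # Phase 2: pairwise reduction of the g partial sums.
    slots = 0
    while len(partials) > 1:
        partials = [partials[i] + partials[i + 1] if i + 1 < len(partials)
                    else partials[i]
                    for i in range(0, len(partials), 2)]
        slots += 1
    return partials[0], slots

total, slots = parallel_sum(list(range(1, 65)), g=8)
print(total, slots)   # 2080 (sum of 1..64); 3 combining slots for g = 8
```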

Keywords: Partitioned optical passive stars network, parallel computing, optical computing, data sum

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1183
6857 Use of Persuasive Technology to Change End-Users' IT Security-Aware Behaviour: A Pilot Study

Authors: Ai Cheo Yeo, Md. Mahbubur Rahim, Yin Ying Ren

Abstract:

Persuasive technology has been applied in marketing, health, environmental conservation, safety and other domains and has been found to be quite effective in changing people's attitudes and behaviours. This research extends the application domains of persuasive technology to information security awareness and uses a theory-driven approach to evaluate the effectiveness of a web-based program, developed based on the principles of persuasive technology, in improving the information security awareness of end users. The findings confirm a very strong effect of the web-based program in raising users' attitudes towards information security aware behaviour. This finding is useful to IT researchers and practitioners in developing appropriate and effective education strategies for improving the information security attitudes of end users.

Keywords: Information security, persuasive technology, IT security-aware behaviour, theory of planned behaviour survey.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2410
6856 Customer Satisfaction and Effective HRM Policies: Customer and Employee Satisfaction

Authors: S. Anastasiou, C. Nathanailides

Abstract:

The purpose of this study is to examine the possible link between employee and customer satisfaction. The service provided by employees helps to build a good relationship with customers and can help increase their loyalty. Published data for job satisfaction and indicators of customer service of banks were gathered from relevant published works, which included data from five different countries. The customer and employee satisfaction scores of the different published works were transformed and normalized to a scale of 1 to 100. The data were analyzed, and a regression analysis of the two parameters was used to describe the link between employee satisfaction and customer satisfaction. Assuming that employee satisfaction has a significant influence on customer service and the resulting customer satisfaction, the reviewed data indicate that employee satisfaction contributes significantly to the level of customer satisfaction in the banking sector. There was a significant correlation between the two parameters (Pearson correlation R2=0.52, P<0.05). The reviewed data indicate that the published evidence supports the hypothesized link between these two parameters. During the recent global economic crisis, the financial services sector was affected severely, and job security, remuneration and recruitment of bank personnel were significantly reduced in many countries, including Greece. Nevertheless, modern organizations should always consider their personnel as capital, which is the driving force for success in the future. Appropriate human resource management policies can increase the level of job satisfaction of the personnel, with positive consequences for the level of customer satisfaction.
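
A minimal sketch of this kind of analysis in Python, assuming NumPy and SciPy are available (the scores below are made-up placeholders, not the study's data):

```python
import numpy as np
from scipy import stats

# Hypothetical employee and customer satisfaction scores from several studies.
employee = np.array([3.8, 4.1, 3.2, 4.5, 3.9])       # e.g. on a 1-5 scale
customer = np.array([72.0, 80.0, 61.0, 88.0, 75.0])  # e.g. on a 0-100 scale

def rescale_1_100(x):
    """Min-max normalization of a score vector onto the 1-100 scale."""
    return 1 + 99 * (x - x.min()) / (x.max() - x.min())

emp_n, cus_n = rescale_1_100(employee), rescale_1_100(customer)

# Pearson correlation and a simple regression of customer on employee satisfaction.
r, p_value = stats.pearsonr(emp_n, cus_n)
slope, intercept, *_ = stats.linregress(emp_n, cus_n)
print(f"r = {r:.2f}, R^2 = {r**2:.2f}, p = {p_value:.3f}")
print(f"fitted line: customer = {intercept:.1f} + {slope:.2f} * employee")
```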

Keywords: Job satisfaction, job performance, customer service, banks, human resources management.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 5126
6855 A New Approach of Fuzzy Methods for Evaluating Hydrological Data

Authors: Nasser Shamskia, Seyyed Habib Rahmati, Hassan Haleh, Seyyedeh Hoda Rahmati

Abstract:

The main design criteria for most hydraulic constructions are essentially based on the runoff or discharge of water. Two important such measures are runoff and return period. Mostly, these measures are calculated or estimated from stochastic data. Another feature of hydrological data is their impreciseness. Therefore, in order to deal with uncertainty and impreciseness, a new fuzzy method of evaluating hydrological measures, based on Buckley's estimation method, is developed. The method introduces triangular fuzzy numbers for the different measures, in which both the uncertainty and the impreciseness concepts are considered. Besides, since another important consideration in most hydrological studies is the comparison of a measure across different months or years, a new fuzzy ranking method, consistent with the special form of the proposed fuzzy numbers, is also developed. Finally, to illustrate the methods more explicitly, the two algorithms are tested on one simple example and a real case study.
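
A minimal sketch of a triangular fuzzy number and a centroid-based ranking score (illustrative only; the class and ranking rule below are simplifications, not Buckley's estimation procedure or the paper's ranking method):

```python
from dataclasses import dataclass

@dataclass
class TriangularFuzzyNumber:
    """A triangular fuzzy number (a, b, c): support [a, c], peak at b."""
    a: float  # lower bound
    b: float  # most plausible value
    c: float  # upper bound

    def membership(self, x: float) -> float:
        """Piecewise-linear membership function of the triangle."""
        if self.a < x <= self.b:
            return (x - self.a) / (self.b - self.a)
        if self.b < x < self.c:
            return (self.c - x) / (self.c - self.b)
        return 1.0 if x == self.b else 0.0

    def centroid(self) -> float:
        """Centroid defuzzification, usable as a simple ranking score."""
        return (self.a + self.b + self.c) / 3.0

# Hypothetical fuzzy monthly discharges (m^3/s); rank the months by centroid.
june = TriangularFuzzyNumber(120.0, 150.0, 190.0)
july = TriangularFuzzyNumber(100.0, 140.0, 170.0)
print(max((june, july), key=TriangularFuzzyNumber.centroid))
```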

Keywords: Fuzzy Discharge, Fuzzy estimation, Fuzzy ranking method, Hydrological data

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1715
6854 A Tree Based Association Rule Approach for XML Data with Semantic Integration

Authors: D. Sasikala, K. Premalatha

Abstract:

The use of eXtensible Markup Language (XML) in web, business and scientific databases has led to the development of methods, techniques and systems to manage and analyze XML data. Semi-structured documents suffer from their heterogeneity and dimensionality. XML structure and content mining represent a convergence of research in semi-structured data and text mining. As the information available on the internet grows drastically, extracting knowledge from XML documents becomes a harder task. Certainly, documents are often so large that the data set returned as the answer to a query may be too big to convey the required information. To improve query answering, a Semantic Tree Based Association Rule (STAR) mining method is proposed. This method provides intensional information by considering the structure, the content and the semantics of the content. The method is applied on the Reuters dataset, and the results show that the proposed method performs well.
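
A minimal sketch of association rule mining over tag paths (an illustrative stand-in for the structural part of tree-based rule mining; the documents, thresholds and paths are hypothetical, and the STAR method's content and semantic components are not modelled here):

```python
from itertools import combinations

# Toy XML documents flattened to sets of tag paths.
documents = [
    {"article/title", "article/author", "article/abstract"},
    {"article/title", "article/author", "article/references"},
    {"article/title", "article/abstract"},
]

def support(items):
    """Fraction of documents containing every path in `items`."""
    return sum(items <= doc for doc in documents) / len(documents)

min_support, min_confidence = 0.6, 0.8
paths = sorted({p for doc in documents for p in doc})
for a, b in combinations(paths, 2):
    s = support({a, b})
    if s >= min_support and s / support({a}) >= min_confidence:
        print(f"{a} -> {b}  (support {s:.2f}, confidence {s / support({a}):.2f})")
```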

Keywords: Semi-structured document, Tree-based Association Rule (TAR), Semantic Association Rule Mining.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2357
6853 A Virtual Grid Based Energy Efficient Data Gathering Scheme for Heterogeneous Sensor Networks

Authors: Siddhartha Chauhan, Nitin Kumar Kotania

Abstract:

Traditional Wireless Sensor Networks (WSNs) generally use static sinks to collect data from the sensor nodes via multiple forwarding. The network therefore suffers from problems such as long message relay time and bottlenecks, which reduce its performance.

Many approaches have been proposed to prevent these problems with the help of a mobile sink that collects the data from the sensor nodes, but these approaches still suffer from the buffer overflow problem due to the limited memory size of sensor nodes. This paper proposes an energy efficient scheme for data gathering which overcomes the buffer overflow problem. The proposed scheme creates a virtual grid structure of heterogeneous nodes and is designed for sensor nodes having variable sensing rates. Every node finds out its buffer overflow time, and on the basis of this, cluster heads are elected. A controlled traversing approach is used by the proposed scheme in order to transmit data to the sink. The effectiveness of the proposed scheme is verified by simulation.
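
A minimal sketch of the buffer-overflow-time idea and a per-cell cluster head election (illustrative only; the node parameters are hypothetical and the election rule is a simplification of the paper's scheme):

```python
from dataclasses import dataclass

@dataclass
class SensorNode:
    node_id: int
    buffer_size: int      # bytes of free buffer
    sensing_rate: float   # bytes generated per second (variable per node)
    grid_cell: tuple      # (row, col) of the virtual grid cell

    def overflow_time(self) -> float:
        """Seconds until this node's buffer overflows at its sensing rate."""
        return self.buffer_size / self.sensing_rate

def elect_cluster_heads(nodes):
    """Per virtual-grid cell, pick the node whose buffer overflows last
    (illustrative election rule; the paper's full criterion may differ)."""
    heads = {}
    for node in nodes:
        best = heads.get(node.grid_cell)
        if best is None or node.overflow_time() > best.overflow_time():
            heads[node.grid_cell] = node
    return heads

nodes = [SensorNode(1, 4096, 8.0, (0, 0)), SensorNode(2, 4096, 2.0, (0, 0)),
         SensorNode(3, 2048, 1.0, (0, 1))]
print({cell: n.node_id for cell, n in elect_cluster_heads(nodes).items()})
# {(0, 0): 2, (0, 1): 3}
```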

Keywords: Buffer overflow problem, Mobile sink, Virtual grid, Wireless sensor networks.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1829
6852 Flexural Strength of Alkali Resistant Glass Textile Reinforced Concrete Beam with Prestressing

Authors: Jongho Park, Taekyun Kim, Jungbhin You, Sungnam Hong, Sun-Kyu Park

Abstract:

As bridges age, maintenance costs increase and structural safety decreases. Steel corrosion is the most common problem in reinforced concrete bridges, and this phenomenon is accelerating due to abnormal weather and the increasing CO2 concentration caused by climate change. To solve these problems, composite members using textile have been studied. Textile reinforced concrete can reduce carbon emissions through reduced concrete volume and the absence of steel bars, so many studies of its structural behavior are needed. Therefore, in this study, a textile reinforced concrete beam was made and a flexural test was performed. The change in flexural strength with prestressing was also investigated. As a result, the flexural strength of the TRC beam with prestressing was increased, and its flexural behavior was similar to that of reinforced concrete.

Keywords: AR-glass, flexural strength, prestressing, textile reinforced concrete.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1188
6851 Spatial Data Science for Data Driven Urban Planning: The Youth Economic Discomfort Index for Rome

Authors: Iacopo Testi, Diego Pajarito, Nicoletta Roberto, Carmen Greco

Abstract:

Today, a consistent segment of the world’s population lives in urban areas, and this proportion will vastly increase in the next decades. Therefore, understanding the key trends in urbanization likely to unfold over the coming years is crucial to the implementation of sustainable urban strategies. In parallel, the daily amount of digital data produced will expand at an exponential rate during the following years. The analysis of various types of data sets and their derived applications has incredible potential across different crucial sectors such as healthcare, housing, transportation, energy, and education. Nevertheless, in city development, architects and urban planners appear to rely mostly on traditional and analogical techniques of data collection. This paper investigates the prospects of the data science field, which appears to be a formidable resource to assist city managers in identifying strategies to enhance the social, economic, and environmental sustainability of our urban areas. The collection of different new layers of information would definitely enhance planners' capabilities to comprehend more in-depth urban phenomena such as gentrification, land use definition, mobility, or critical infrastructural issues. Specifically, the research results correlate economic, commercial, demographic, and housing data with the purpose of defining the youth economic discomfort index. The statistical composite index provides insights regarding the economic disadvantage of citizens aged between 18 and 29 years, and the results clearly display that central urban zones are more disadvantaged than peripheral ones. The experimental setup selected the city of Rome as the testing ground of the whole investigation. The methodology aims at applying statistical and spatial analysis to construct a composite index supporting informed data-driven decisions for urban planning.
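
A minimal sketch of building a composite index from normalized indicators (the indicators, zones and weights below are hypothetical placeholders, not the study's data or weighting scheme):

```python
import numpy as np

# Hypothetical per-zone indicators for residents aged 18-29
# (rows: urban zones; columns: unemployment rate, rent-to-income ratio,
#  share of low-paid contracts). Higher values mean more discomfort.
indicators = np.array([[0.18, 0.45, 0.30],
                       [0.09, 0.30, 0.15],
                       [0.25, 0.55, 0.40]])
weights = np.array([0.4, 0.35, 0.25])   # illustrative weighting scheme

# Min-max normalize each indicator to [0, 1], then take the weighted sum.
ranges = indicators.max(axis=0) - indicators.min(axis=0)
normed = (indicators - indicators.min(axis=0)) / ranges
discomfort_index = normed @ weights
print(discomfort_index)   # one score per zone: 0 = least, 1 = most discomfort
```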

Keywords: Data science, spatial analysis, composite index, Rome, urban planning, youth economic discomfort index.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 911
6850 Feature Selection and Predictive Modeling of Housing Data Using Random Forest

Authors: Bharatendra Rai

Abstract:

Predictive data analysis and modeling involving machine learning techniques become challenging in the presence of too many explanatory variables or features. The presence of too many features in machine learning is known not only to slow algorithms down, but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios, and their impact on model accuracy is captured using the coefficient of determination (R-squared) and the root mean square error (RMSE).
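
A minimal sketch of comparing partitioning ratios with a random forest in scikit-learn, assuming that library is available (synthetic stand-in data; the Boruta selection step, typically done with a separate package such as BorutaPy, is omitted here):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                         # stand-in features
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(size=500)   # stand-in sale price

# Try several train/test partitioning ratios and record R-squared and RMSE.
for train_frac in (0.5, 0.6, 0.7, 0.8, 0.9):
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, train_size=train_frac, random_state=42)
    model = RandomForestRegressor(n_estimators=200, random_state=42).fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"train={train_frac:.0%}  R^2={r2_score(y_te, pred):.3f}  RMSE={rmse:.3f}")
```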

Keywords: Housing data, feature selection, random forest, Boruta algorithm, root mean square error.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1721
6849 Data Placement in Heterogeneous Storage of Short Videos

Authors: W. Jaipahkdee, C. Srinilta

Abstract:

The overall service performance of an I/O-intensive system depends mainly on the workload on its storage system. In a heterogeneous storage environment, where storage elements from different vendors with different capacities and performance are put together, workload should be distributed according to storage capability. This paper addresses the data placement issue in a short video sharing website. The workload contributed by a video is estimated from the number of views and the lifetime span of existing videos in the same category. An experiment was conducted on 42,000 video titles over six weeks. Results showed that the proposed algorithm distributed workload and maintained balance better than round robin and random algorithms.
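
A minimal sketch of capability-aware placement (the workload proxy, node capability scores and videos below are illustrative placeholders, not the paper's algorithm or data):

```python
def estimate_workload(views: int, lifetime_days: int) -> float:
    """Rough per-day workload of a video: total views spread over its life
    (illustrative proxy; the paper also uses category statistics)."""
    return views / max(lifetime_days, 1)

def place_videos(videos, storage_nodes):
    """Greedy placement: put the next-heaviest video on the node with the
    lowest workload-to-capability ratio (contrast with round robin)."""
    load = {node: 0.0 for node in storage_nodes}
    placement = {}
    for title, views, days in sorted(videos, key=lambda v: -estimate_workload(v[1], v[2])):
        target = min(storage_nodes, key=lambda n: load[n] / storage_nodes[n])
        load[target] += estimate_workload(views, days)
        placement[title] = target
    return placement

# Hypothetical nodes with relative capability scores and a few videos.
nodes = {"fast-ssd": 3.0, "mid-sas": 2.0, "slow-sata": 1.0}
videos = [("cat", 90_000, 30), ("talk", 4_000, 200), ("clip", 50_000, 10)]
print(place_videos(videos, nodes))
```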

Keywords: data placement, heterogeneous storage system, YouTube, short videos

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1491
6848 Effect of Concrete Nonlinear Parameters on the Seismic Response of Concrete Gravity Dams

Authors: Z. Heirany, M. Ghaemian

Abstract:

The behavior of dams under seismic loads has been studied by many researchers. Most of them proposed new numerical methods to investigate dam safety. In this paper, to study the effect of the nonlinear parameters of concrete in gravity dams, a two-dimensional approach was used, including the finite element method, the staggered method and the smeared crack approach. The effective parameters in the models are the physical properties of concrete, such as the modulus of elasticity, the tensile strength and the specific fracture energy. Two different foundation models (massless and massed) were used in order to determine the seismic response of concrete gravity dams. Results show that when the nonlinear analysis includes the dam-foundation interaction, the foundation's mass, flexibility and radiation damping are important for the gravity dam's response.

Keywords: Numerical methods, concrete gravity dams, finite element method, boundary condition.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2336
6847 The Basic Problems for the Realization of the Concept of Economic Policy

Authors: R. Gvelesiani, I. Gogorishvili

Abstract:

The concept of economic policy and practical economic policy diverge from each other at a certain stage of development. This is related to the problems underlying the realization of the concept. It comes with all the problems that have emerged in the market-oriented economic order due to political processes based on social welfare policy. The realization of the concept of economic policy is impeded by economic and political obstacles. If the gap that appears between the concept and reality is to be filled, it is necessary to identify and avoid these obstacles. This requires the following: increasing the level of knowledge of prevention technology in the understanding of economic relations, as well as of the political aspects of the formation of ideas; and perfecting the economic policy toolkit and political methods. It is necessary to realize what the main precondition is for the implementation and further development of the concept of economic policy, as well as for the formation of opinions about economic and public safety. This is a broad consensus on the basic values of the content and the scale of action, which the general public wants to be realized.

Keywords: Economic Policy, Basic Problems, Social Welfare Policy.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1277
6846 Malicious Route Defending Reliable-Data Transmission Scheme for Multi Path Routing in Wireless Network

Authors: S. Raja Ratna, R. Ravi

Abstract:

Securing the confidential data transferred via wireless networks remains a challenging problem. It is paramount to ensure that data are accessible only to legitimate users rather than to attackers. One of the most serious threats to an organization is jamming, which disrupts the communication between any two pairs of nodes. Therefore, designing an attack-defending scheme without any packet loss in data transmission is an important challenge. In this paper, a Dependence-based Malicious Route Defending (DMRD) scheme has been proposed in a multi path routing environment to prevent jamming attacks. The key idea is to defend the malicious route to ensure perspicuous transmission. This scheme develops a two-layered architecture and operates in two different steps. In the first step, possible routes are captured and their agent dependence values are marked using triple agents. In the second step, the dependence values are compared by performing comparator filtering to detect malicious routes as well as to identify a reliable route for secured data transmission. By simulation studies, it is observed that the proposed scheme significantly identifies malicious routes while attaining lower delay time and route discovery time; it also achieves higher throughput.

Keywords: Attacker, Dependence, Jamming, Malicious.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1754
6845 Directing the Forensic Investigation of a Catastrophic Structure Collapse: The Jacksonville Parking Garage Collapse

Authors: W. C. Bracken

Abstract:

This paper discusses the forensic investigation of a fatality-involved catastrophic structure collapse and the special challenges faced when tasked with directing such an effort. While this paper discusses the investigation’s findings and the outcome of the event, its primary focus is on the challenges faced when directing a forensic investigation that requires coordinating with governmental oversight while also having to accommodate multiple parties’ investigative teams. In particular, the challenges discussed within this paper include maintaining on-site safety and operations while accommodating outside investigators’ interests. In addition, this paper discusses unique challenges that one may face, such as what to do about unethical conduct by interested parties’ investigative teams, “off the record” sharing of information, and clandestinely transmitted evidence.

Keywords: Catastrophic structure collapse, collapse investigation, Jacksonville parking garage collapse, forensic investigation.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2102
6844 EDULOGIC+ - Knowledge Management through Data Analysis in Education

Authors: Alok Sharma, Dr. Harvinder S. Saini, Raviteja Tiruvury

Abstract:

This paper outlines the application of Knowledge Management (KM) principles in the context of educational institutions. The paper caters to the needs of engineering institutions for imparting quality education by delineating the instruction delivery process in a highly structured, controlled and quantified manner. This is done using a software tool, EDULOGIC+. The central idea is based on the engineering education pattern in Indian universities and institutions. The data, contents and results produced over contiguous years build the necessary ground for managing the related accumulated knowledge. The application of KM is explained using examples of data analysis and knowledge extraction.

Keywords: Education software system, information system, knowledge management.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1755
6843 Sparse Unmixing of Hyperspectral Data by Exploiting Joint-Sparsity and Rank-Deficiency

Authors: Fanqiang Kong, Chending Bian

Abstract:

In this work, we exploit two assumed properties of the abundances of the observed signatures (endmembers) in order to reconstruct the abundances from hyperspectral data. Joint-sparsity is the first property of the abundances, which assumes that adjacent pixels can be expressed as different linear combinations of the same materials. The second property is rank-deficiency, where the number of endmembers participating in the hyperspectral data is very small compared with the dimensionality of the spectral library, which means that the abundance matrix of the endmembers is a low-rank matrix. These assumptions lead to an optimization problem for the sparse unmixing model that requires minimizing a combined l2,p-norm and nuclear norm. We propose a variable splitting and augmented Lagrangian algorithm to solve the optimization problem. Experimental evaluation carried out on synthetic and real hyperspectral data shows that the proposed method outperforms state-of-the-art algorithms with a better spectral unmixing accuracy.
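
As a rough sketch of the kind of objective described (our notation and regularization weights, not necessarily the authors' exact formulation), with Y the observed pixels, A the spectral library and X the abundance matrix:

```latex
\min_{X \ge 0} \; \tfrac{1}{2}\,\lVert A X - Y \rVert_F^2
  \;+\; \lambda \,\lVert X \rVert_{2,p}
  \;+\; \tau \,\lVert X \rVert_{*}
```

Here the l2,p term promotes joint sparsity of the abundances across adjacent pixels, the nuclear norm enforces their rank-deficiency, and λ, τ are regularization parameters handled by the variable splitting and augmented Lagrangian scheme.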

Keywords: Hyperspectral unmixing, joint-sparse, low-rank representation, abundance estimation.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 771
6842 Application of Post-Stack and Pre-Stack Seismic Inversion for Prediction of Hydrocarbon Reservoirs in a Persian Gulf Gas Field

Authors: Nastaran Moosavi, Mohammad Mokhtari

Abstract:

Seismic inversion is a technique which has been in use for years, and its main goal is to estimate and model the physical characteristics of rocks and fluids. Generally, it is a combination of seismic and well-log data. Seismic inversion can be carried out through different methods; we have conducted and compared post-stack and pre-stack seismic inversion methods on real data in one of the fields in the Persian Gulf. Pre-stack seismic inversion can transform seismic data into rock physics properties such as P-impedance, S-impedance and density, while post-stack seismic inversion can only estimate P-impedance. These parameters can then be used in reservoir identification. Based on the results of inverting the seismic data, a gas reservoir was detected in one of the hydrocarbon fields in the south of Iran (Persian Gulf). By comparing post-stack and pre-stack seismic inversion, it can be concluded that pre-stack seismic inversion provides more reliable and detailed information for the identification and prediction of hydrocarbon reservoirs.

Keywords: Density, P-impedance, S-impedance, post-stack seismic inversion, pre-stack seismic inversion.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2231
6841 A Decision Support System for Predicting Hospitalization of Hemodialysis Patients

Authors: Jinn-Yi Yeh, Tai-Hsi Wu

Abstract:

Hemodialysis patients may suffer from unhealthy care behaviors or long-term dialysis treatments, and ultimately they need to be hospitalized. If the hospitalization rate of a hemodialysis center is high, its quality of service is considered low. Therefore, how to decrease the hospitalization rate is a crucial problem for health care. In this study, we combined temporal abstraction with data mining techniques to analyze dialysis patients' biochemical data and develop a decision support system. The mined temporal patterns are helpful for clinicians to predict the hospitalization of hemodialysis patients and to suggest treatments immediately in order to avoid hospitalization.
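
A minimal sketch of the temporal abstraction step, which turns a raw biochemical time series into qualitative state intervals (thresholds and values below are hypothetical; the paper's subsequent pattern mining is not shown):

```python
def abstract_state(value, low, high):
    """Map a raw biochemical reading to a qualitative state."""
    return "LOW" if value < low else "HIGH" if value > high else "NORMAL"

def temporal_abstraction(readings, low, high):
    """Collapse a time-ordered series of readings into (state, length)
    intervals, i.e. a basic state temporal abstraction."""
    intervals = []
    for value in readings:
        state = abstract_state(value, low, high)
        if intervals and intervals[-1][0] == state:
            intervals[-1] = (state, intervals[-1][1] + 1)
        else:
            intervals.append((state, 1))
    return intervals

# Hypothetical monthly albumin values (g/dL) for one patient; 3.5-5.0 "normal".
albumin = [4.1, 3.9, 3.4, 3.2, 3.3, 3.6]
print(temporal_abstraction(albumin, low=3.5, high=5.0))
# [('NORMAL', 2), ('LOW', 3), ('NORMAL', 1)]
```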

Keywords: Hemodialysis, Temporal abstraction, Data mining, Healthcare quality.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1733
6840 A Human Activity Recognition System Based On Sensory Data Related to Object Usage

Authors: M. Abdullah-Al-Wadud

Abstract:

Sensor-based activity recognition systems usually account for which sensors have been activated to perform an activity. The system then combines the conditional probabilities of those sensors to represent different activities and makes its decision based on that. However, information about the sensors which are not activated may also be of great help in deciding which activity has been performed. This paper proposes an approach where the sensory data related to both the usage and the non-usage of objects are utilized to classify activities. Experimental results also show the promising performance of the proposed method.
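
A minimal sketch of naive-Bayes-style scoring in which non-activated sensors also contribute evidence (the activities, sensors and probabilities below are toy placeholders, not the paper's trained model):

```python
# P(sensor fired | activity) for a toy two-activity, three-sensor setting.
P_ON = {
    "make_tea": {"kettle": 0.9, "fridge": 0.3, "tv_remote": 0.05},
    "watch_tv": {"kettle": 0.1, "fridge": 0.2, "tv_remote": 0.95},
}
PRIOR = {"make_tea": 0.5, "watch_tv": 0.5}

def classify(fired_sensors):
    scores = {}
    for activity, sensor_probs in P_ON.items():
        score = PRIOR[activity]
        for sensor, p_on in sensor_probs.items():
            # Multiply by P(on|activity) if the sensor fired, otherwise by
            # P(off|activity): the non-activated sensors contribute too.
            score *= p_on if sensor in fired_sensors else (1.0 - p_on)
        scores[activity] = score
    return max(scores, key=scores.get), scores

print(classify({"kettle"}))   # kettle on, fridge and TV remote silent -> make_tea
```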

Keywords: Naïve Bayesian-based classification, Activity recognition, sensor data, object-usage model.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1827
6839 Dominating Set Algorithm and Trust Evaluation Scheme for Secured Cluster Formation and Data Transferring

Authors: Y. Harold Robinson, M. Rajaram, E. Golden Julie, S. Balaji

Abstract:

This paper describes a proficient way of choosing cluster heads based on a dominating set algorithm in a wireless sensor network (WSN). The algorithm overcomes energy deterioration problems through this cluster head selection process. Clustering algorithms such as LEACH, EEHC and HEED enhance scalability in WSNs. The dominating set algorithm keeps the first node alive longer than the other protocols previously used. As the dominating set of cluster heads is directly connected to each node, the energy of the network is saved by eliminating the intermediate nodes in the WSN. Security and trust are pivotal in network messaging. Each cluster head is secured with a unique key, and a member can connect with the cluster head only if it is secured too. The secured trust model provides security for data transmission in the dominated set network with the group key. The concept can be extended by adding a mobile sink, for each cluster or for a number of clusters, to transmit data or messages between cluster heads and the base station. Data security is preferably high, and data loss can be prevented. The simulation demonstrates the concept of choosing cluster heads by the dominating set algorithm and trust evaluation using DSTE. The research done is rationalized.
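
A minimal sketch of a standard greedy dominating set heuristic for picking cluster heads (the neighbourhood graph is hypothetical, and this generic heuristic is not necessarily the paper's exact algorithm):

```python
def greedy_dominating_set(adjacency):
    """Standard greedy heuristic: repeatedly pick the node that covers the
    most still-uncovered nodes (itself plus its neighbours); the chosen
    nodes act as cluster heads directly connected to every other node."""
    uncovered = set(adjacency)
    heads = []
    while uncovered:
        best = max(adjacency, key=lambda n: len(({n} | adjacency[n]) & uncovered))
        heads.append(best)
        uncovered -= {best} | adjacency[best]
    return heads

# Hypothetical WSN neighbourhood graph (node -> set of one-hop neighbours).
graph = {
    1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2, 5},
    4: {2, 6}, 5: {3, 6}, 6: {4, 5},
}
print(greedy_dominating_set(graph))   # [2, 5]: every node is a head or a head's neighbour
```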

Keywords: Wireless Sensor Networks, LEACH, EEHC, HEED, DSTE.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1408
6838 Computer-based Systems for High Speed Vessels Navigators – Engineers Training

Authors: D. E. Gourgoulis, C. G. Yakinthos, M. G. Vassiliadou

Abstract:

With high speed vessels getting ever more sophisticated, travelling at higher and higher speeds and operating in areas of high maritime traffic density, training becomes of the highest priority to ensure that safety levels are maintained and risks are adequately mitigated. Training onboard the actual craft on the actual route still remains the most effective way for crews to gain experience. However, operational experience and incidents during the last 10 years demonstrate the need for supplementary training, whether in the area of simulation or man-to-man and man/machine interaction. Training and familiarisation of the crew is the most important aspect in preventing incidents. The use of simulator, computer and web based training systems in conjunction with onboard training focusing on critical situations will improve the man-machine interaction and thereby reduce the risk of accidents. Today, both ship simulator and bridge teamwork courses are becoming the norm in order to further improve emergency response and crisis management skills. One of the main causes of accidents is the human factor. An efficient way to reduce human errors is to provide high-quality training to the personnel and to select the navigators carefully.

Keywords: CBT - WBT systems, Human factors.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1531
6837 Weighted k-Nearest-Neighbor Techniques for High Throughput Screening Data

Authors: K. Kozak, M. Kozak, K. Stapor

Abstract:

The k-nearest neighbors (kNN) algorithm is a simple but effective method of classification. In this paper we present an extended version of this technique for chemical compounds used in High Throughput Screening, in which the distances of the nearest neighbors are taken into account. Our algorithm uses kernel weight functions as guidance for the process of defining activity in screening data. The proposed kernel weight function aims to combine properties of the graphical structure and the molecular descriptors of the screening compounds. We apply the modified kNN method to several experimental datasets from biological screens. The experimental results confirm the effectiveness of the proposed method.
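
A minimal sketch of kernel-weighted kNN with a Gaussian kernel (one possible kernel weight function; the descriptors and labels are toy placeholders, and the paper's kernel combining graph structure with molecular descriptors is not reproduced here):

```python
import math

def weighted_knn(query, samples, labels, k=3, bandwidth=1.0):
    """k-NN vote where each neighbour's vote is weighted by a Gaussian
    kernel of its distance to the query point."""
    dists = [(math.dist(query, x), y) for x, y in zip(samples, labels)]
    votes = {}
    for d, label in sorted(dists)[:k]:
        votes[label] = votes.get(label, 0.0) + math.exp(-(d / bandwidth) ** 2)
    return max(votes, key=votes.get)

# Toy descriptor vectors for screening compounds labelled active/inactive.
X = [(0.2, 1.1), (0.3, 0.9), (2.5, 2.4), (2.7, 2.9)]
y = ["active", "active", "inactive", "inactive"]
print(weighted_knn((0.5, 1.0), X, y, k=3))   # -> "active"
```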

Keywords: biological screening, kernel methods, KNN, QSAR

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2278
6836 The Use of Real Measurements and GPS Data for Noise Mapping of Riyadh City

Authors: M. A. Foda, K. A. Alsaif, M. M. ElMadany, A. S. Aguib

Abstract:

In this paper, the noise maps for the area encircled by the Second Ring Road in Riyadh city are developed based on real measured data. Sound level meters, GPS receivers to determine measurement positions, a database program to manage the measured data, and a program to develop the maps are used. A baseline noise level has been established at each short-term site so that subsequent monitoring may be conducted to describe changes in Riyadh's noise environment. Short-term sites are used to show typical daytime and nighttime noise levels at specific locations by short duration grab sampling.
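
A minimal sketch of interpolating a noise level between GPS-positioned measurement sites using inverse-distance weighting (a common choice for noise maps; the coordinates and dB levels are made up, and the authors' mapping software may use a different method):

```python
import math

def idw_noise_level(point, stations, power=2.0):
    """Interpolate the noise level at `point` from measured stations
    using inverse-distance weighting."""
    num = den = 0.0
    for (x, y), level_db in stations:
        d = math.dist(point, (x, y))
        if d == 0:
            return level_db            # exactly at a measured site
        w = 1.0 / d ** power
        num += w * level_db
        den += w
    return num / den

# Hypothetical GPS-positioned short-term sites (local metres) and dB(A) levels.
stations = [((0, 0), 68.0), ((400, 0), 72.5), ((0, 300), 61.0)]
print(round(idw_noise_level((150, 100), stations), 1))
```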

Keywords: Noise mapping, Noise measurements, GPS, noise level.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2163
6835 Transceiver for Differential Wave Pipe-Lined Serial Interconnect with Surfing

Authors: Bhaskar M., Venkataramani B.

Abstract:

In the literature, the surfing technique has been proposed for single-ended wave-pipelined serial interconnects to increase the data transfer rate. In this paper, a novel surfing technique is proposed for differential wave-pipelined serial interconnects, which uses a 'controllable inverter pair' for surfing. To evaluate the efficiency of this technique, a transceiver with transmitter, receiver, delay locked loop (DLL) and 40 mm metal 4 interconnects using the proposed surfing technique is implemented in UMC 180 nm technology, and its performance is studied through post-layout simulations. From the study, it is observed that the proposed scheme permits a 1.875 times higher data transmission rate compared to the single-ended scheme, whose maximum data transfer rate is 1.33 GB/s. The proposed scheme has the ability to receive the correct data even with stuck-at faults in the complementary line.

Keywords: Controllable inverter pair, differential interconnect, serial link, surfing, wave pipelining.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1673
6834 The Analysis of TRACE/PARCS in the Simulation of Ultimate Response Guideline for Lungmen ABWR

Authors: J. R. Wang, W.Y. Li, H.T. Lin, B.H. Lee, C. Shih, S.W. Chen

Abstract:

In this research, a TRACE/PARCS model of the Lungmen ABWR has been developed for verification of the ultimate response guideline (URG) efficiency. This ultimate measure was named the DIVing plan, abbreviated from system depressurization, water injection and containment venting. The simulation initial condition is 100% rated power/100% rated core flow. This research first focuses on estimating, with TRACE/PARCS, the time at which the fuel might be damaged with no water injection. Then, the effects of the reactor core isolation cooling system (RCIC), controlled depressurization and the ac-independent water addition system (ACIWA), which can provide injection at 950 gpm, are also estimated for the station blackout (SBO) transient.

Keywords: ABWR, TRACE, safety analysis, PARCS.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2224
6833 A Novel Digital Watermarking Technique Based on ISB (Intermediate Significant Bit)

Authors: Akram M. Zeki, Azizah A. Manaf

Abstract:

The Least Significant Bit (LSB) technique is the earliest developed technique in watermarking, and it is also the most simple, direct and common technique. It essentially involves embedding the watermark by replacing the least significant bit of the image data with a bit of the watermark data. The disadvantage of LSB is that it is not robust against attacks. In this study, the intermediate significant bit (ISB) has been used in order to improve the robustness of the watermarking system. The aim of this model is to replace the watermarked image pixels by new pixels that can protect the watermark data against attacks while at the same time keeping the new pixels very close to the original pixels, in order to protect the quality of the watermarked image. The technique is based on testing the value of the watermark pixel according to the range of each bit-plane.
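
A minimal sketch contrasting classic LSB embedding with a simplified ISB-style adjustment (illustrative only; the "nearest value carrying the bit" rule below is an assumption and not the paper's exact bit-plane-range test):

```python
def embed_lsb(pixel: int, bit: int) -> int:
    """Classic LSB embedding: overwrite the least significant bit."""
    return (pixel & ~1) | bit

def embed_isb(pixel: int, bit: int, plane: int) -> int:
    """Simplified ISB-style embedding: carry the watermark bit in bit-plane
    `plane` but move to the value nearest the original pixel that does so,
    limiting visible distortion."""
    candidates = [v for v in range(256) if (v >> plane) & 1 == bit]
    return min(candidates, key=lambda v: abs(v - pixel))

print(embed_lsb(154, 1))            # 155: LSB overwritten to carry a 1
print(embed_isb(154, 0, plane=3))   # 151: closer to 154 than plain overwrite (146)
```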

Keywords: Watermarking, LSB, ISB, Robustness.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1710
6832 Using Data Mining Technique for Scholarship Disbursement

Authors: J. K. Alhassan, S. A. Lawal

Abstract:

This work is on decision tree-based classification for the disbursement of scholarships. A tree-based data mining classification technique is used in order to determine the generic rules to be used to disburse the scholarship. Based on the rules defined from the tree, the system is able to determine the class (status) to which an applicant belongs, whether Granted or Not Granted. Applicants that fall into the Granted class denote a successful acquirement of the scholarship, while those in the Not Granted class are unsuccessful in the scheme. An algorithm that can be used to classify the applicants based on the rules from the tree-based classification was also developed. The tree-based classification is adopted because of its efficiency, effectiveness, and easy-to-comprehend features. The system was tested with data from the National Information Technology Development Agency (NITDA) Abuja, a parastatal of the Federal Ministry of Communication Technology that is mandated to develop and regulate information technology in Nigeria. The system was found to work according to the specification. It is therefore recommended for all scholarship disbursement organizations.
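
A minimal sketch of fitting a decision tree and reading the disbursement rules off it with scikit-learn, assuming that library is available (the applicant features and records below are invented placeholders, not NITDA's data):

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical applicant records: [CGPA, family income (thousands), credit units passed]
X = [[4.6, 120, 30], [3.1, 800, 28], [4.8, 200, 32],
     [2.5, 300, 20], [4.2, 150, 31], [3.0, 900, 25]]
y = ["Granted", "Not Granted", "Granted", "Not Granted", "Granted", "Not Granted"]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The generic disbursement rules can be read straight off the fitted tree.
print(export_text(tree, feature_names=["cgpa", "income", "units"]))
print(tree.predict([[4.4, 180, 29]]))   # -> ['Granted'] for this applicant
```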

Keywords: Decision tree, classification, data mining, scholarship.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2163
6831 Understanding Cruise Passengers’ On-board Experience throughout the Customer Decision Journey

Authors: Sabina Akter, Osiris Valdez Banda, Pentti Kujala, Jani Romanoff

Abstract:

This paper examines the relationship between on-board environmental factors and overall customer satisfaction in the context of the cruise on-board experience. The on-board environmental factors considered are ambient, layout/design, social, product/service and on-board enjoyment factors. The study presents a data-driven framework and model for the on-board cruise experience. The data were collected from 893 respondents through a self-administered online questionnaire about their cruise experience. This study reveals the cruise passengers’ on-board experience through the customer decision journey based on the publicly available data. Pearson correlation and regression analysis have been applied, and the results show a positive and significant relationship between the environmental factors and the on-board experience. These data help in understanding the cruise passengers’ on-board experience, which will be used in the ultimate decision-making process for cruise ship design.

Keywords: Cruise behavior, on-board environmental factors, on-board experience, user or customer satisfaction.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 878
6830 Development of Autonomous Cable Inspection Robot for Nuclear Power Plant

Authors: Jae-Kyung Lee, Byung-Hak Cho, Kyung-Nam Jang, Sun-Chul Jung, Ki-Yong Oh, Joon-Young Park, Jong-Seog Kim

Abstract:

The cables in a nuclear power plant are designed to be used for about 40 years in a safe operating environment. However, the heat and radiation in a nuclear power plant cause rapid performance deterioration of the cables in nuclear vessels and heat exchangers, which requires cable lifetime estimation. The most accurate method of estimating cable lifetime is to evaluate the cables in a laboratory. However, removing cables while the plant is operating is not allowed because of safety and cost. In this paper, a robot system to estimate cable lifetime in nuclear power plants is developed and tested. The developed robot system can calculate a modulus value to estimate the cable lifetime even when the nuclear power plant is in operation.

Keywords: Autonomous robot, Cable Inspection, Indenter, Nuclear Power Plant

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2018
6829 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

Modern experiments in high energy physics impose great demands on the reliability, the efficiency, and the data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to be able to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments; it provides a network-transparent inter-process communication layer. Using the high-performance and modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes of the iFDAQ during the 2016 run. From the software point of view, it may be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the debugging possibilities, the online monitoring of communication among processes via the DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in a detailed way. Moreover, the efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.

Keywords: Data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1128