Search results for: Protein Structure Data.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9858


7788 Application of Post-Stack and Pre-Stack Seismic Inversion for Prediction of Hydrocarbon Reservoirs in a Persian Gulf Gas Field

Authors: Nastaran Moosavi, Mohammad Mokhtari

Abstract:

Seismic inversion is a technique that has been in use for years; its main goal is to estimate and model the physical characteristics of rocks and fluids, and it generally combines seismic and well-log data. Seismic inversion can be carried out through different methods; we have conducted and compared post-stack and pre-stack seismic inversion on real data from one of the fields in the Persian Gulf. Pre-stack seismic inversion can transform seismic data into rock-physics properties such as P-impedance, S-impedance and density, whereas post-stack seismic inversion can only estimate P-impedance. These parameters can then be used in reservoir identification. Based on the results of inverting the seismic data, a gas reservoir was detected in one of the hydrocarbon fields in the south of Iran (Persian Gulf). By comparing post-stack and pre-stack seismic inversion, it can be concluded that pre-stack seismic inversion provides more reliable and detailed information for the identification and prediction of hydrocarbon reservoirs.

Keywords: Density, P-impedance, S-impedance, post-stack seismic inversion, pre-stack seismic inversion.
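
As a rough illustration of the post-stack (acoustic) case described above, the sketch below recovers a P-impedance profile from a reflectivity series using the classical recursive relation; the reflectivity values and starting impedance are hypothetical, not data from the Persian Gulf field.

```python
import numpy as np

def recursive_impedance(reflectivity, z0):
    """Recover a P-impedance profile from a reflectivity series.

    Uses the classic recursive relation Z[i+1] = Z[i] * (1 + r[i]) / (1 - r[i]),
    the core of simple post-stack (acoustic) inversion.
    """
    z = [z0]
    for r in reflectivity:
        z.append(z[-1] * (1.0 + r) / (1.0 - r))
    return np.array(z)

# Toy example: a synthetic reflectivity series and a starting impedance
# taken from a hypothetical well log (values are illustrative only).
refl = np.array([0.05, -0.02, 0.10, 0.01])
print(recursive_impedance(refl, z0=4.5e6))   # impedance in kg/(m^2*s)
```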

7787 Comparative in silico and in vitro Study of N-(1-Methyl-2-Oxo-2-N-Methyl Anilino-Ethyl) Benzene Sulfonamide and Its Analogues as an Anticancer Agent

Authors: Pamita Awasthi, Kirna, Shilpa Dogra, Manu Vatsal, Ritu Barthwal

Abstract:

Doxorubicin, also known as Adriamycin, is an anthracycline-class drug used in cancer chemotherapy. It is used in the treatment of non-Hodgkin’s lymphoma, multiple myeloma, acute leukemia, breast cancer, lung cancer, endometrial cancer and ovarian cancer. It functions by intercalating DNA and ultimately killing cancer cells. The major side effects of doxorubicin are hair loss, myelosuppression, nausea and vomiting, oesophagitis, diarrhea, heart damage and liver dysfunction. The fact that minor modifications in the structure of a compound can produce large variations in biological activity prompted us to carry out the synthesis of sulfonamide derivatives. The sulfonamide group is an important pharmacophore with a broad spectrum of biological activities, including antiviral, antifungal, diuretic, anti-inflammatory, antibacterial and anticancer activity. The structure of the synthesized compound N-(1-methyl-2-oxo-2-N-methyl anilinoethyl) benzene sulfonamide was confirmed by proton nuclear magnetic resonance (1H NMR), 13C NMR, mass and FTIR spectroscopy to assign the positions of all protons and hence the stereochemistry of the molecule. We further report the binding potential of the synthesized sulfonamide analogues in comparison with doxorubicin using AutoDock 4.2 software. The computational binding energy (B.E.) and inhibition constant (Ki) were evaluated for the synthesized compound, in comparison with doxorubicin, against Poly(dA-dT).Poly(dA-dT) and Poly(dG-dC).Poly(dG-dC) sequences. An in vitro cytotoxicity study against human breast cancer cell lines confirms the better anticancer activity of the synthesized compound over the currently used anticancer drug doxorubicin: the IC50 value of the synthesized compound is 7.12 μM, whereas that of doxorubicin is 7.2 μM.

Keywords: Anticancer, Auto Dock, Doxorubicin, Sulfonamide.
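
For context on how a docking score relates to the reported inhibition constant, the sketch below applies the standard thermodynamic relation Ki = exp(ΔG/RT); the binding energies used are made-up placeholders, not the values the authors obtained with AutoDock 4.2.

```python
import math

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.15     # temperature, K

def inhibition_constant(delta_g_kcal_per_mol):
    """Convert a docking binding free energy (kcal/mol) to an
    estimated inhibition constant Ki (molar): Ki = exp(dG / RT)."""
    return math.exp(delta_g_kcal_per_mol / (R * T))

# Illustrative (made-up) binding energies for a ligand vs. doxorubicin
for name, dg in [("sulfonamide analogue", -8.1), ("doxorubicin", -8.4)]:
    ki = inhibition_constant(dg)
    print(f"{name}: dG = {dg} kcal/mol, Ki = {ki * 1e6:.2f} uM")
```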

7786 A Study on Abnormal Behavior Detection in BYOD Environment

Authors: Dongwan Kang, Joohyung Oh, Chaetae Im

Abstract:

The advancement of communication technologies and smart devices in recent times is leading to changes in integrated wired and wireless communication environments. Businesses began introducing mobile devices into their operations early on in order to improve productivity (efficiency), and the closed corporate environment gradually shifted to an open structure. Recently, individual users' interest in working with mobile devices has increased, and a new corporate working environment based on the concept of BYOD is drawing attention. BYOD (bring your own device) is a concept in which individuals bring and use their own devices for business activities. Through BYOD, businesses can anticipate improved productivity (efficiency) as well as a reduction in the cost of purchasing devices. However, because of security threats caused by the frequent loss and theft of personal devices and corporate data leaks due to weak security, companies are reluctant to adopt BYOD systems. In addition, without consideration of the diverse devices and connection environments involved, existing network-based security equipment is limited in its ability to detect abnormal behaviors such as information leaks. This study suggests a method to detect abnormal behaviors based on individual behavioral patterns, rather than the existing signature-based malicious behavior detection, and discusses applications of this method in the BYOD environment.

Keywords: BYOD, Security, Anomaly Behavior Detection.
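
A minimal sketch of the per-user behavioural-baseline idea (not the authors' detection engine): each user's history defines a mean and spread per feature, and a day is flagged when it deviates strongly from that personal baseline. The feature names and threshold are assumptions.

```python
import numpy as np

def fit_baseline(history):
    """Per-user behavioural baseline from past daily feature vectors
    (e.g. bytes uploaded, off-hours logins, distinct hosts contacted)."""
    h = np.asarray(history, dtype=float)
    return h.mean(axis=0), h.std(axis=0) + 1e-9

def is_abnormal(sample, mean, std, threshold=3.0):
    """Flag a day as abnormal if any feature deviates more than
    `threshold` standard deviations from that user's own baseline."""
    z = np.abs((np.asarray(sample, dtype=float) - mean) / std)
    return bool((z > threshold).any()), z

history = [[120, 0, 5], [150, 1, 6], [130, 0, 4], [140, 0, 5]]   # illustrative
mean, std = fit_baseline(history)
flag, z = is_abnormal([900, 3, 20], mean, std)
print(flag, z.round(1))
```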

7785 A Decision Support System for Predicting Hospitalization of Hemodialysis Patients

Authors: Jinn-Yi Yeh, Tai-Hsi Wu

Abstract:

Hemodialysis patients may suffer from unhealthy care behaviors or the effects of long-term dialysis treatment and ultimately need to be hospitalized. If the hospitalization rate of a hemodialysis center is high, its quality of service is considered low; reducing the hospitalization rate is therefore a crucial problem in health care. In this study, we combined temporal abstraction with data mining techniques to analyze dialysis patients' biochemical data and develop a decision support system. The mined temporal patterns help clinicians predict the hospitalization of hemodialysis patients and suggest timely treatments to avoid hospitalization.

Keywords: Hemodialysis, Temporal abstraction, Data mining, Healthcare quality.
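
A minimal sketch of the temporal-abstraction step, assuming a simple Low/Normal/High discretisation of one biochemical series; the albumin values and normal range are illustrative, not taken from the study.

```python
def abstract_states(times, values, low, high):
    """Basic temporal abstraction: map a biochemical time series to
    qualitative states and merge consecutive equal states into intervals."""
    def state(v):
        return "Low" if v < low else ("High" if v > high else "Normal")

    intervals = []
    for t, v in zip(times, values):
        s = state(v)
        if intervals and intervals[-1][2] == s:
            intervals[-1][1] = t                 # extend the current interval
        else:
            intervals.append([t, t, s])          # open a new interval
    return [tuple(i) for i in intervals]

# Illustrative monthly albumin values (g/dL) with a normal range of 3.5-5.0
print(abstract_states([1, 2, 3, 4, 5], [3.9, 3.6, 3.2, 3.1, 3.8], 3.5, 5.0))
# -> [(1, 2, 'Normal'), (3, 4, 'Low'), (5, 5, 'Normal')]
```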

7784 A Human Activity Recognition System Based On Sensory Data Related to Object Usage

Authors: M. Abdullah-Al-Wadud

Abstract:

Sensor-based activity recognition systems usually account only for the sensors that have been activated while an activity is performed; the system then combines the conditional probabilities of those sensors to represent different activities and makes its decision on that basis. However, information about the sensors that are not activated may also be of great help in deciding which activity has been performed. This paper proposes an approach in which sensory data related to both the usage and non-usage of objects are utilized to classify activities. Experimental results show the promising performance of the proposed method.

Keywords: Naïve Bayesian-based classification, Activity recognition, sensor data, object-usage model.
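
The description maps naturally onto a Bernoulli naive Bayes classifier, where a 0 (object not used) contributes to the likelihood just as a 1 does. The sketch below, with invented sensors and activities, shows the idea; it is not the authors' implementation.

```python
from sklearn.naive_bayes import BernoulliNB

# Each row is one activity instance; each column is a binary object-usage
# sensor (1 = object used, 0 = not used). Because the features are Bernoulli,
# a 0 (non-usage) contributes to the likelihood as well, which is the point
# made in the abstract. Data below are illustrative only.
X = [
    [1, 1, 0, 0],   # kettle, cup          -> "make tea"
    [1, 1, 0, 1],   # kettle, cup, fridge  -> "make tea"
    [0, 0, 1, 1],   # pan, fridge          -> "cook meal"
    [0, 1, 1, 1],   # cup, pan, fridge     -> "cook meal"
]
y = ["make tea", "make tea", "cook meal", "cook meal"]

clf = BernoulliNB(alpha=1.0).fit(X, y)
print(clf.predict([[1, 0, 0, 0]]))          # kettle only
print(clf.predict_proba([[0, 0, 1, 0]]))    # pan only
```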

7783 Dominating Set Algorithm and Trust Evaluation Scheme for Secured Cluster Formation and Data Transferring

Authors: Y. Harold Robinson, M. Rajaram, E. Golden Julie, S. Balaji

Abstract:

This paper describes an efficient way of choosing cluster heads based on a dominating set algorithm in a wireless sensor network (WSN). The selection process overcomes energy-deterioration problems. Clustering algorithms such as LEACH, EEHC and HEED enhance scalability in WSNs; the dominating set algorithm keeps the first node alive longer than these previously used protocols. As the dominating set of cluster heads is directly connected to every node, the energy of the network is saved by eliminating intermediate nodes. Security and trust are pivotal in network messaging: each cluster head is secured with a unique key, and a member node can connect to a cluster head only if it is secured as well. The secured trust model provides security for data transmission in the dominating-set network using a group key. The concept can be extended by adding a mobile sink, for each cluster or for a number of clusters, to transmit data and messages between cluster heads and the base station. Data security is kept high and data loss can be prevented. The simulation demonstrates cluster-head selection by the dominating set algorithm and trust evaluation using DSTE.

Keywords: Wireless Sensor Networks, LEACH, EEHC, HEED, DSTE.
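
A minimal sketch of greedy cluster-head selection via a dominating set, assuming a simple adjacency-map topology; the trust and key-management part of the scheme is not modelled here.

```python
def greedy_dominating_set(adjacency):
    """Greedy approximation of a dominating set: repeatedly pick the node
    that covers the most still-uncovered nodes. The chosen nodes act as
    cluster heads; every other node is adjacent to at least one head."""
    uncovered = set(adjacency)
    heads = []
    while uncovered:
        best = max(adjacency, key=lambda n: len(({n} | adjacency[n]) & uncovered))
        heads.append(best)
        uncovered -= {best} | adjacency[best]
    return heads

# Toy WSN topology as an adjacency map (node -> set of neighbours)
topology = {
    "A": {"B", "C"}, "B": {"A", "C", "D"}, "C": {"A", "B"},
    "D": {"B", "E"},  "E": {"D", "F"},     "F": {"E"},
}
print(greedy_dominating_set(topology))   # ['B', 'E']
```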

7782 Transmission Line Congestion Management Using Hybrid Fish-Bee Algorithm with Unified Power Flow Controller

Authors: P. Valsalal, S. Thangalakshmi

Abstract:

There is a widespread changeover in the electrical power industry worldwide from the old-style monopolistic structure towards a horizontally distributed competitive structure to meet the demand of rising consumption. When the transmission lines of a deregulated system are incapable of accommodating all service needs, the lines become overloaded or congested. The intermediary between customers and power producers, the Independent System Operator (ISO), is appointed to lessen congestion without violating transmission line limits. Among the existing approaches to congestion management, the most frequently used are generation rescheduling and load curtailment. There is a limit to rescheduling the generators, and further load may not be served with the prevailing resources unless more private power producers are added to the system at considerably higher cost. Hence, congestion is relieved by appropriate Flexible AC Transmission System (FACTS) devices, which boost the existing transfer capacity of transmission lines. The FACTS device chosen here is the Unified Power Flow Controller (UPFC); its correct placement is vital, and it should be positioned in the most congested line. The weak line is therefore identified using a power flow performance index and a new objective function with the proposed hybrid Fish-Bee algorithm. Locating the UPFC on the appropriate line reduces branch loading and minimizes voltage deviation. The power transfer capacity of the lines is determined with and without the UPFC in the identified congested line of the IEEE 30-bus system, and the simulated results are compared with prevailing algorithms. It is observed that the transfer capacity of the existing line is increased with the presented algorithm, thus alleviating congestion.

Keywords: Available line transfer capability, congestion management, FACTS device, hybrid fish-bee algorithm, ISO, UPFC.

7781 The Use of Real Measurements and GPS Data for Noise Mapping of Riyadh City

Authors: M. A. Foda, K. A. Alsaif, M. M. ElMadany, A.S. Aguib

Abstract:

In this paper, noise maps for the area encircled by the Second Ring Road in Riyadh city are developed based on real measured data. Sound level meters, GPS receivers to determine measurement positions, a database program to manage the measured data, and a program to develop the maps are used. A baseline noise level has been established at each short-term site so that subsequent monitoring may be conducted to describe changes in Riyadh's noise environment. Short-term sites are used to show typical daytime and nighttime noise levels at specific locations by short-duration grab sampling.

Keywords: Noise mapping, Noise measurements, GPS, noise level.
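
One conventional way to summarise short-duration grab samples at a site is the energy-equivalent level Leq; the sketch below assumes this aggregation and uses invented dB(A) readings rather than the Riyadh measurements.

```python
import numpy as np

def equivalent_level(samples_db):
    """Energy-average (Leq) of short-term sound level samples in dB(A):
    Leq = 10 * log10(mean(10^(L/10)))."""
    levels = np.asarray(samples_db, dtype=float)
    return 10.0 * np.log10(np.mean(10.0 ** (levels / 10.0)))

# Illustrative grab samples at one GPS-tagged short-term site
daytime = [68.2, 71.5, 69.9, 73.1, 70.4]
print(f"Leq = {equivalent_level(daytime):.1f} dB(A)")
```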

7780 A Distributed Cognition Framework to Compare E-Commerce Websites Using Data Envelopment Analysis

Authors: C. lo Storto

Abstract:

This paper presents an approach based on a distributed cognition framework and a non-parametric multicriteria evaluation methodology (DEA), designed specifically to compare e-commerce websites from the consumer/user viewpoint. In particular, the framework considers a website's relative efficiency as a measure of its quality and usability. A website is modelled as a black box capable of providing the consumer/user with a set of functionalities. When the consumer/user interacts with the website to perform a task, he/she is involved in a cognitive activity, sustaining a cognitive cost to search, interpret and process information, and experiencing a sense of satisfaction. The degree of ambiguity and uncertainty he/she perceives and the search time needed determine the size of the effort, and hence the amount of cognitive cost, he/she has to sustain to perform the task. Conversely, performing the task and achieving the result induce a sense of gratification, satisfaction and usefulness. In total, 9 variables are measured, classified into 3 website macro-dimensions (user experience, site navigability and structure). The framework is implemented to compare 40 websites of businesses performing electronic commerce in the information technology market. A questionnaire to collect subjective judgements on the websites in the sample was purposely designed and administered to 85 university students enrolled in computer science and information systems engineering undergraduate courses.

Keywords: Website, e-commerce, DEA, distributed cognition, evaluation, comparison.
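
A compact sketch of the input-oriented CCR DEA model in multiplier form, solved as a linear program; the input/output variables and values are hypothetical stand-ins for the cognitive-cost and satisfaction measures described above.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, o):
    """Input-oriented CCR efficiency of DMU `o` (multiplier form):
    maximise u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0."""
    X, Y = np.asarray(inputs, float), np.asarray(outputs, float)
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    c = np.concatenate([np.zeros(m), -Y[o]])               # minimise -u.y_o
    A_ub = np.hstack([-X, Y])                               # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([X[o], np.zeros(s)]).reshape(1, -1)   # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (m + s), method="highs")
    return -res.fun

# Illustrative data: inputs = [cognitive cost, search time], output = [satisfaction]
X = [[3.0, 40], [2.0, 25], [4.5, 60]]
Y = [[7.0], [8.5], [6.0]]
for j in range(3):
    print(f"website {j}: efficiency = {ccr_efficiency(X, Y, j):.3f}")
```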

7779 Transceiver for Differential Wave Pipe-Lined Serial Interconnect with Surfing

Authors: Bhaskar M., Venkataramani B.

Abstract:

In the literature, a surfing technique has been proposed for single-ended wave-pipelined serial interconnects to increase the data transfer rate. In this paper, a novel surfing technique is proposed for differential wave-pipelined serial interconnects, which uses a 'controllable inverter pair' for surfing. To evaluate the efficiency of this technique, a transceiver with transmitter, receiver and delay locked loop (DLL), along with 40 mm metal-4 interconnects using the proposed surfing technique, is implemented in UMC 180 nm technology, and its performance is studied through post-layout simulations. From the study, it is observed that the proposed scheme permits a 1.875 times higher data transmission rate compared to the single-ended scheme, whose maximum data transfer rate is 1.33 GB/s. The proposed scheme is able to receive the correct data even with stuck-at faults in the complementary line.

Keywords: Controllable inverter pair, differential interconnect, serial link, surfing, wave pipelining.

7778 A Novel Digital Watermarking Technique Based on ISB (Intermediate Significant Bit)

Authors: Akram M. Zeki, Azizah A. Manaf

Abstract:

The Least Significant Bit (LSB) technique is the earliest developed technique in watermarking, and it is also the simplest, most direct and most common. It essentially involves embedding the watermark by replacing the least significant bit of the image data with a bit of the watermark data. The disadvantage of LSB is that it is not robust against attacks. In this study, the intermediate significant bit (ISB) is used in order to improve the robustness of the watermarking system. The aim of this model is to replace the watermarked image pixels with new pixels that can protect the watermark data against attacks while keeping the new pixels very close to the original pixels in order to preserve the quality of the watermarked image. The technique is based on testing the value of the watermarked pixel against the range of each bit-plane.

Keywords: Watermarking, LSB, ISB, Robustness.
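
A minimal sketch of embedding in an intermediate bit-plane while limiting distortion; choosing among same-bit candidates closest to the original pixel is one simple heuristic and is an assumption, not necessarily the exact range test used by the authors.

```python
def embed_isb(pixel, bit, plane=3):
    """Embed one watermark bit into bit-plane `plane` of an 8-bit pixel,
    then choose, among values carrying the same bit in that plane, the one
    closest to the original pixel to limit visual distortion."""
    step = 1 << plane
    base = (int(pixel) & ~step) | (bit << plane)      # force the chosen bit
    candidates = [base - 2 * step, base, base + 2 * step]
    valid = [c for c in candidates if 0 <= c <= 255]
    return min(valid, key=lambda c: abs(c - int(pixel)))

def extract_isb(pixel, plane=3):
    return (int(pixel) >> plane) & 1

p = 141
w = embed_isb(p, bit=0)
print(w, extract_isb(w))   # watermarked pixel and the recovered bit
```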

7777 Using Data Mining Technique for Scholarship Disbursement

Authors: J. K. Alhassan, S. A. Lawal

Abstract:

This work concerns decision tree-based classification for the disbursement of scholarships. A tree-based data mining classification technique is used in order to determine the generic rules for disbursing the scholarship. Based on the rules defined by the tree, the system is able to determine the class (status) to which an applicant belongs, either Granted or Not Granted. Applicants in the Granted class successfully obtain the scholarship, while those in the Not Granted class are unsuccessful in the scheme. An algorithm that can classify the applicants based on the rules from the tree-based classification was also developed. Tree-based classification is adopted because of its efficiency, effectiveness and ease of comprehension. The system was tested with data from the National Information Technology Development Agency (NITDA), Abuja, a parastatal of the Federal Ministry of Communication Technology mandated to develop and regulate information technology in Nigeria. The system was found to work according to the specification. It is therefore recommended for all scholarship disbursement organizations.

Keywords: Decision tree, classification, data mining, scholarship.
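
A minimal sketch of the tree-based classification into Granted / Not Granted, using scikit-learn with invented applicant features; the real rules of course come from the NITDA data, not from this toy set.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Illustrative applicant records: [CGPA, household_income_k, is_state_resident]
X = [[4.5, 120, 1], [3.9, 300, 1], [4.8, 90, 0], [2.9, 150, 1], [4.2, 110, 1]]
y = ["Granted", "Not Granted", "Granted", "Not Granted", "Granted"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# The learned rules can be read off the tree and used as the generic
# disbursement rules described in the abstract.
print(export_text(tree, feature_names=["cgpa", "income_k", "resident"]))
print(tree.predict([[4.0, 100, 1]]))
```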

7776 Understanding Cruise Passengers’ On-board Experience throughout the Customer Decision Journey

Authors: Sabina Akter, Osiris Valdez Banda, Pentti Kujala, Jani Romanoff

Abstract:

This paper examines the relationship between on-board environmental factors and overall customer satisfaction in the context of the cruise on-board experience. The on-board environmental factors considered are ambient, layout/design, social, product/service and on-board enjoyment factors. The study presents a data-driven framework and model for the on-board cruise experience. The data were collected from 893 respondents through a self-administered online questionnaire about their cruise experience. This study reveals the cruise passengers' on-board experience through the customer decision journey, based on the publicly available data. Pearson correlation and regression analysis have been applied, and the results show a positive and significant relationship between the environmental factors and the on-board experience. These data help in understanding the cruise passengers' on-board experience, which will be used in the ultimate decision-making process in cruise ship design.

Keywords: Cruise behavior, on-board environmental factors, on-board experience, user or customer satisfaction.
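
A minimal sketch of the Pearson correlation and simple regression step on synthetic stand-in data; the factor scores and effect size are invented for illustration and are not the survey results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Illustrative stand-ins for one environmental factor score (e.g. ambient)
# and overall on-board satisfaction, both on a 1-7 scale.
ambient = rng.uniform(1, 7, size=200)
satisfaction = 1.5 + 0.6 * ambient + rng.normal(0, 0.8, size=200)

r, p = stats.pearsonr(ambient, satisfaction)
reg = stats.linregress(ambient, satisfaction)
print(f"Pearson r = {r:.2f} (p = {p:.3g})")
print(f"satisfaction ~ {reg.intercept:.2f} + {reg.slope:.2f} * ambient, "
      f"R^2 = {reg.rvalue ** 2:.2f}")
```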

7775 Multiple Subcarrier Indoor Geolocation System in MIMO-OFDM WLAN APs Structure

Authors: Abdul Hafiizh, Shigeki Obote, Kenichi Kagoshima

Abstract:

This report aims to utilize the characteristics of existing and future Multiple-Input Multiple-Output Orthogonal Frequency Division Multiplexing Wireless Local Area Network (MIMO-OFDM WLAN) systems, such as multiple subcarriers, multiple antennas and channel estimation, for indoor location estimation based on the Direction of Arrival (DOA) and Received Signal Strength Indication (RSSI) methods. A hybrid DOA-RSSI method is also evaluated. The experimental results show that location estimation accuracy can be increased by minimizing the multipath fading effect; this is done by using multiple subcarrier frequencies over a wide band to estimate one location. The proposed methods are analyzed in both a wide indoor environment and a typical room-sized office. In the experiments, WLAN terminal locations are estimated by measuring multiple subcarriers from arrays of three dipole antennas at the access points (APs). This research demonstrates highly accurate, robust and hardware-free add-on software for indoor location estimation based on a MIMO-OFDM WLAN system.

Keywords: Direction of Arrival (DOA), Indoor location estimation method, Multipath Fading, MIMO-OFDM, Received Signal Strength Indication (RSSI), WLAN, Hybrid DOA-RSSI
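
A minimal sketch of the RSSI side of the method: a log-distance path-loss inversion averaged over several subcarriers to damp frequency-selective fading. The reference power, path-loss exponent and readings are assumptions, not measured parameters from the study.

```python
import numpy as np

def rssi_distance(rssi_dbm, p0_dbm=-40.0, d0=1.0, n=2.7):
    """Log-distance path-loss model: d = d0 * 10^((P0 - RSSI) / (10 n))."""
    return d0 * 10.0 ** ((p0_dbm - np.asarray(rssi_dbm, float)) / (10.0 * n))

# Per-subcarrier RSSI readings (dBm) from one AP; averaging the distance
# estimates over many subcarriers reduces the frequency-selective fading error.
subcarrier_rssi = [-62.1, -58.4, -64.9, -60.3, -61.7]
print(f"estimated range: {rssi_distance(subcarrier_rssi).mean():.1f} m")
```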

7774 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

Modern experiments in high energy physics impose great demands on the reliability, efficiency and data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments; it provides a network-transparent inter-process communication layer. Using the high-performance, modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes of the iFDAQ during the 2016 run. From the software point of view, it can be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the debugging possibilities, online monitoring of the communication among processes via the DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in detail. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.

Keywords: Data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP.

7773 Linking OpenCourseWares and Open Education Resources: Creating an Effective Search and Recommendation System

Authors: Brett E. Shelton, Joel Duffin, Yuxuan Wang, Justin Ball

Abstract:

With a growing number of digital libraries and other open education repositories being made available throughout the world, effective search and retrieval tools that surpass traditional, all-inclusive search engines are necessary to access the desired materials. This paper discusses the design and use of Folksemantic, a platform that integrates OpenCourseWare search, Open Educational Resource recommendations, and social network functionality into a single open source project. The paper describes how the system was originally envisioned, its goals for users, and data that provide insight into how it is actually being used. Data sources include website click-through data, query logs, web server log files and user account data. Based on a descriptive analysis of its current use, modifications to the platform's design are recommended to better address the goals of the system, along with recommendations for additional phases of research.

Keywords: Digital libraries, open education, recommendation system, social networks

7772 Estimating Shortest Circuit Path Length Complexity

Authors: Azam Beg, P. W. Chandana Prasad, S.M.N.A Senenayake

Abstract:

When binary decision diagrams are formed from uniformly distributed Monte Carlo data for a large number of variables, the complexity of the decision diagrams exhibits a predictable relationship to the number of variables and minterms. In the present work, a neural network model has been used to analyze the pattern of shortest path length for larger numbers of Monte Carlo data points. The neural model shows strong descriptive power for the ISCAS benchmark data, with an RMS error of 0.102 for the shortest path length complexity. Therefore, the model can be considered a method of predicting path length complexities; this is expected to lead to minimum time complexity of very large-scale integrated circuits and related computer-aided design tools that use binary decision diagrams.

Keywords: Monte Carlo circuit simulation data, binary decision diagrams, neural network modeling, shortest path length estimation

7771 Welding Process Selection for Storage Tank by Integrated Data Envelopment Analysis and Fuzzy Credibility Constrained Programming Approach

Authors: Rahmad Wisnu Wardana, Eakachai Warinsiriruk, Sutep Joy-A-Ka

Abstract:

Selecting the most suitable welding process usually depends on experience or on common practice in similar companies. However, this approach generally ignores many criteria that can affect the selection of a suitable welding process. Knowledge automation through knowledge-based systems can therefore significantly improve the decision-making process. This research proposes an integrated data envelopment analysis (DEA) and fuzzy credibility constrained programming approach for identifying the best welding process for a stainless steel storage tank in the food and beverage industry. The proposed approach uses fuzzy concepts and a credibility measure to deal with uncertain data from expert judgment. Furthermore, 12 parameters are used to determine the most appropriate among six competing welding processes.

Keywords: Welding process selection, data envelopment analysis, fuzzy credibility constrained programming, storage tank.

7770 The Application of Queuing Theory in Multi-Stage Production Lines

Authors: Hani Shafeek, Muhammed Marsudi

Abstract:

The purpose of this work is to examine a multi-product, multi-stage battery production line and to improve the performance of the assembly line by determining the efficiency of each workstation. Data were collected from every workstation: the throughput rate, the number of operators, and the number of parts that arrive and leave during processing. At least ten samples of the arrival and departure counts were collected so that the data could be analyzed with the Chi-Squared goodness-of-fit test and queuing theory. Measures from this model served as the comparison with the standard data available in the company, and the task time values were validated by comparing them with the task times in the company database. Several performance factors for the multi-product, multi-stage battery production line are reported, together with the efficiency of each workstation. The total production time for each part can be determined by adding the total task times of all workstations. Based on the analysis, improvements should be made to reduce queuing time and increase efficiency; one possible action is to increase the number of operators who manually operate a workstation.

Keywords: Production line, manufacturing, performance measurement, queuing theory.
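
Assuming each workstation can be approximated as an M/M/1 queue (a common simplification; the paper's Chi-squared test is what justifies the distributional choice in practice), the sketch below computes utilisation, queue length and waiting time per station with invented rates.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 measures for one workstation:
    utilisation rho, mean queue length Lq and mean waiting time Wq."""
    rho = arrival_rate / service_rate
    if rho >= 1.0:
        raise ValueError("unstable station: arrival rate >= service rate")
    lq = rho ** 2 / (1.0 - rho)
    wq = lq / arrival_rate
    return rho, lq, wq

# Illustrative rates (parts per hour) for three stations of the line
for name, lam, mu in [("pasting", 42, 50), ("curing", 42, 48), ("assembly", 42, 55)]:
    rho, lq, wq = mm1_metrics(lam, mu)
    print(f"{name}: rho={rho:.2f}, Lq={lq:.2f} parts, Wq={wq * 60:.1f} min")
```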

7769 Investigation and Identification of a Number of Precious and Semi-Precious Stones Related to Bam Historical Citadel Using Micro Raman Spectroscopy and Scanning Electron Microscopy

Authors: Nazli Darkhal

Abstract:

The use of gems and ornaments has been common in Iran since the beginning of history. The prosperity of the country, its wealth, and the interest of the people of this land in a luxurious and glorious life combined with beauty have always drawn Iranians to gems and jewelry, and Iranians are famous in the world for their long history of collecting and recognizing precious stones; the unique treasure of the national jewelry is a case in point. Raman spectroscopy is a vibrational spectroscopy method classified among the non-destructive study methods; like other methods, it has several advantages as well as disadvantages and problems. Micro-Raman spectroscopy is a variant of Raman spectroscopy in which an optical microscope is combined with the Raman instrument to provide more capabilities and advantages than the original method. In this way, with the help of Raman spectroscopy and a light microscope, natural or artificial pigments can be identified in a small part of a historical sample while observing its details more closely. The energy-dispersive X-ray (EDX) technique in the electron microscope likewise relies on the interaction of the electron beam with matter, and the beams emitted from this interaction can be used to examine samples. In this article, in addition to introducing the micro-Raman spectroscopy method, studies were conducted on the structure of three stone samples from the historic citadel of Bam. Applying this method to precious and semi-precious stones requires little time and can provide complete information about the structure and composition of the samples. The results of the experiments and the gemology of the stones showed that the selected beads are agate and jasper, and that they can be placed in the chalcedony group.

Keywords: Bam citadel, precious stones, semi-precious stones, Raman spectroscopy, scanning electron microscope.

7768 Statistical Analysis of Interferon-γ for the Effectiveness of an Anti-Tuberculous Treatment

Authors: Shishen Xie, Yingda L. Xie

Abstract:

Tuberculosis (TB) is a potentially serious infectious disease that remains a health concern. The Interferon Gamma Release Assay (IGRA) is a blood test used to find out whether an individual is TB-positive or TB-negative. This study applies statistical analysis to the clinical interferon-gamma levels of seventy-three subjects diagnosed with pulmonary TB and undergoing anti-tuberculous treatment. Data analysis is performed to determine whether there is a significant decline in interferon-gamma levels for the subjects over a period of six months, and to infer whether the anti-tuberculous treatment is effective.

Keywords: Data analysis, interferon gamma release assay, statistical methods, tuberculosis infection.
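
A minimal sketch of testing for a decline with a paired t-test on simulated baseline and six-month IFN-γ values; the numbers are synthetic, not the clinical data of the seventy-three subjects.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Illustrative interferon-gamma levels (IU/mL) at baseline and after six
# months of treatment for the same subjects (paired design).
baseline = rng.normal(6.0, 1.5, size=73).clip(min=0.35)
month6 = (baseline - rng.normal(1.8, 1.0, size=73)).clip(min=0.0)

t_stat, p_value = stats.ttest_rel(baseline, month6)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3g}")
if p_value < 0.05 and np.mean(month6) < np.mean(baseline):
    print("significant decline in IFN-gamma, consistent with a treatment effect")
```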

7767 Massive Open Online Course about Content Language Integrated Learning: A Methodological Approach for Content Language Integrated Learning Teachers

Authors: M. Zezou

Abstract:

This paper focuses on the design of a Massive Open Online Course (MOOC) about Content Language Integrated Learning (CLIL) and, more specifically, on how teachers can use CLIL as an educational approach while also incorporating technology into their teaching. All four weeks of the MOOC are presented, and a step-by-step analysis of each lesson is offered. Additionally, the paper includes detailed lesson plans for CLIL lessons, with proposed CLIL activities and games in which technology plays a central part. The MOOC is structured according to certain criteria in order to ensure its success, as well as a positive experience for the learners who complete it. It is addressed to all language teachers who would like to implement CLIL in their teaching; in other words, it presents the methodology that needs to be followed in order to successfully carry out a CLIL lesson and achieve the learning objectives set at the beginning of the course. The paper first gives the definitions of MOOCs and LMOOCs, explores the difference between a structure-based MOOC (xMOOC) and a connectivist MOOC (cMOOC), and presents the criteria of a successful MOOC. Moreover, the notion of CLIL is explored, as it is necessary to fully understand this concept before moving on to the design of the MOOC. The four weeks of the MOOC are then introduced and lesson plans are presented: the type of activities, the aims of each activity, and the methodology that teachers have to follow. Emphasis is placed on the role of technology in foreign language learning and on the ways in which technology can be involved in teaching a foreign language. Final remarks and a summary of the main points are offered at the end.

Keywords: Content language integrated learning, connectivist massive open online course, lesson plan, language MOOC, massive open online course criteria, massive open online course, technology, structure-based massive open online course.

7766 Fault Detection of Drinking Water Treatment Process Using PCA and Hotelling's T2 Chart

Authors: Joval P George, Dr. Zheng Chen, Philip Shaw

Abstract:

This paper deals with the application of Principal Component Analysis (PCA) and the Hotelling's T2 chart to data collected from a drinking water treatment process. PCA is applied primarily for the dimensional reduction of the collected data, while the Hotelling's T2 control chart is used for fault detection in the process. The data were taken from a United Utilities multistage water treatment works and downloaded from an Integrated Program Management (IPM) dashboard system. The analysis of the results shows that Multivariate Statistical Process Control (MSPC) techniques such as PCA, and control charts such as Hotelling's T2, can be effectively applied for the early fault detection of continuous multivariable processes such as drinking water treatment. The software package SIMCA-P was used to develop the MSPC models and the Hotelling's T2 chart from the collected data.

Keywords: Principal component analysis, Hotelling's T2 chart, multivariate statistical process control, drinking water treatment.
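
A minimal sketch of the PCA plus Hotelling's T2 workflow (the study used SIMCA-P; the sketch below uses scikit-learn and SciPy instead), with simulated in-control and faulty data standing in for the treatment-works measurements.

```python
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

def hotelling_t2(train, test, n_components=2, alpha=0.01):
    """Fit a PCA model on in-control data and compute Hotelling's T2
    (sum of squared scores scaled by component variances) with an
    F-distribution control limit."""
    pca = PCA(n_components=n_components).fit(train)
    scores = pca.transform(test)
    t2 = np.sum(scores ** 2 / pca.explained_variance_, axis=1)
    n, a = train.shape[0], n_components
    ucl = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(1 - alpha, a, n - a)
    return t2, ucl

rng = np.random.default_rng(2)
normal_ops = rng.normal(0, 1, size=(200, 6))                  # illustrative sensors
faulty = rng.normal(0, 1, size=(20, 6)) + np.array([0, 3, 0, 0, 2, 0])
t2, ucl = hotelling_t2(normal_ops, faulty)
print(f"UCL = {ucl:.2f}, faults flagged: {(t2 > ucl).sum()} of {len(t2)}")
```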

7765 Replicating Data Objects in Large-scale Distributed Computing Systems using Extended Vickrey Auction

Authors: Samee Ullah Khan, Ishfaq Ahmad

Abstract:

This paper proposes a novel game-theoretical technique to address the problem of data object replication in large-scale distributed computing systems. The proposed technique draws inspiration from computational economic theory and employs the extended Vickrey auction. Specifically, players in a non-cooperative environment compete for scarce server-side memory space in which to replicate data objects so as to minimize the total network object transfer cost while maintaining object concurrency. Optimization of this cost in turn leads to load balancing, fault tolerance and reduced user access time. The method is experimentally evaluated against four well-known techniques from the literature: branch and bound, greedy, bin-packing and genetic algorithms. The experimental results reveal that the proposed approach outperforms the four techniques in both execution time and solution quality.

Keywords: Auctions, data replication, pricing, static allocation.
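
A minimal sketch of the second-price (Vickrey) mechanism at the heart of the approach: the highest-bidding server wins the replica but pays the second-highest bid, which makes truthful bidding a dominant strategy. The bid values are invented, and the extended, multi-object aspects of the paper are not modelled.

```python
def vickrey_winner(bids):
    """Sealed-bid second-price (Vickrey) auction: the highest bidder wins
    but pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

# Illustrative valuations: each server bids its saving in object transfer
# cost if the replica of a data object were placed in its local memory.
bids = {"server_A": 14.0, "server_B": 9.5, "server_C": 12.2}
print(vickrey_winner(bids))   # ('server_A', 12.2)
```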

7764 Fast Fourier Transform-Based Steganalysis of Covert Communications over Streaming Media

Authors: Jinghui Peng, Shanyu Tang, Jia Li

Abstract:

Steganalysis seeks to detect the presence of secret data embedded in cover objects, and there is a pressing demand to detect hidden messages in streaming media. This paper shows how a steganalysis algorithm based on the Fast Fourier Transform (FFT) can be used to detect the existence of secret data embedded in streaming media. The proposed algorithm uses machine parameter characteristics and a network sniffer to determine whether the Internet traffic contains streaming channels. The detected streaming data are then transferred from the time domain to the frequency domain through the FFT. The distributions of power spectra in the frequency domain for original VoIP streams and stego VoIP streams are compared in turn using a t-test, yielding a p-value of 7.5686E-176, which is below the threshold. The results indicate that the proposed FFT-based steganalysis algorithm is effective in detecting secret data embedded in VoIP streaming media.

Keywords: Steganalysis, security, fast Fourier transform, streaming media.
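
A minimal sketch of the frequency-domain comparison: per-frame spectral power of cover and stego populations compared with a t-test. The frames and embedding distortion are simulated, not VoIP captures, and the sniffer/channel-detection stage is omitted.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Illustrative stand-ins for framed VoIP payloads: a cover stream and a
# stego stream carrying a small additive embedding distortion.
cover = rng.normal(0, 1, size=(400, 256))
stego = cover + rng.normal(0, 0.3, size=cover.shape)

def spectral_power(frames):
    """Per-frame total power in the frequency domain (via the real FFT)."""
    return (np.abs(np.fft.rfft(frames, axis=1)) ** 2).sum(axis=1)

t_stat, p_value = stats.ttest_ind(spectral_power(cover), spectral_power(stego))
print(f"t = {t_stat:.2f}, p = {p_value:.3g}  (a small p-value flags hidden data)")
```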

7763 Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks

Authors: Naveed Ghani, Samreen Javed

Abstract:

In today's heterogeneous network environment, there is a growing demand for mutually distrusting clients to jointly run a secure network and prevent malicious attacks, since the defining task of propagating malicious code is to locate new targets to attack. Residual risk always remains, no matter what solutions are implemented or which security methodologies or standards are adopted. Security is a first and crucial concern in computer science, and the main aim of computer security is the safe gathering and transmission of information over a secure network. No one need wonder what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. With the help of our survey, we learn about the importance of whitelisting, anti-malware programs, security patches, log files, honeypots and other measures used in banks for financial data protection. There is also a need to implement IPv6 tunneling with cryptographic data transformation, in line with the requirements of new technology, to protect organizations from new malware attacks that craft their own messages and send them to the target. In this paper, the writers present the idea of implementing IPv6 tunneling sessions for private data transmission from financial organizations whose secrecy needs to be safeguarded.

Keywords: Network worms, malware infection, propagating malicious code, virus, security, VPN.

7762 A Prediction-Based Reversible Watermarking for MRI Images

Authors: Nuha Omran Abokhdair, Azizah Bt Abdul Manaf

Abstract:

Reversible watermarking is a special branch of image watermarking that is able to recover the original image after the watermark has been extracted from it. In this paper, an adaptive prediction-based reversible watermarking scheme is presented in order to increase the payload capacity of MRI medical images. The scheme divides the image into two parts, the Region of Interest (ROI) and the Region of Non-Interest (RONI). Two bits are embedded in each embeddable pixel of the RONI and one bit is embedded in each embeddable pixel of the ROI. The experimental results demonstrate that the proposed scheme is able to achieve high embedding capacity, mainly for two reasons. First, pixels that would otherwise be excluded from data embedding due to overflow/underflow are used for data embedding. Second, the large location map that would need to be added to the watermark data as overhead is eliminated, so a loss of embedding capacity is prevented. Moreover, the scheme provides good visual quality in the watermarked image.

Keywords: Medical image watermarking, reversible watermarking, Difference Expansion, Prediction-Error Expansion.
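
A minimal sketch of prediction-error expansion on a single pixel; a two-neighbour average predictor is assumed, and overflow/underflow handling as well as the ROI/RONI split described above are omitted.

```python
def pee_embed(pixel, left, top, bit):
    """Prediction-error expansion: predict the pixel from its neighbours,
    expand the error e -> 2e + b and rebuild the pixel. Reversible as long
    as the result stays inside [0, 255] (overflow handling omitted here)."""
    pred = (left + top) // 2
    err = pixel - pred
    return pred + 2 * err + bit

def pee_extract(marked, left, top):
    pred = (left + top) // 2
    expanded = marked - pred
    bit = expanded & 1
    original = pred + (expanded - bit) // 2
    return bit, original

p, left, top = 120, 118, 123
marked = pee_embed(p, left, top, bit=1)
print(marked, pee_extract(marked, left, top))   # -> 121 (1, 120)
```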

7761 Using TRACE and SNAP Codes to Establish the Model of Maanshan PWR for SBO Accident

Authors: B. R. Shen, J. R. Wang, J. H. Yang, S. W. Chen, C. Shih, Y. Chiang, Y. F. Chang, Y. H. Huang

Abstract:

In this research, the TRACE code with the interface code SNAP was used to simulate and analyze a station blackout (SBO) accident in the Maanshan pressurized water reactor (PWR) nuclear power plant (NPP). There are four main steps in this research. First, the SBO accident data of the Maanshan NPP were collected. Second, the TRACE/SNAP model of the Maanshan NPP was established using these data. Third, this TRACE/SNAP model was used to perform the simulation and analysis of the SBO accident. Finally, the simulation and analysis of the SBO with mitigation equipment was performed. The analysis results of TRACE are consistent with the data of the Maanshan NPP, and according to the TRACE predictions the mitigation equipment can maintain the safety of Maanshan during an SBO.

Keywords: PWR, TRACE, SBO, Maanshan.

7760 Salbutamol Sulphate-Ethylcellulose Tabletted Microcapsules: Pharmacokinetic Study using Convolution Approach

Authors: Ghulam Murtaza, Kalsoom Farzana

Abstract:

The aim of this article is to demonstrate the utility of a novel simulation approach, the convolution method, for predicting the blood concentration of a drug from the dissolution data of salbutamol sulphate microparticulate formulations with different release patterns (1:1, 1:2 and 1:3, drug:polymer). USP (2007) dissolution apparatus II with 900 ml of double-distilled water stirred at 50 rpm was employed for the dissolution analysis. From the dissolution data, the blood drug concentration was predicted, and the predicted blood drug concentration data were in turn used to calculate the pharmacokinetic parameters Cmax, Tmax and AUC. Convolution is a good biowaiver technique; however, its utility is best realized when it is applied under conditions in which biorelevant dissolution media are used.

Keywords: Convolution, Dissolution, Pharmacokinetics, Salbutamol sulphate
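
A minimal sketch of the convolution step, assuming a one-compartment disposition model and a first-order stand-in for the dissolution profile; all parameter values are illustrative, not the salbutamol sulphate data from the study.

```python
import numpy as np

# Illustrative one-compartment parameters (not from the study):
dose_mg, ke, vd_l = 4.0, 0.23, 150.0           # dose, elimination rate (1/h), Vd
t = np.arange(0, 24.25, 0.25)                   # time grid, h

# Cumulative fraction dissolved from an in vitro profile (first-order stand-in)
frac_dissolved = 1.0 - np.exp(-0.8 * t)
input_rate = dose_mg * np.gradient(frac_dissolved, t)      # mg/h entering blood

# Unit impulse response of the disposition model: c_delta(t) = exp(-ke*t) / Vd
uir = np.exp(-ke * t) / vd_l

# Convolution of the input rate with the unit impulse response -> predicted C(t)
conc = np.convolve(input_rate, uir)[: len(t)] * (t[1] - t[0])   # mg/L

cmax, tmax = conc.max(), t[conc.argmax()]
auc = float(np.sum((conc[1:] + conc[:-1]) / 2 * np.diff(t)))    # trapezoidal AUC
print(f"Cmax = {cmax * 1000:.1f} ug/L at Tmax = {tmax:.2f} h, "
      f"AUC = {auc * 1000:.1f} ug*h/L")
```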

7759 High Performance in Parallel Data Integration: An Empirical Evaluation of the Ratio Between Processing Time and Number of Physical Nodes

Authors: Caspar von Seckendorff, Eldar Sultanow

Abstract:

Many studies have shown that parallelization decreases efficiency [1], [2], and there are many reasons for these decrements. This paper investigates those which appear in the context of parallel data integration. Integration processes generally cannot be allocated to packages of identical size (i.e. tasks of identical complexity); the reason for this is unknown, heterogeneous input data, which results in variable task lengths. Process delay is defined by the slowest processing node and has a detrimental effect on the total processing time. With a real-world example, this study shows that while process delay initially increases with the introduction of more nodes, it ultimately decreases again after a certain point. The example makes use of the cloud computing platform Hadoop and is run inside Amazon's EC2 compute cloud. A stochastic model is set up which can explain this effect.

Keywords: Process delay, speedup, efficiency, parallel computing, data integration, E-Commerce, Amazon Elastic Compute Cloud (EC2), Hadoop, Nutch.
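
A minimal sketch of the effect described above: heterogeneous task lengths are assigned to a growing number of nodes, the slowest node defines the makespan, and parallel efficiency falls as nodes are added. The task-length distribution and assignment rule are assumptions, not the paper's Hadoop workload.

```python
import random

def makespan(task_lengths, nodes):
    """Greedy longest-processing-time assignment of heterogeneous tasks to
    `nodes` workers; the slowest node defines the total processing time."""
    loads = [0.0] * nodes
    for length in sorted(task_lengths, reverse=True):
        loads[loads.index(min(loads))] += length
    return max(loads)

random.seed(42)
tasks = [random.expovariate(1.0) for _ in range(500)]    # variable task lengths
serial = sum(tasks)
for n in (1, 2, 4, 8, 16, 32):
    t = makespan(tasks, n)
    print(f"{n:2d} nodes: time={t:7.1f}, speedup={serial / t:5.2f}, "
          f"efficiency={serial / (t * n):.2f}")
```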
