Search results for: malware information sharing platform
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4540

3820 A Novel Steganographic Method for Gray-Level Images

Authors: Ahmad T. Al-Taani, Abdullah M. AL-Issa

Abstract:

In this work, we propose a novel steganographic method for hiding information within the spatial domain of a grayscale image. The proposed approach divides the cover image into blocks of equal size and then embeds the message in the edges of each block, depending on the number of ones in the left four bits of the pixel. The approach is tested on a database consisting of 100 different images. Experimental results, compared with other methods, show that the proposed approach hides a larger amount of information and produces stego-images of good visual quality as judged by the human eye.
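As an illustration of this kind of embedding rule, the following is a minimal sketch (not the authors' exact scheme) of pixel-wise LSB embedding in which the per-pixel capacity is derived from the count of ones in the pixel's upper four bits; the capacity rule and helper names are assumptions, and the block division and edge-selection steps are omitted.

```python
import numpy as np

def message_bits(data: bytes):
    """Yield the message as individual bits, most significant bit first."""
    for byte in data:
        for k in range(7, -1, -1):
            yield (byte >> k) & 1

def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    """Hide `message` in the LSBs of a grayscale cover image.

    The per-pixel capacity (1 or 2 bits) is chosen from the count of ones
    in the pixel's upper four bits -- an illustrative rule only.
    """
    stego = cover.copy()
    bits = list(message_bits(message))
    pos = 0
    for idx, value in np.ndenumerate(stego):
        if pos >= len(bits):
            break
        ones = bin(int(value) >> 4).count("1")   # ones in the left four bits
        n = 2 if ones >= 2 else 1                # assumed capacity rule
        chunk = bits[pos:pos + n]
        pos += len(chunk)
        payload = 0
        for b in chunk:
            payload = (payload << 1) | b
        mask = (1 << len(chunk)) - 1
        stego[idx] = (int(value) & ~mask) | payload
    return stego

# Example: embed a short message into a synthetic 8-bit cover image.
cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
stego = embed_lsb(cover, b"hello")
```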

Keywords: Data Embedding, Cryptography, Watermarking, Steganography, Least Significant Bit, Information Hiding.

3819 Availability Strategy of Medical Information for Telemedicine Services

Authors: Rozo D. Juan Felipe, Ramírez L. Leonardo Juan, Puerta A. Gabriel Alberto

Abstract:

Telemedicine services require correct computing resource management to guarantee productivity and efficiency for medical and non-medical staff. The aim of this study was to examine web management strategies to ensure the availability of resources and services in telemedicine, so as to provide an accessible strategy for medical information management. In addition, to evaluate the quality-of-service parameters, the following were measured: delays, throughput, jitter, latency, available bandwidth, and the percentage of access and denial of services, based on a web management performance map with profile permissions and database management. Over 24 different test scenarios, the results show 100% availability of medical information, in relation to the access of medical staff to web services, and a quality of service (QoS) of 99%, owing to network delay and the performance of the computer network. The findings of this study suggest that the proposed web management strategy is an ideal solution to guarantee the availability, reliability, and accessibility of medical information. Finally, this strategy offers seven user profiles used at the telemedicine center of Bogotá, Colombia, keeping QoS parameters suitable for telemedicine services.

Keywords: Availability, medical information, QoS, strategy, telemedicine.

3818 Multi-Criteria Spatial Analysis for the Localization of Production Structures. Analytic Hierarchy Process and Geographical Information Systems in the Case of Expanding an Industrial Area

Authors: Gianluigi De Mare, Pierluigi Morano, Antonio Nesticò

Abstract:

Among the numerous economic evaluation techniques currently available, Multi-criteria Spatial Analysis lends itself to solving localization problems of property complexes and, in particular, production plants. The methodology involves the use of Geographical Information Systems (GIS) and the mapping overlay technique, which superimposes the different information layers of a territory in order to obtain an overview of the parameters that characterize it. This first phase is used to detect possible settlement surfaces for a new agglomeration, from which the best alternative is subsequently selected through the Analytic Hierarchy Process (AHP). The result ensures the synthesis of a multidimensional profile that expresses both the quantitative and qualitative effects. Each criterion can be given a different weight.
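As a concrete illustration of the AHP step, the sketch below derives criterion weights from a pairwise comparison matrix via the principal eigenvector and checks consistency; the three criteria and the judgment values are invented for the example.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray):
    """Return the AHP priority vector and consistency ratio for a
    reciprocal pairwise comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                      # normalised priority vector
    n = pairwise.shape[0]
    lam_max = eigvals.real[k]
    ci = (lam_max - n) / (n - 1)         # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    return w, ci / ri                    # weights, consistency ratio

# Hypothetical judgments for three siting criteria:
# accessibility vs. land cost vs. environmental constraints.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights, cr = ahp_weights(A)
print(weights, cr)   # CR < 0.10 indicates acceptable consistency
```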

Keywords: Multi-criteria Spatial Analysis, Analytic Hierarchy Process, Geographical Information Systems, localization of industrial areas.

3817 Network Intrusion Detection Design Using Feature Selection of Soft Computing Paradigms

Authors: T. S. Chou, K. K. Yen, J. Luo

Abstract:

The network traffic data provided for the design of intrusion detection are always large, contain much ineffective information, and enclose only limited and ambiguous information about users' activities. We study these problems and propose a two-phase approach in our intrusion detection design. In the first phase, we develop a correlation-based feature selection algorithm to remove the worthless information from the original high-dimensional database. Next, we design an intrusion detection method to solve the problems of uncertainty caused by limited and ambiguous information. In the experiments, we choose six UCI databases and the DARPA KDD99 intrusion detection data set as our evaluation tools. Empirical studies indicate that our feature selection algorithm is capable of reducing the size of the data set. Our intrusion detection method achieves better performance than that of the participating intrusion detectors.
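The abstract does not spell out the filter, so the following is only a plausible correlation-based selection sketch: rank features by absolute correlation with the class label and drop features that are strongly correlated with an already selected one. The redundancy threshold and the toy data are assumptions.

```python
import numpy as np

def correlation_feature_selection(X: np.ndarray, y: np.ndarray,
                                  redundancy_threshold: float = 0.9):
    """Greedy correlation-based filter: keep features that correlate with the
    label and discard features strongly correlated with an already kept one."""
    n_features = X.shape[1]
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                          for j in range(n_features)])
    order = np.argsort(-relevance)            # most relevant first
    selected = []
    for j in order:
        redundant = any(
            abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > redundancy_threshold
            for k in selected
        )
        if not redundant:
            selected.append(j)
    return selected

# Toy example: feature 1 duplicates feature 0 and should be dropped.
rng = np.random.default_rng(0)
x0 = rng.normal(size=200)
X = np.column_stack([x0, x0 + 0.01 * rng.normal(size=200), rng.normal(size=200)])
y = (x0 > 0).astype(float)
print(correlation_feature_selection(X, y))
```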

Keywords: Intrusion detection, feature selection, k-nearest neighbors, fuzzy clustering, Dempster-Shafer theory

3816 The Effectiveness of Banks’ Web Sites: A Study of Turkish Banking Sector

Authors: Raif Parlakkaya, Huseyin Cetin, Duygu Irdiren

Abstract:

With the development of the World Wide Web, Internet usage has grown rapidly around the globe and has provided a basis for the emergence of electronic business. Like other sectors, the banking sector has adopted the use of the Internet along with developments in information and communication technologies. Due to the public disclosure and transparency principle of corporate governance, the importance of the information banks disclose on their web sites has increased significantly. For the purpose of this study, a Bank Disclosure Attribute Index (BDAI) for Turkey has been constructed by classifying the information disclosed on banks' web sites into general, financial, investor, and corporate governance attributes. All 47 banks in the Turkish banking system have been evaluated according to the index with the aim of providing a comparison between banks. Using the Chi-square test, Pearson correlation, t-test, and ANOVA as statistical tools, it has been concluded that the majority of banks in Turkey share information on their web sites adequately with respect to their total index score. Although there is a positive correlation between the various types of information on banks' web sites, there is no uniformity among them. Also, no significant difference between the various types of information disclosure and bank types has been observed. Compared with the total index score averages of the five largest banks in Turkey, there are some banks that need to improve the content of their web sites.

Keywords: Banking sector, public disclosure, Turkey, web site evaluation.

3815 Model based Soft-Sensor for Industrial Crystallization: On-line Mass of Crystals and Solubility Measurement

Authors: Cédric Damour, Michel Benne, Brigitte Grondin-Perez, Jean-Pierre Chabriat

Abstract:

Monitoring and control of cane sugar crystallization processes depend on the stability of the supersaturation (σ) state. The most widely used information to represent σ is the electrical conductivity κ of the solution. Nevertheless, previous studies point out the shortcomings of this approach: κ may be regarded as inappropriate to guarantee an accurate estimation of σ in impure solutions. To improve process control efficiency, additional information is necessary. The mass of crystals in the solution (m_c) and the solubility (the mass ratio of sugar to water, m_s/m_w) are relevant complementary information. Indeed, m_c inherently contains information about the mass balance, and m_s/m_w contains information about the supersaturation state of the solution. The main problem is that m_c and m_s/m_w are not available on-line. In this paper, a model-based soft-sensor is presented for a final crystallization stage (C sugar). Simulation results obtained on industrial data show the reliability of this approach, m_c and the crystal content (cc) being estimated with sufficient accuracy for on-line monitoring in industry.

Keywords: Soft-sensor, on-line monitoring, cane sugar crystallization.

3814 The Development of a Narrative Management System: Storytelling in Knowledge Management

Authors: Savita K. S., Hazwani H., Kalid K. S.

Abstract:

This paper presents a narrative management system for organizations to capture an organization's tacit knowledge through stories. The intention of capturing tacit knowledge is to address the problem that comes with the mobility of the workforce in an organization. Storytelling, in the knowledge management context, is seen as a powerful management tool to communicate tacit knowledge in an organization. This narrative management system is developed, firstly, to enable the uploading of many types of knowledge-sharing stories, from general to work-specific stories, and secondly, to provide comment functionality on each video, so that knowledge users can post comments to other knowledge users. The narrative management system allows users to browse, search, and view the stories. In the system, stories are stored in a video repository. Stories produced from this framework will improve learning, facilitate knowledge transfer, and improve tacit knowledge quality in an organization.

Keywords: Knowledge Management, Storytelling, Stories, Tacit Knowledge

3813 Segmentation of Breast Lesions in Ultrasound Images Using Spatial Fuzzy Clustering and Structure Tensors

Authors: Yan Xu, Toshihiro Nishimura

Abstract:

Segmentation in ultrasound images is challenging due to interference from speckle noise and the fuzziness of boundaries. In this paper, a segmentation scheme using fuzzy c-means (FCM) clustering that incorporates both the intensity and the texture information of images is proposed to extract breast lesions in ultrasound images. Firstly, the nonlinear structure tensor, which helps refine the edges detected from intensity, is used to extract speckle texture. Then, spatial FCM clustering is applied on the image feature space for segmentation. In experiments with simulated and clinical ultrasound images, spatial FCM clustering with both intensity and texture information gives more accurate results than conventional FCM or spatial FCM without texture information.
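The spatial, texture-aware variant is specific to the paper, but the fuzzy c-means update it builds on is standard; a compact sketch on a generic feature space (Euclidean distance and fuzzifier m = 2 assumed) is shown below.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Standard fuzzy c-means on feature vectors X of shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)            # fuzzy membership matrix
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        new_U = 1.0 / (dist ** (2 / (m - 1)))    # u_ik proportional to d_ik^(-2/(m-1))
        new_U /= new_U.sum(axis=1, keepdims=True)
        if np.abs(new_U - U).max() < tol:
            U = new_U
            break
        U = new_U
    return centers, U

# Toy example on 1-D "intensity" features drawn from two populations.
X = np.concatenate([np.random.normal(0.2, 0.05, 300),
                    np.random.normal(0.8, 0.05, 300)]).reshape(-1, 1)
centers, memberships = fuzzy_c_means(X, n_clusters=2)
```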

Keywords: fuzzy c-means, spatial information, structure tensor, ultrasound image segmentation

3812 On the Performance of Information Criteria in Latent Segment Models

Authors: Jaime R. S. Fonseca

Abstract:

Despite the widespread application of finite mixture models in segmentation, finite mixture model selection is still an important issue. In fact, the selection of an adequate number of segments is a key issue in deriving latent segment structures, and it is desirable that the selection criteria used for this purpose are effective. In order to select among several information criteria which may support the selection of the correct number of segments, we conduct a simulation study. In particular, this study is intended to determine which information criteria are more appropriate for mixture model selection when considering data sets with only categorical segmentation base variables. The generation of mixtures of multinomial data supports the proposed analysis. As a result, we establish a relationship between the level of measurement of segmentation variables and the performance of some (eleven) information criteria. The criterion AIC3 shows the best performance (it indicates the correct number of segments of the simulated structure more often) for mixtures of multinomial segmentation base variables.
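For reference, the information criteria most often compared in such studies take the forms below; the log-likelihood value, parameter count, and sample size in the example are invented.

```python
import math

def information_criteria(log_likelihood: float, n_params: int, n_obs: int):
    """AIC, AIC3 and BIC for a fitted mixture model.

    AIC  = -2 logL + 2k
    AIC3 = -2 logL + 3k      (the variant favoured in the abstract)
    BIC  = -2 logL + k ln(n)
    """
    aic = -2 * log_likelihood + 2 * n_params
    aic3 = -2 * log_likelihood + 3 * n_params
    bic = -2 * log_likelihood + n_params * math.log(n_obs)
    return {"AIC": aic, "AIC3": aic3, "BIC": bic}

# Hypothetical fit: a 3-segment multinomial mixture with logL = -1520.4,
# k = 17 free parameters, and n = 500 observations.
print(information_criteria(-1520.4, 17, 500))
```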

Keywords: Quantitative Methods, Multivariate Data Analysis, Clustering, Finite Mixture Models, Information Theoretical Criteria, Simulation experiments.

3811 Information Retrieval in Domain Specific Search Engine with Machine Learning Approaches

Authors: Shilpy Sharma

Abstract:

As the web continues to grow exponentially, the idea of crawling the entire web on a regular basis becomes less and less feasible, so domain-specific search engines, which include information on a specific domain, were proposed. As more information becomes available on the World Wide Web, it becomes more difficult to provide effective search tools for information access. Today, people access web information through two main kinds of search interfaces: browsers (clicking and following hyperlinks) and query engines (queries in the form of a set of keywords indicating the topic of interest) [2]. Better support is needed by web search tools for expressing one's information need and returning high-quality search results. There appears to be a need for systems that reason under uncertainty and are flexible enough to recover from the contradictions, inconsistencies, and irregularities that such reasoning involves. In a multi-view problem, the features of the domain can be partitioned into disjoint subsets (views) that are sufficient to learn the target concept. Semi-supervised, multi-view algorithms, which reduce the amount of labeled data required for learning, rely on the assumptions that the views are compatible and uncorrelated. This paper describes the use of a semi-supervised machine learning approach with active learning for domain-specific search engines. A domain-specific search engine is "an information access system that allows access to all the information on the web that is relevant to a particular domain." The proposed work shows that, with the help of this approach, relevant data can be extracted with a minimum of queries fired by the user. It requires a small amount of labeled data and a pool of unlabelled data on which the learning algorithm is applied to extract the required data.

Keywords: Search engines, machine learning, information retrieval, active logic.

3810 Alive Cemeteries with Augmented Reality and Semantic Web Technologies

Authors: Tamás Matuszka, Attila Kiss

Abstract:

Due to the proliferation of smartphones in everyday use, several different outdoor navigation systems have become available. Since these smartphones are able to connect to the Internet, the users can obtain location-based information during navigation as well. The users can interactively get to know the specifics of a particular area (for instance, an ancient cultural area, a statue park, or a cemetery) with the help of the information thus obtained. In this paper, we present an Augmented Reality system which uses Semantic Web technologies and is based on the interaction between the user and the smartphone. The system allows navigating through a specific area and provides information and details about the sights in an interactive manner.

Keywords: Augmented Reality, Semantic Web, Human Computer Interaction, Mobile Application.

3809 Learning and Evaluating Possibilistic Decision Trees using Information Affinity

Authors: Ilyes Jenhani, Salem Benferhat, Zied Elouedi

Abstract:

This paper investigates the issue of building decision trees from data with imprecise class values, where imprecision is encoded in the form of possibility distributions. The Information Affinity similarity measure is introduced into the well-known gain ratio criterion in order to assess the homogeneity of a set of possibility distributions representing the classes of instances belonging to a given training partition. For the experimental study, we propose an information-affinity-based performance criterion, which we use in order to show the performance of the approach on well-known benchmarks.

Keywords: Data mining from uncertain data, Decision Trees, Possibility Theory.

3808 MIM: A Species Independent Approach for Classifying Coding and Non-Coding DNA Sequences in Bacterial and Archaeal Genomes

Authors: Achraf El Allali, John R. Rose

Abstract:

A number of competing methodologies have been developed to identify genes and classify DNA sequences into coding and non-coding sequences. This classification process is fundamental in gene finding and gene annotation tools and is one of the most challenging tasks in bioinformatics and computational biology. An information theory measure based on mutual information has shown good accuracy in classifying DNA sequences into coding and noncoding. In this paper we describe a species independent iterative approach that distinguishes coding from non-coding sequences using the mutual information measure (MIM). A set of sixty prokaryotes is used to extract universal training data. To facilitate comparisons with the published results of other researchers, a test set of 51 bacterial and archaeal genomes was used to evaluate MIM. These results demonstrate that MIM produces superior results while remaining species independent.
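A minimal illustration of the underlying quantity (not the paper's full MIM classifier): the mutual information between adjacent nucleotides, estimated from a sequence's dinucleotide frequencies.

```python
import math
from collections import Counter

def adjacent_mutual_information(seq: str) -> float:
    """Mutual information (in bits) between a nucleotide and its successor,
    estimated from the dinucleotide frequencies of `seq`."""
    seq = seq.upper()
    pairs = [seq[i:i + 2] for i in range(len(seq) - 1)]
    pair_counts = Counter(pairs)
    first_counts = Counter(p[0] for p in pairs)
    second_counts = Counter(p[1] for p in pairs)
    n = len(pairs)
    mi = 0.0
    for pair, c in pair_counts.items():
        p_xy = c / n
        p_x = first_counts[pair[0]] / n
        p_y = second_counts[pair[1]] / n
        mi += p_xy * math.log2(p_xy / (p_x * p_y))
    return mi

# Coding regions tend to show a different dependency structure than random DNA.
print(adjacent_mutual_information("ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"))
```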

Keywords: Coding Non-coding Classification, Entropy, Gene Recognition, Mutual Information.

3807 Accounting Information Systems of Kuwaiti Companies: Obstacles and Barriers

Authors: Haya Y Alobaid

Abstract:

The aim of this paper is to identify and discuss the obstacles to the ability of the accounting information systems of Kuwaiti companies to deal with electronic commerce, and then to propose appropriate solutions to overcome those barriers. The study revealed a remarkable decrease in the number of external auditors who hold professional certification. The results also showed agreement regarding the accounting systems and their ability to deal with e-commerce, with differing degrees of importance, despite the presence of obstacles to the ability of accounting systems to deal with e-commerce in different companies.

Keywords: Accounting information systems, obstacle, barriers, electronic commerce, Kuwait companies.

3806 A Survey of Job Scheduling and Resource Management in Grid Computing

Authors: Raksha Sharma, Vishnu Kant Soni, Manoj Kumar Mishra, Prachet Bhuyan

Abstract:

Grid computing is a form of distributed computing that involves coordinating and sharing computational power, data storage, and network resources across dynamic and geographically dispersed organizations. Scheduling onto the Grid is NP-complete, so there is no single best scheduling algorithm for all grid computing systems. An alternative is to select an appropriate scheduling algorithm for a given grid environment based on the characteristics of the tasks, machines, and network connectivity. Job and resource scheduling is one of the key research areas in grid computing. The goal of scheduling is to achieve the highest possible system throughput and to match the application needs with the available computing resources. The motivation of this survey is to encourage novice researchers in the field of grid computing, so that they can easily understand the concept of scheduling and contribute to developing more efficient scheduling algorithms. This will benefit interested researchers in carrying out further work in this thrust area of research.

Keywords: Grid Computing, Job Scheduling, Resource Scheduling.

3805 Improved Simultaneous Performance in the Time Domain and in the Frequency Domain

Authors: Azeddine Ghodbane, David Bensoussan, Maher Hammami

Abstract:

In this study, we introduce an alternative adaptive architecture that enhances both time-domain and frequency-domain performance, helping to mitigate the effects of disturbances at the plant input and of external disturbances affecting the output. To facilitate superior performance in both the time and frequency domains, we have developed user-friendly interactive design methods using the GeoGebra platform.

Keywords: Control theory, decentralized control, sensitivity theory, input-output stability theory, robust multivariable feedback control design.

3804 A Weighted-Profiling Using an Ontology Base for Semantic-Based Search

Authors: Hikmat A. M. Abd-El-Jaber, Tengku M. T. Sembok

Abstract:

The information on the Web is increasing tremendously. A number of search engines have been developed for searching Web information and retrieving relevant documents that satisfy inquirers' needs. Search engines deliver irrelevant documents among the search results, since the search is text-based rather than semantic-based. The information retrieval research area has presented a number of approaches and methodologies, such as profiling, feedback, query modification, and human-computer interaction, for improving search results. Moreover, information retrieval has employed artificial intelligence techniques and strategies, such as machine learning heuristics, tuning mechanisms, user and system vocabularies, and logical theory, for capturing users' preferences and using them to guide the search based on semantic rather than syntactic analysis. Although valuable improvements in search results have been recorded, surveys show that search engine users are still not really satisfied with their search results. Using ontologies for semantic-based searching is likely the key solution. Adopting a profiling approach and using ontology base characteristics, this work proposes a strategy for finding the exact meaning of the query terms in order to retrieve relevant information according to user needs. The evaluation of the conducted experiments shows the effectiveness of the suggested methodology, and a conclusion is presented.

Keywords: information retrieval, user profiles, semantic Web, ontology, search engine.

3803 A Soft Systems Methodology Perspective on Data Warehousing Education Improvement

Authors: R. Goede, E. Taylor

Abstract:

This paper demonstrates how the soft systems methodology can be used to improve the delivery of a module in data warehousing for fourth-year information technology students. Graduates in information technology need to have academic skills, but they also need good practical skills to meet the skills requirements of the information technology industry. In developing and improving current data warehousing education modules, one has to find a balance in meeting the expectations of various role players, such as the students themselves, industry, and academia. The soft systems methodology, developed by Peter Checkland, provides a methodology for facilitating problem understanding from different world views. In this paper, it is demonstrated how the soft systems methodology can be used to plan the improvement of data warehousing education for fourth-year information technology students.

Keywords: Data warehousing, education, soft systems methodology, stakeholders, systems thinking.

3802 Cooperative Data Caching in WSN

Authors: Narottam Chand

Abstract:

Wireless sensor networks (WSNs) have gained tremendous attention in recent years due to their numerous applications. Because of limited energy resources, energy-efficient operation of sensor nodes is a key issue in wireless sensor networks. Cooperative caching, which ensures the sharing of data among various nodes, reduces the number of communications over the wireless channels and thus enhances the overall lifetime of a wireless sensor network. In this paper, we propose a cooperative caching scheme called ZCS (Zone Cooperation at Sensors) for wireless sensor networks. In the ZCS scheme, the one-hop neighbors of a sensor node form a cooperative cache zone and share the cached data with each other. Simulation experiments show that the ZCS caching scheme achieves significant improvements in byte hit ratio and average query latency in comparison with other caching strategies.
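A toy sketch of the zone-cooperation idea (class names, cache policy, and lookup order are assumptions, not the ZCS specification): a query is served from the local cache first, then from any one-hop neighbour's cache, and only then from the data source.

```python
class SensorNode:
    """Sensor node with a small local cache and one-hop neighbours."""

    def __init__(self, node_id, cache_size=4):
        self.node_id = node_id
        self.cache = {}               # data_id -> value
        self.cache_size = cache_size
        self.neighbors = []           # one-hop zone members

    def cache_put(self, data_id, value):
        if len(self.cache) >= self.cache_size:
            self.cache.pop(next(iter(self.cache)))   # naive FIFO replacement
        self.cache[data_id] = value

    def query(self, data_id, source):
        """Local hit, then zone hit, then fetch from the data source."""
        if data_id in self.cache:
            return self.cache[data_id], "local hit"
        for nb in self.neighbors:
            if data_id in nb.cache:
                return nb.cache[data_id], f"zone hit at node {nb.node_id}"
        value = source[data_id]
        self.cache_put(data_id, value)
        return value, "miss (fetched from source)"

# Two neighbouring nodes sharing a cache zone.
source = {"temp/17": 23.5}
a, b = SensorNode("A"), SensorNode("B")
a.neighbors, b.neighbors = [b], [a]
print(a.query("temp/17", source))   # miss, fetched and cached at A
print(b.query("temp/17", source))   # zone hit at node A
```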

Keywords: Admission control, cache replacement, cooperative caching, WSN, zone cooperation

3801 Analysis of Secondary School Students’ Perceptions about Information Technologies through a Word Association Test

Authors: Fetah Eren, Ismail Sahin, Ismail Celik, Ahmet Oguz Akturk

Abstract:

The aim of this study is to discover secondary school students' perceptions related to information technologies and the connections between concepts in their cognitive structures. A word association test consisting of six concepts related to information technologies is used to collect data from 244 secondary school students. Concept maps that present the students' cognitive structures are drawn with the help of frequency data. The data are analyzed and interpreted according to the connections obtained as a result of the concept maps. It is determined that, of the given concepts, students associate most with computer, Internet, and communication, and least with computer-assisted education and information technologies. These results show that the concepts Internet, communication, and computer are an important part of students' cognitive structures. In addition, students mostly give computer, phone, game, Internet, and Facebook as the key concepts. These answers show that students regard information technologies as a means of entertainment and free-time activity, not as a means of education.

Keywords: Word association test, cognitive structure, information technology, secondary school.

3800 Advanced Travel Information System in Heterogeneous Networks

Authors: Hsu-Yung Cheng, Victor Gau, Chih-Wei Huang, Jenq-Neng Hwang, Chih-Chang Yu

Abstract:

In order to achieve better road utilization and traffic efficiency, there is an urgent need for a travel information delivery mechanism to assist the drivers in making better decisions in the emerging intelligent transportation system applications. In this paper, we propose a relayed multicast scheme under heterogeneous networks for this purpose. In the proposed system, travel information consisting of summarized traffic conditions, important events, real-time traffic videos, and local information service contents is formed into layers and multicasted through an integration of WiMAX infrastructure and Vehicular Ad hoc Networks (VANET). By the support of adaptive modulation and coding in WiMAX, the radio resources can be optimally allocated when performing multicast so as to dynamically adjust the number of data layers received by the users. In addition to multicast supported by WiMAX, a knowledge propagation and information relay scheme by VANET is designed. The experimental results validate the feasibility and effectiveness of the proposed scheme.

Keywords: Intelligent Transportation Systems, Relayed Multicast, WiMAX, Vehicular Ad hoc Networks (VANET).

3799 Modified Diffie-Hellman Protocol by Extending the Theory of the Congruence

Authors: Rand Alfaris, Mohamed Rushdan MD Said, Mohamed Othman, Fudziah Ismail

Abstract:

This paper introduces a modification to the Diffie-Hellman protocol to make it applicable to decimal numbers, which are the numbers between zero and one. For this purpose, we extend the theory of the congruence. The new congruence is over the set of the real numbers and is called the “real congruence” or the “real modulus”. We refer to the existing congruence as the “integer congruence” or the “integer modulus”. This extension defines new terms and redefines existing ones, and the properties and theorems of the integer modulus are extended as well. The modified Diffie-Hellman key exchange protocol produces a shared, secure, decimal secret key for cryptosystems that depend on decimal numbers.
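For orientation, the sketch below shows the standard integer Diffie-Hellman exchange that the paper modifies; the prime, generator, and private exponents are small illustrative values, and the decimal (real-modulus) extension itself is not reproduced here.

```python
import secrets

# Standard (integer) Diffie-Hellman over a small textbook prime.
# Real deployments use primes of 2048 bits or more.
p = 23   # illustrative prime modulus
g = 5    # illustrative generator

a = secrets.randbelow(p - 2) + 1        # Alice's private exponent
b = secrets.randbelow(p - 2) + 1        # Bob's private exponent

A = pow(g, a, p)                        # Alice sends A to Bob
B = pow(g, b, p)                        # Bob sends B to Alice

shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob       # both sides derive the same secret
```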

Keywords: Extended theory of the congruence, modified Diffie-Hellman protocol.

3798 Optimal Document Archiving and Fast Information Retrieval

Authors: Hazem M. El-Bakry, Ahmed A. Mohammed

Abstract:

In this paper, an intelligent algorithm for optimal document archiving is presented. It is known that electronic archives are very important for information system management. Minimizing the size of the data stored in an electronic archive is a main issue in reducing the physical storage area. Here, the effect of different types of Arabic fonts on electronic archive size is discussed. Simulation results show that PDF is the best file format for storing Arabic documents in an electronic archive. Furthermore, fast information detection in a given PDF file is introduced. This approach uses fast neural networks (FNNs) implemented in the frequency domain. The operation of these networks relies on performing cross-correlation in the frequency domain rather than in the spatial one. It is proved mathematically and practically that the number of computation steps required by the presented FNNs is less than that needed by conventional neural networks (CNNs). Simulation results using MATLAB confirm the theoretical computations.
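The speed-up claimed for the fast neural networks rests on the cross-correlation theorem: correlation in the spatial domain becomes a pointwise product in the frequency domain. A minimal one-dimensional demonstration (not the paper's network) is sketched below.

```python
import numpy as np

def cross_correlate_fft(signal: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Circular cross-correlation via the FFT:
    corr = IFFT( FFT(signal) * conj(FFT(zero-padded template)) )."""
    n = len(signal)
    padded = np.zeros(n)
    padded[:len(template)] = template
    return np.real(np.fft.ifft(np.fft.fft(signal) * np.conj(np.fft.fft(padded))))

# The peak of the correlation marks where the template occurs in the signal.
rng = np.random.default_rng(1)
signal = rng.normal(size=256)
template = signal[100:120].copy()
scores = cross_correlate_fft(signal, template)
print(int(np.argmax(scores)))   # expected: 100
```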

Keywords: Information Storage and Retrieval, Electronic Archiving, Fast Information Detection, Cross Correlation, Frequency Domain.

3797 Approaches to Developing Semantic Web Services

Authors: Jorge Cardoso

Abstract:

It has been recognized that, due to the autonomy and heterogeneity of Web services and of the Web itself, new approaches should be developed to describe and advertise Web services. The most notable approaches rely on the description of Web services using semantics. This new breed of Web services, termed semantic Web services, will enable the automatic annotation, advertisement, discovery, selection, composition, and execution of inter-organization business logic, making the Internet a common global platform where organizations and individuals communicate with each other to carry out various commercial activities and to provide value-added services. This paper deals with two of the hottest R&D and technology areas currently associated with the Web: Web services and the semantic Web. It describes how semantic Web services extend Web services in the same way that the semantic Web improves the current Web, and presents three different conceptual approaches to deploying semantic Web services, namely WSDL-S, OWL-S, and WSMO.

Keywords: Semantic Web, Web service, Web process, WWW

3796 Adopting Collaborative Business Processes to Prevent the Loss of Information in Public Administration Organisations

Authors: A. Capodieci, G. Del Fiore, L. Mainetti

Abstract:

Recently, the use of web 2.0 tools has increased in companies and public administration organisations. This phenomenon, known as "Enterprise 2.0", has, de facto, modified common organisational and operative practices. This has led “knowledge workers” to change their working practices through the use of Web 2.0 communication tools. Unfortunately, these tools have not been integrated with existing enterprise information systems, a situation that could potentially lead to a loss of information. This is an important problem in an organisational context, because knowledge of information exchanged within the organisation is needed to increase the efficiency and competitiveness of the organisation. In this article we demonstrate that it is possible to capture this knowledge using collaboration processes, which are processes of abstraction created in accordance with design patterns and applied to new organisational operative practices.

Keywords: Business Practices, Business Process Patterns, Collaboration Tools, Enterprise 2.0, Knowledge Workers.

3795 IVE: Virtual Humans AI Prototyping Toolkit

Authors: Cyril Brom, Zuzana Vlckova

Abstract:

The IVE toolkit has been created for facilitating research, education, and development in the field of virtual storytelling and computer games. Primarily, the toolkit is intended for modelling action selection mechanisms of virtual humans, investigating level-of-detail AI techniques for large virtual environments, and exploring joint behaviour and the role-passing technique (Sec. V). Additionally, the toolkit can be used as an AI middleware without any changes. The main facility of IVE is that it serves for prototyping both the AI and the virtual worlds themselves. The purpose of this paper is to describe IVE's features in general and to present our current work - including an educational game - on this platform.

Keywords: AI middleware, simulation, virtual world

3794 Human Health Risk Assessment from Metals Present in a Soil Contaminated by Crude Oil

Authors: M. A. Stoian, D. M. Cocarta, A. Badea

Abstract:

The main sources of soil pollution due to petroleum contaminants are industrial processes involving crude oil. Soil polluted with crude oil is toxic for plants, animals, and humans. Human exposure to the contaminated soil occurs through different exposure pathways: soil ingestion, diet, inhalation, and dermal contact. The present study is focused on soil contamination with heavy metals as a consequence of soil pollution with petroleum products. The human exposure pathways considered are accidental ingestion of contaminated soil and dermal contact. The purpose of the paper is to identify the human health risk (carcinogenic risk) from soil contaminated with heavy metals. The human exposure and risk were evaluated for five of the eleven contaminants of concern identified in the soil. Two soil samples were collected from a bioremediation platform in the Muntenia Region of Romania. The soil deposited on the bioremediation platform had been contaminated through oil extraction and processing. For the research work, two average soil samples from two different plots were analyzed: the first one was slightly contaminated with petroleum products (Total Petroleum Hydrocarbons (TPH) in soil of 1420 mg/kg d.w.), while the second one was highly contaminated (TPH in soil of 24306 mg/kg d.w.). In order to evaluate the risks posed by heavy metals due to soil pollution with petroleum products, five metals known as carcinogenic were investigated: Arsenic (As), Cadmium (Cd), Chromium VI (CrVI), Nickel (Ni), and Lead (Pb). Results of the chemical analysis performed on the samples collected from the contaminated soil evidence soil contamination with heavy metals as follows: As in Site 1 = 6.96 mg/kg d.w.; As in Site 2 = 11.62 mg/kg d.w.; Cd in Site 1 = 0.9 mg/kg d.w.; Cd in Site 2 = 1 mg/kg d.w.; CrVI = 0.1 mg/kg d.w. for both sites; Ni in Site 1 = 37.00 mg/kg d.w.; Ni in Site 2 = 42.46 mg/kg d.w.; Pb in Site 1 = 34.67 mg/kg d.w.; Pb in Site 2 = 120.44 mg/kg d.w. The concentrations of these metals exceed the normal values established in the Romanian regulation, but are smaller than the alert level for a less sensitive use of soil (industrial). Although the concentrations do not exceed the thresholds, the next step was to assess the human health risk posed by soil contamination with these heavy metals. The results for risk were compared with the acceptable value (10⁻⁶, according to the World Health Organization). As expected, the highest risk was identified for the soil with the higher degree of contamination: the Individual Risk (IR) was 1.11×10⁻⁵, compared with 8.61×10⁻⁶ for the less contaminated soil.

Keywords: Carcinogenic risk, heavy metals, human health risk assessment, soil pollution.

3793 Static and Dynamic Complexity Analysis of Software Metrics

Authors: Kamaljit Kaur, Kirti Minhas, Neha Mehan, Namita Kakkar

Abstract:

Software complexity metrics are used to predict critical information about the reliability and maintainability of software systems. Object-oriented software development requires a different approach to software complexity metrics. Object-oriented software metrics can be broadly classified into static and dynamic metrics. Static metrics give information at the code level, whereas dynamic metrics provide information on the actual runtime. In this paper, we discuss the various complexity metrics and the comparison between static and dynamic complexity.
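As a worked example of one static metric mentioned above, McCabe's cyclomatic complexity of a control-flow graph is V(G) = E - N + 2P; the graph sizes in the example are invented.

```python
def cyclomatic_complexity(edges: int, nodes: int, components: int = 1) -> int:
    """McCabe's cyclomatic complexity V(G) = E - N + 2P."""
    return edges - nodes + 2 * components

# Hypothetical control-flow graph of a function containing one if/else
# and one while loop: 7 nodes, 8 edges, a single connected component.
print(cyclomatic_complexity(edges=8, nodes=7))   # -> 3 (two decisions + 1)
```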

Keywords: Static Complexity, Dynamic Complexity, Halstead Metric, McCabe's Metric.

3792 Education Quality Development for Excellence Performance with Higher Education by Using COBIT 5

Authors: Kemkanit Sanyanunthana

Abstract:

The purpose of this research is to study the information technology management systems that support the education of five private universities in Thailand, based on case studies of universities that have been developing their quality and standards of management and education through the provision of information technology services in support of excellent performance. The concept of connecting information technology with a suitable system has been created by information technology administrators so that it can be used throughout the organization to obtain the utmost benefit from all resources. Hence, the researcher, as a person who has been performing these duties within higher education, is interested in conducting this research by selecting the Control Objectives for Information and Related Technology 5 (COBIT 5) together with the Malcolm Baldrige National Quality Award (MBNQA) of America, a national award that applies the concept of Total Quality Management (TQM) to organizational evaluation. Such an evaluation, called the Education Criteria for Performance Excellence (EdPEx), focuses on studying and comparing education quality development for excellent performance using COBIT 5 in terms of information technology, on studying the problems and obstacles of the investigation process for an information technology system, which is considered an instrument to drive all organizations toward excellent performance in information technology, and on serving as a model for evaluating and analyzing whether the process accords with the strategic information technology plans of the universities. This research is conducted as descriptive and survey research based on the case studies. Data were collected through questionnaires administered to administrators working in the information technology field, with the research documents related to change management as the main study material. It can be concluded that performance on the APO (Align, Plan and Organise) domain processes of the COBIT 5 framework, which emphasizes concordant governance and management of strategic plans for organizations, could reach only 95%. This might be because of restrictions such as organizational culture; therefore, the researcher has studied and analyzed the management of information technology in the universities as a whole, under their organizational structures, in order to reach performance in accordance with the overall APO domain, which would allow the determined strategic plans to be developed toward excellent information technology performance, and to apply a risk management system at the organizational level to every performance process, improving the effectiveness of information technology resource management so as to achieve the utmost benefit.

Keywords: COBIT 5, APO, EdPEx Criteria, MBNQA.

3791 Response Spectrum Transformation for Seismic Qualification Testing

Authors: Nouredine Bourahla, Farid Bouriche, Yacine Benghalia

Abstract:

Seismic qualification testing of equipment to be mounted on the upper storeys of buildings is very demanding in terms of floor spectra. The latter are characterized by high acceleration amplitudes within a narrow frequency band. This article presents a method which makes it possible to cover specified required response spectra beyond the shaking table's capability, by amplifying the acceleration amplitudes over an appropriate frequency range using a physical intermediate mounted on the platform of the shaker.

Keywords: floor spectra, response spectrum, seismic qualification testing, shaking table
