Search results for: Information Technology
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5701

3871 Requirements Engineering for Enterprise Applications Development: Seven Challenges in Higher Education Environment

Authors: Jamaludin Sallim

Abstract:

This paper describes the challenges of requirements engineering for developing enterprise applications in a higher education environment. The development activities include software implementation, maintenance, enhancement, and support for online transaction processing and overnight batch processing. Generally, an enterprise application suite for a higher education environment may include a Student Information System (SIS), an HR/Payroll system, financial systems, etc. However, the requirements engineering phases present many challenges in providing two distinct services: production processing support and systems development.

Keywords: enterprise applications development, enterprise information systems, business process, requirement engineering, requirement standards, software development activities, software requirement reviews.

3870 Theoretical Background of Dividend Taxation

Authors: Margareta Ilkova, Petr Teply

Abstract:

The article deals with dividends and their distribution to investors from a theoretical point of view. Some studies have analyzed the market reaction to dividend announcements and found that a change in dividend policy is associated with abnormal returns around the announcement date. Other studies have directly questioned investors about their dividend preferences and beliefs. Investors want dividends for many reasons (e.g., some explain the dividend preference by the existence of transaction costs; some prefer a dividend today because it is less risky; managers hold private information about the firm). The most controversial theory of dividend policy was developed by Modigliani and Miller (1961), who demonstrated that in perfect and complete capital markets dividend policy is irrelevant and the value of the company is independent of its payout policy. Nevertheless, real-world capital markets are imperfect because of asymmetric information, transaction costs, incomplete contracting possibilities and taxes.

Keywords: dividend distribution, taxation, payout policy, investor, Modigliani and Miller theorem

3869 Fuzzy Logic Based Improved Range Free Localization for Wireless Sensor Networks

Authors: Ashok Kumar, Vinod Kumar

Abstract:

Wireless Sensor Networks (WSNs) are used to monitor vast inaccessible regions through the deployment of a large number of sensor nodes in the sensing area. For the majority of WSN applications, the collected data need to be combined with the geographic information of their origin to be useful to the user; information received from remote Sensor Nodes (SNs) that are several hops away from the base station/sink is meaningless without knowledge of its source. In addition, the location information of SNs can also be used to develop new network protocols for WSNs that improve their energy efficiency and lifetime. In this paper, range-free localization protocols for WSNs are proposed. The proposed protocols are based on the weighted centroid localization technique, where the edge weights of SNs are decided by fuzzy logic inference applied to the received signal strength and link quality between the nodes. The fuzzification is carried out using (i) Mamdani, (ii) Sugeno, and (iii) combined Mamdani-Sugeno fuzzy logic inference. Simulation results demonstrate that the proposed protocols provide better node localization accuracy than conventional centroid-based localization protocols, despite the presence of unintentional interference from radio frequency (RF) sources operating in the same frequency band.
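
As an illustration of the weighted-centroid idea underlying these protocols, the following is a minimal Python sketch in which a simple hand-written scoring function stands in for the Mamdani/Sugeno fuzzy inference of the paper; the anchor positions, RSSI and LQI values are hypothetical.

```python
import numpy as np

def edge_weight(rssi_dbm, lqi):
    """Toy stand-in for the fuzzy inference: map RSSI (dBm) and LQI (0-255)
    to a weight in [0, 1]; stronger/cleaner links get larger weights."""
    rssi_score = np.clip((rssi_dbm + 90) / 40.0, 0.0, 1.0)   # -90..-50 dBm
    lqi_score = np.clip(lqi / 255.0, 0.0, 1.0)
    return 0.5 * rssi_score + 0.5 * lqi_score

def weighted_centroid(anchors, rssi, lqi):
    """Estimate the node position as the weighted centroid of anchor positions."""
    anchors = np.asarray(anchors, dtype=float)
    w = np.array([edge_weight(r, q) for r, q in zip(rssi, lqi)])
    return (w[:, None] * anchors).sum(axis=0) / w.sum()

# Hypothetical anchors at known positions with measured link metrics.
anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
rssi = [-55, -70, -80, -62]
lqi = [220, 180, 100, 200]
print(weighted_centroid(anchors, rssi, lqi))
```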

Keywords: localization, range free, received signal strength, link quality indicator, Mamdani fuzzy logic inference, Sugeno fuzzy logic inference.

3868 Applying Gibbs Sampler for Multivariate Hierarchical Linear Model

Authors: Satoshi Usami

Abstract:

Among various HLM techniques, the Multivariate Hierarchical Linear Model (MHLM) is desirable to use, particularly when multivariate criterion variables are collected and the covariance structure contains information valuable for data analysis. In order to reflect prior information or to obtain stable results when the sample size and the number of groups are not sufficiently large, Bayesian methods have often been employed in hierarchical data analysis. In these cases, although the Markov Chain Monte Carlo (MCMC) method is a rather powerful tool for parameter estimation, MCMC procedures have not been formulated for the MHLM. For this reason, this research presents concrete procedures for parameter estimation through the use of the Gibbs sampler. Lastly, several future topics for the use of the MCMC approach for HLM are discussed.
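
As an illustration of how a Gibbs sampler cycles through full conditional distributions, the following is a toy sketch for a univariate random-intercept model with known variances, not the multivariate HLM treated in the paper; the data and hyperparameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical grouped data: J groups, 20 observations each.
groups = [rng.normal(loc=m, scale=1.0, size=20) for m in (1.0, 2.5, 0.5, 3.0)]
J = len(groups)
sigma2, tau2 = 1.0, 1.0          # within- and between-group variances, treated as known

n = np.array([len(g) for g in groups])
ybar = np.array([g.mean() for g in groups])

theta = ybar.copy()              # group means
mu = ybar.mean()                 # grand mean
draws = []
for it in range(2000):
    # theta_j | mu : Normal(precision-weighted mean, 1/precision)
    prec = n / sigma2 + 1.0 / tau2
    mean = (n * ybar / sigma2 + mu / tau2) / prec
    theta = rng.normal(mean, np.sqrt(1.0 / prec))
    # mu | theta : Normal(mean(theta), tau2 / J), flat prior on mu
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / J))
    if it >= 500:                # discard burn-in
        draws.append(np.concatenate(([mu], theta)))

print(np.mean(draws, axis=0))    # posterior means of mu and theta_1..theta_J
```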

Keywords: Gibbs sampler, Hierarchical Linear Model, Markov Chain Monte Carlo, Multivariate Hierarchical Linear Model

3867 Blueprinting of a Normalized Supply Chain Processes: Results in Implementing Normalized Software Systems

Authors: Bassam Istanbouli

Abstract:

With technology evolving every day and with the increase in global competition, industries are always under pressure to be the best. They need to provide good-quality products at competitive prices, when and how the customer wants them. In order to achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise changes will be extremely expensive, slow and accompanied by many combinatorial effects. Those combinatorial effects impact the whole organizational structure from a management, financial, documentation and logistics perspective, and especially from the perspective of the Enterprise Resource Planning (ERP) information system. By applying the normalized systems concept/theory to segments of the supply chain, we believe these effects can be minimized, especially at the time of launching an organization-wide global software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement an existing ERP package for its business needs, and if its business processes are normalized and modular, then most probably this will yield a normalized and modular software system that can easily be modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of business processes in a software implementation project: if the blueprints created are normalized, then the software developers and configurators will use those modular blueprints to map them into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring and/or implementing a software system for an organization using two methods: the Software Development Lifecycle (SDLC) method and the Accelerated SAP (ASAP) implementation method. Both methods start with the customer requirements, then blueprint the business processes, and finally map those processes into a software system. Since those requirements and processes are the starting point of the implementation, normalizing those processes will result in normalized software.

Keywords: Blueprint, ERP, SDLC, Modular.

3866 Tree Based Decomposition of Sunspot Images

Authors: Hossein Mirzaee, Farhad Besharati

Abstract:

Solar sunspot rotation and latitudinal bands are studied based on intelligent computation methods. A combination of an image fusion method with quad tree decomposition is used to obtain quantitative values for the latitudes of the trajectories on the sun's surface around which sunspots rotate. Daily solar images taken with the SOlar and Heliospheric Observatory (SOHO) satellite are fused for each month separately. The fused image is then decomposed with the quad tree decomposition method in order to obtain precise information about the latitudes of sunspot trajectories. Such analysis is useful for gathering information about the regions on the sun's surface, and the corresponding coordinates in space, that are more exposed to solar geomagnetic storms, tremendous flares and hot plasma gases permeating interplanetary space, and it can help humans protect their technical systems. Here, sunspot images from September, October and November 2001 are used for studying the magnetic behavior of the sun.
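
The quad tree step can be illustrated with a minimal sketch that recursively splits an image block into quadrants until an intensity-homogeneity criterion is satisfied; the threshold, minimum block size and test image below are illustrative, not the values used by the authors.

```python
import numpy as np

def quadtree(block, x=0, y=0, min_size=8, threshold=10.0, leaves=None):
    """Recursively split `block` into quadrants until the intensity range
    inside a block falls below `threshold` or the block reaches `min_size`.
    Returns a list of (x, y, size) leaf blocks."""
    if leaves is None:
        leaves = []
    size = block.shape[0]
    if size <= min_size or block.max() - block.min() <= threshold:
        leaves.append((x, y, size))
        return leaves
    h = size // 2
    quadtree(block[:h, :h], x,     y,     min_size, threshold, leaves)
    quadtree(block[:h, h:], x + h, y,     min_size, threshold, leaves)
    quadtree(block[h:, :h], x,     y + h, min_size, threshold, leaves)
    quadtree(block[h:, h:], x + h, y + h, min_size, threshold, leaves)
    return leaves

# Hypothetical 256x256 fused solar image with a horizontal band that
# differs from the background, standing in for a sunspot latitude band.
img = np.zeros((256, 256))
img[100:120, :] = 200.0
print(len(quadtree(img)), "leaf blocks")
```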

Keywords: Quad tree decomposition, sunspot image.

3865 Thermography Evaluation on Facial Temperature Recovery after Elastic Gum

Authors: A. Dionísio, L. Roseiro, J. Fonseca, P. Nicolau

Abstract:

Thermography is a non-radiating and contact-free technology which can be used to monitor skin temperature. The efficiency and safety of thermography make it a useful tool for detecting and locating thermal changes in the skin surface, characterized by increases or decreases in temperature. This work intends to be a contribution to the use of thermography as a methodology for the evaluation of skin temperature in the context of orofacial biomechanics. The study aims to identify the oscillations of skin temperature in the left and right hemiface regions over the masseter muscle, during and after a thermal stimulus, and to estimate the time required to restore the initial temperature after the application of the stimulus. Using a FLIR T430sc camera, a data acquisition protocol was followed with a group of eight volunteers, aged between 22 and 27 years. The tests were performed in a controlled environment with the volunteers in a comfortable static position. The thermal stimulus involves the use of an ice volume with controlled size and contact surface. The skin surface temperature was recorded in two distinct situations, namely without any further stimulus and with the addition of a stimulus obtained by chewing gum. The data obtained were processed using FLIR ResearchIR Max software. The time required to recover the initial temperature ranged from 20 to 52 minutes when no stimulus was added and varied between 8 and 26 minutes with the chewing-gum stimulus. These results show that recovery is faster with the addition of the stimulus and may guide clinicians regarding pre- and post-operative times with ice therapy, in the presence or absence of a mechanical stimulus that increases muscle function (e.g. phonetics or mastication).

Keywords: Thermography, orofacial biomechanics, skin temperature, ice therapy.

3864 Intelligent Agent Communication by Using DAML to Build Agent Community Ontology

Authors: Cheng-Hsiung Hung, Hong-Jie Dai, Jason Jen-Yen Chen

Abstract:

This paper presents a new approach to intelligent agent communication based on an ontology for an agent community. The DARPA Agent Markup Language (DAML) is used to build the community ontology. This paper extends the agent management specification of the Foundation for Intelligent Physical Agents (FIPA) to develop an agent role called community facilitator (CF) that manages the community directory and community ontology. The CF helps build the agent community. Precise description of agent services in this community can thus be achieved, which facilitates agent communication. Furthermore, through ontology updates, agents with different ontologies are capable of communicating with each other. An example of an advanced traveler information system is included to illustrate the practicality of this approach.

Keywords: Intelligent agent communication, DARPA agent markup language (DAML), Community ontology, Advanced Traveler Information System (ATIS).

3863 A Fast Sign Localization System Using Discriminative Color Invariant Segmentation

Authors: G.P. Nguyen, H.J. Andersen

Abstract:

Building intelligent traffic guidance systems has recently become an interesting subject. A good system should be able to observe all important visual information in order to analyze the context of the scene. To do so, signs in general, and traffic signs in particular, are usually taken into account, as they contain rich information for these systems. Therefore, many researchers have put effort into the sign recognition field. Sign localization, or sign detection, is the most important step in the sign recognition process. This step filters out non-informative areas in the scene and locates candidates for later steps. In this paper, we apply a new approach to detecting sign locations using a new color invariant model. Experiments are carried out with different datasets introduced in other works whose authors noted the difficulty of detecting signs under unfavorable imaging conditions. Our method is simple and fast, and most importantly it gives a high detection rate in locating signs.
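
As a rough illustration of color-based sign localization, the following sketch uses normalized rg-chromaticity (a simple illumination-reducing transform) rather than the discriminative color invariant model of the paper; the red-dominance threshold and the test image are hypothetical.

```python
import numpy as np
from scipy import ndimage

def red_sign_candidates(rgb, r_thresh=0.45, min_area=50):
    """Return labelled candidate regions whose normalized red chromaticity
    r = R / (R + G + B) exceeds `r_thresh`."""
    rgb = rgb.astype(float)
    s = rgb.sum(axis=2) + 1e-6            # avoid division by zero
    r = rgb[..., 0] / s                   # chromaticity is less sensitive to
    mask = r > r_thresh                   # changes in illumination intensity
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    keep = [i + 1 for i, a in enumerate(sizes) if a >= min_area]
    return labels, keep

# Hypothetical image with a reddish square on a gray background.
img = np.full((120, 160, 3), 90, dtype=np.uint8)
img[30:70, 40:80] = (200, 40, 40)
labels, regions = red_sign_candidates(img)
print("candidate regions:", regions)
```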

Keywords: Sign localization, color-based segmentation.

3862 Food Security Model and the Role of Community Empowerment: The Case of a Marginalized Village in Mexico, Tatoxcac, Puebla

Authors: Marco Antonio Lara De la Calleja, María Catalina Ovando Chico, Eduardo Lopez Ruiz

Abstract:

Community empowerment has proved to be a key element in the solution of the food security problem. A conceptual analysis found that agricultural production, economic development and governance are the traditional bases of food security models. Although the literature points to social inclusion as an important factor for food security, no model has considered it as a foundation. The aim of this research is to identify the different dimensions that make up an integral model for food security, with emphasis on community empowerment. A diagnosis was made in the study community (Tatoxcac, Zacapoaxtla, Puebla) to identify the aspects that affect the level of food insecurity. With a statistical sample of 200 families, the Latin American and Caribbean Food Security Scale (ELCSA) was applied, finding that households composed of adults and children have moderate food insecurity (the ELCSA scale has three levels: low, moderate and high); this result is produced mainly by the households' income capacity and the diversity of their diet. Based on this, a model was developed to promote food security through five dimensions: 1. regional context of the community; 2. structure and system of local food; 3. health and nutrition; 4. information and technology access; and 5. self-awareness and empowerment. The specific actions on each axis of the model allowed the systemic approach needed to address food security in the community through the empowerment of society. It is concluded that the self-awareness of local communities is an area of extreme importance, which must be taken into account in participatory schemes to improve food security. In the long term, the model requires the integrated participation of different actors, such as government, companies and universities, to solve something as vital as food security.

Keywords: Community empowerment, food security, model, systemic approach.

3861 A Comparative Study of Page Ranking Algorithms for Information Retrieval

Authors: Ashutosh Kumar Singh, Ravi Kumar P

Abstract:

This paper gives an introduction to Web mining, then describes Web structure mining in detail and explores the data structure used by the Web. It also explores different PageRank algorithms and compares the algorithms used for information retrieval. In the Web mining part, the basics of Web mining and the Web mining categories are explained. Different PageRank-based algorithms such as PageRank (PR), Weighted PageRank (WPR), HITS (Hyperlink-Induced Topic Search), DistanceRank and DirichletRank are discussed and compared. PageRank values are calculated for the PageRank and Weighted PageRank algorithms for a given hyperlink structure. A simulation program is developed for the PageRank algorithm because PageRank is the only one of these ranking algorithms implemented in the Google search engine. The outputs are shown in table and chart format.
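
The basic PageRank computation compared in the paper can be sketched as a short power iteration; the damping factor of 0.85 is the conventional choice and the four-page hyperlink graph below is hypothetical.

```python
import numpy as np

def pagerank(links, d=0.85, tol=1e-8, max_iter=100):
    """links[i] = list of pages that page i links to (0-indexed)."""
    n = len(links)
    pr = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        new = np.full(n, (1.0 - d) / n)
        for i, outs in enumerate(links):
            if outs:                       # distribute rank along out-links
                new[np.array(outs)] += d * pr[i] / len(outs)
            else:                          # dangling page: spread uniformly
                new += d * pr[i] / n
        if np.abs(new - pr).sum() < tol:
            return new
        pr = new
    return pr

# Hypothetical 4-page web graph: 0->1,2  1->2  2->0  3->0,2
print(pagerank([[1, 2], [2], [0], [0, 2]]))
```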

Keywords: Web Mining, Web Structure, Web Graph, Link Analysis, PageRank, Weighted PageRank, HITS, DistanceRank, DirichletRank.

3860 Unified Fusion Approach with Application to SLAM

Authors: Xinde Li, Xinhan Huang, Min Wang

Abstract:

In this paper, we propose a pre-processor based on the Evidence Supporting Measure of Similarity (ESMS) filter and also propose a unified fusion approach (UFA) based on a general fusion machine coupled with the ESMS filter, which improves the correctness and precision of information fusion in any field of application. Here we mainly apply the new approach to Simultaneous Localization And Mapping (SLAM) for Pioneer II mobile robots. A simulation experiment was performed in which an autonomous virtual mobile robot with sonar sensors evolves in a virtual world map with obstacles. Comparing the map built by the general fusion machine (here a DSmT-based fusion machine with a PCR5-based conflict redistributor is considered) coupled with the ESMS filter against the map built without the ESMS filter shows the benefit of source selection as a prerequisite for improving information fusion, and also testifies to the superiority of the UFA in dealing with SLAM.

Keywords: DSmT, ESMS filter, SLAM, UFA

3859 GPT Onto: A New Beginning for Malaysia Gross Pollutant Trap Ontology

Authors: Chandrika M.J., Lariyah M.S., Alicia Y.C. Tang

Abstract:

Ontology is widely used as a tool for organizing information and creating relations between the subjects within a defined knowledge domain. Various fields such as civil engineering, biology and management have successfully integrated ontologies into decision support systems for managing domain knowledge and assisting their decision makers. Gross pollutant traps (GPTs) are devices used to trap large items or hazardous particles and prevent them from polluting and entering our waterways. However, choosing and specifying a GPT is a challenge in Malaysia, as there are inadequate GPT data repositories being captured and shared. Hence, an ontology is needed to capture, organize and represent this knowledge as meaningful information, which can contribute to the efficiency of GPT selection in Malaysian urbanization. A GPT ontology framework is therefore built as the first step to capture GPT knowledge, which will then be integrated into the decision support system. This paper provides several examples of the GPT ontology and explains how it is constructed using the Protégé tool.

Keywords: Gross pollutant Trap, Ontology, Protégé.

3858 Wireless Sensor Networks for Swiftlet Farms Monitoring

Authors: Al-Khalid Othman, Wan A. Wan Zainal Abidin, Kee M. Lee, Hushairi Zen, Tengku. M. A. Zulcaffle, Kuryati Kipli

Abstract:

This paper provides an in-depth study of a Wireless Sensor Network (WSN) application to monitor and control the swiftlet habitat. A complete system is designed and developed, including the hardware design of the nodes, Graphical User Interface (GUI) software, the sensor network, and interconnectivity for remote data access and management. A system architecture is proposed to address the requirements for habitat monitoring. Such application-driven design identifies important areas of further work in data sampling, communications and networking. For this monitoring system, an MTS400 sensor node, IRIS and MICAz radio transceivers, and a USB-interfaced gateway base station from Crossbow (Xbow) Technology are employed. The GUI of this monitoring system is written using the Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) along with the Xbow Technology drivers provided by National Instruments. As a result, this monitoring system is capable of collecting data and presenting it in both tables and waveform charts for further analysis. The system is also able to send notification messages by email, provided Internet connectivity is available, whenever changes in the habitat at remote sites (swiftlet farms) occur. Other functions implemented in this system are a database for record-keeping and management purposes and remote access through the Internet using LogMeIn software. Finally, this research concludes that a WSN for monitoring swiftlet habitat can be used effectively to monitor and manage the swiftlet farming industry in Sarawak.

Keywords: Swiftlet, WSN, Habitat Monitoring, Networking.

3857 The Comparison of Anchor and Star Schema from a Query Performance Perspective

Authors: Radek Němec

Abstract:

Today's business environment requires that companies have access to highly relevant information in a matter of seconds. Modern Business Intelligence tools rely on data structured mostly in traditional dimensional database schemas, typically represented by star schemas. Dimensional modeling is already recognized as a leading industry standard in the field of data warehousing, although several drawbacks and pitfalls have been reported. This paper focuses on the analysis of another data warehouse modeling technique - anchor modeling - and its characteristics in comparison with the standard dimensional modeling technique from a query performance perspective. The results of the analysis provide information about the performance of queries executed on database schemas structured according to the principles of each modeling technique.

Keywords: Data warehousing, anchor modeling, star schema, anchor schema, query performance.

3856 Optimal Aggregate Production Planning with Fuzzy Data

Authors: Wen-Lung Huang, Shih-Pin Chen

Abstract:

This paper investigates the optimization problem of multi-product aggregate production planning (APP) with fuzzy data. From the comprehensive viewpoint of conserving the fuzziness of the input information, this paper proposes a method that can completely describe the membership function of the performance measure. The idea is based on the well-known Zadeh's extension principle, which plays an important role in fuzzy theory. In the proposed solution procedure, a pair of mathematical programs parameterized by the possibility level α is formulated to calculate the bounds of the optimal performance measure at α. The membership function of the optimal performance measure is then constructed by enumerating different values of α. Solutions obtained from the proposed method contain more information and offer more chance of achieving a feasible disaggregate plan. This is helpful to the decision-maker in practical applications.
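
The α-cut bounding idea behind the extension-principle procedure can be illustrated with a toy single-product example: for each possibility level α, two linear programs are solved with demand fixed at the lower and upper ends of its α-cut, giving bounds on the optimal cost. The cost figures, capacity and triangular fuzzy demand below are hypothetical and much simpler than the paper's multi-product model.

```python
from scipy.optimize import linprog

# Hypothetical data: regular/overtime unit costs, regular capacity, and a
# triangular fuzzy demand (low, modal, high).
d_low, d_mid, d_high = 80.0, 100.0, 130.0

def optimal_cost(demand):
    """Min-cost plan meeting `demand` with regular (capped) and overtime
    production; returns the optimal objective value."""
    c = [20.0, 30.0]                              # regular, overtime cost
    A_ub = [[-1.0, -1.0]]                         # x_reg + x_ot >= demand
    b_ub = [-demand]
    bounds = [(0, 90.0), (0, None)]               # regular capacity = 90
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for alpha in (0.0, 0.5, 1.0):
    lo = d_low + alpha * (d_mid - d_low)          # α-cut of the fuzzy demand
    hi = d_high - alpha * (d_high - d_mid)
    # Cost is non-decreasing in demand, so these give lower/upper bounds.
    print(f"alpha={alpha:.1f}: cost in [{optimal_cost(lo):.1f}, {optimal_cost(hi):.1f}]")
```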

Keywords: fuzzy data, aggregate production planning, membership function, parametric programming

3855 Holistic Approach to Teaching Mathematics in Secondary School as a Means of Improving Students’ Comprehension of Study Material

Authors: Natalia Podkhodova, Olga Sheremeteva, Mariia Soldaeva

Abstract:

Creating favourable conditions for students' comprehension of mathematical content is one of the primary problems in teaching mathematics in secondary school. Comprehension includes the ability to build a working situational model and thus becomes an important means of solving mathematical problems. This paper describes a holistic approach to teaching mathematics designed to address the primary challenges of such teaching, specifically the challenge of students' comprehension. Essentially, this approach consists of (1) establishing links between the attributes of a notion: the sense, the meaning, and the term; (2) taking into account the components of the student's subjective experience (value-based and emotional, contextual, procedural and communicative) during the educational process; (3) linking together different ways of presenting mathematical information; (4) identifying and leveraging the relationships between real, perceptual and conceptual (scientific) mathematical spaces by applying real-life situational modelling. The article describes approaches to the practical use of these foundational concepts. The primary goal was to identify how the proposed methods and techniques influence understanding of the material used in teaching mathematics. The study included an experiment in which 256 secondary school students took part: 142 in the study group and 114 in the control group. All students in these groups had similar levels of achievement in mathematics and studied it under the same curriculum. In the course of the experiment, comprehension of two topics, "Derivative" and "Trigonometric functions", was evaluated. Control group participants were taught using traditional methods. Students in the study group were taught using the holistic method: under the teacher's guidance, they carried out assignments designed to establish linkages between a notion's characteristics and to convert information from one mode of presentation to another, as well as assignments that required the ability to operate with all modes of presentation. Identification, accounting for and transformation of subjective experience were associated with methods of stimulating the emotional-value component of the studied mathematical content (discussions of lesson titles, assignments aimed at creating study dominants, performing theme-related physical exercises, etc.). The use of techniques that form inter-subject notions based on linkages between real, perceptual and conceptual mathematical spaces proved to be of special interest to the students. Results of the experiment were analysed by presenting students in each of the groups with a final test on each of the studied topics. The test included assignments that required building real situational models. Statistical analysis was used to aggregate the test results. Pearson's χ² criterion was used to test the statistical significance of the results (pass/fail on the modelling test). A significant difference in results was revealed (p < 0.001), which allowed the conclusion that students in the study group showed better comprehension of mathematical information than those in the control group. The total number of assignments completed by each student was analysed as well, with average results calculated for each group. The statistical significance of the difference against this quantitative criterion (number of completed assignments) was determined using Student's t-test, which showed that students in the study group completed significantly more assignments than those in the control group (p = 0.0001).
The authors thus conclude that the observed increase in comprehension of the study material took place as a result of applying the proposed methods and techniques.
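
The two tests described above (Pearson's χ² on pass/fail counts of the modelling test and Student's t-test on the numbers of completed assignments) can be reproduced with standard statistical routines; the counts below are hypothetical placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical pass/fail counts on the final modelling test.
#              pass  fail
observed = [[110,  32],    # study group (n = 142)
            [ 60,  54]]    # control group (n = 114)
chi2, p_chi2, dof, _ = stats.chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_chi2:.4f}")

# Hypothetical numbers of completed assignments per student.
rng = np.random.default_rng(1)
study = rng.poisson(8, size=142)
control = rng.poisson(6, size=114)
t, p_t = stats.ttest_ind(study, control, equal_var=False)
print(f"t = {t:.2f}, p = {p_t:.4f}")
```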

Keywords: Comprehension of mathematical content, holistic approach to teaching mathematics in secondary school, subjective experience, technology of the formation of inter-subject notions.

3854 Heterogeneity-Aware Load Balancing for Multimedia Access over Wireless LAN Hotspots

Authors: Yen-Cheng Chen, Gong-Da Fang

Abstract:

Wireless LAN (WLAN) access in public hotspot areas has become popular in recent years. Since more and more multimedia information is available on the Internet, there is an increasing demand for accessing multimedia information through WLAN hotspots. Currently, the bandwidth offered by an IEEE 802.11 WLAN cannot support many simultaneous real-time video accesses. A possible way to increase the offered bandwidth in a hotspot is the use of multiple access points (APs). However, a mobile station is usually connected to the WLAN AP with the strongest received signal strength indicator (RSSI), so the total consumed bandwidth cannot be fairly allocated among the APs. In this paper, we propose an effective load-balancing scheme supported by the IAPP and SNMP in the APs. The proposed scheme is an open solution and does not require any changes to either the wireless stations or the APs. This makes load balancing possible in WLAN hotspots, where a variety of heterogeneous mobile devices are employed.

Keywords: Wireless LAN, Load balancing, IAPP, SNMP.

3853 Adaptive Path Planning for Mobile Robot Obstacle Avoidance

Authors: Rong-Jong Wai, Chia-Ming Liu

Abstract:

Generally speaking, a mobile robot is capable of sensing its surrounding environment, interpreting the sensed information to obtain knowledge of its location and the environment, and planning a real-time trajectory to reach its target. In this process, obstacle avoidance is a fundamental challenge. Thus, in this study an adaptive path-planning control scheme is designed for the obstacle avoidance of a mobile robot without requiring detailed environmental information, a large memory size or a heavy computational burden. In this scheme, the robot can gradually approach its target according to a motion-tracking mode, an obstacle-avoidance mode, a self-rotation mode, and robot state selection. The effectiveness of the proposed adaptive path-planning control scheme is verified by numerical simulations of a differential-drive mobile robot in the presence of obstacles of various shapes.
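
A minimal sketch of the mode-selection idea is given below, choosing among the motion-tracking, obstacle-avoidance and self-rotation modes from range readings and heading error; the thresholds and readings are illustrative and not taken from the paper.

```python
def select_mode(front_range, side_range, heading_error,
                d_safe=0.5, d_warn=1.0, align_tol=0.2):
    """Pick a behaviour mode for a differential-drive robot.

    front_range / side_range: distances (m) from range sensors.
    heading_error: angle (rad) between the robot heading and the goal direction.
    """
    if front_range < d_safe and side_range < d_safe:
        return "self-rotation"        # boxed in: rotate in place to find free space
    if front_range < d_warn:
        return "obstacle-avoidance"   # steer along the freer side
    if abs(heading_error) > align_tol:
        return "self-rotation"        # realign toward the goal first
    return "motion-tracking"          # clear path: track the goal trajectory

for reading in [(2.0, 2.0, 0.05), (0.8, 1.5, 0.0), (0.3, 0.3, 1.0)]:
    print(reading, "->", select_mode(*reading))
```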

Keywords: Adaptive Path Planning, Mobile Robot Obstacle Avoidance

3852 Potentials and Influencing Factors of Dynamic Pricing in Business: Empirical Insights of European Experts

Authors: Christopher Reichstein, Ralf-Christian Härting, Martina Häußler

Abstract:

With the continuously increasing speed of information exchange on the World Wide Web, retailers in the e-commerce sector face immense possibilities regarding online purchase processes such as dynamic price setting. By using Dynamic Pricing, retailers are able to make short-term price changes in order to optimize producer surplus. This empirical research covers the basics of Dynamic Pricing and identifies six factors that influence it. The results of a structural equation modeling approach show five main drivers that increase the potential of dynamic price setting in e-commerce: knowledge of customers' individual willingness to pay, rising sales, the possibility of customization, data volume, and information about competitors' pricing strategy.

Keywords: E-commerce, empirical research, experts, Dynamic Pricing (DP), influencing factors, potentials.

3851 A Review: Comparative Study of Diverse Collection of Data Mining Tools

Authors: S. Sarumathi, N. Shanthi, S. Vidhya, M. Sharmila

Abstract:

There have been many efforts and much research undertaken to develop efficient tools for performing data mining tasks. Due to the massive amount of information embedded in the huge data warehouses maintained in several domains, the extraction of meaningful patterns is no longer straightforward, which makes the development of dedicated data mining tools all the more necessary. Furthermore, the major aim of data mining software is to build a resourceful predictive or descriptive model for handling large amounts of information more efficiently and in a user-friendly way. Data mining mainly deals with very large collections of data that impose rigorous computational constraints. These challenges have led to the emergence of powerful data mining technologies. In this survey, a diverse collection of data mining tools is described and contrasted with respect to the salient features and performance behavior of each tool.

Keywords: Business Analytics, Data Mining, Data Analysis, Machine Learning, Text Mining, Predictive Analytics, Visualization.

3850 Marketing and Commercial Activities Offered on Websites of European Union Banks

Authors: Mario Spremić, Natalija Kokolek, Božidar Jaković, Jurica Šimurina

Abstract:

This paper deals with various questions related to the functionality and provision of banking services over the Internet in the European Union. Since we live in the information technology era, the Internet has become a new space for economic and business activities in all areas, and it is especially important in banking. Given the busy tempo of life, electronic banking has over the past several years become a necessity for most users of banking services. On a sample of 300 websites of banks operating in the European Union (EU), we conduct research on the functionality of e-banking services offered through bank websites, with the key objective of revealing to what extent information technologies are used in their business operations. Characteristics of EU bank websites are examined and compared with the basic groups of business activities on the web. Some recommendations for successful bank websites are also provided.

Keywords: Electronic banking, electronic business, European Union banks, internet.

3849 A POX Controller Module to Prepare a List of Flow Header Information Extracted from SDN Traffic

Authors: Wisam H. Muragaa, Kamaruzzaman Seman, Mohd Fadzli Marhusin

Abstract:

Software Defined Networking (SDN) is a paradigm designed to facilitate controlling the network dynamically and with more agility. Network traffic is a set of flows, each of which contains a set of packets. In SDN, a matching process is performed on every packet coming into the network at the SDN switch, and only the headers of new packets are forwarded to the SDN controller. In this terminology, the flow header fields are called tuples. Basically, these form a 5-tuple: the source and destination IP addresses, the source and destination ports, and the protocol number. This flow information is used to provide an overview of the network traffic. Our module is meant to extract this 5-tuple, together with the packet and flow counts, and present them as a list. This list can therefore be used as a first step toward detecting DDoS attacks, and the module can be considered the initial stage of any flow-based DDoS detection method.
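
A minimal sketch of such a POX component is shown below; it registers a PacketIn listener and accumulates 5-tuples with per-flow packet counts. The module structure and variable names are illustrative, not the authors' code.

```python
# Run as a POX component, e.g.: ./pox.py forwarding.l2_learning <this_module>
from pox.core import core

log = core.getLogger()
flow_table = {}   # 5-tuple -> packet count

def _handle_PacketIn(event):
    packet = event.parsed
    ip = packet.find('ipv4')
    if ip is None:
        return                              # ignore non-IP traffic
    tcp = packet.find('tcp')
    udp = packet.find('udp')
    sport = dport = 0
    if tcp is not None:
        sport, dport = tcp.srcport, tcp.dstport
    elif udp is not None:
        sport, dport = udp.srcport, udp.dstport
    five_tuple = (str(ip.srcip), str(ip.dstip), sport, dport, ip.protocol)
    flow_table[five_tuple] = flow_table.get(five_tuple, 0) + 1
    log.info("flows=%d  %s -> %d pkts",
             len(flow_table), five_tuple, flow_table[five_tuple])

def launch():
    # Called by POX when the component is loaded.
    core.openflow.addListenerByName("PacketIn", _handle_PacketIn)
```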

Keywords: Matching, OpenFlow tables, POX controller, SDN, table-miss.

3848 The Use of Information and Communication Technologies in Electoral Procedures: Comments on Electronic Voting Security

Authors: Magdalena Musiał-Karg

Abstract:

The expansion of telecommunications and the progress of electronic media are important elements of our times. The recent worldwide convergence of information and communication technologies (ICT) and the dynamic development of mass media are leading to noticeable changes in the functioning of contemporary states and societies. Modern technologies currently play more and more important roles and filter down to almost every field of contemporary human life. This results in a growth of online interactions, as can be observed in the remarkable increase in the number of people with home PCs and Internet access. Proof of this is undoubtedly the emergence and use of concepts such as e-society, e-banking, e-services, e-government, e-participation and e-democracy. The newly coined word e-democracy shows that modern technologies are also widely used in politics. Without any doubt, in most countries all actors in the political market (politicians, political parties, civil servants in the public sector, the media) use modern forms of communication with society. Most of these technologies improve the processes of getting and sending information to citizens, communication with the electorate, and also – which seems to be the biggest advantage – electoral procedures. Thanks to the implementation of ICT, the interaction between politicians and the electorate is improved. The main goal of this text is to analyze electronic voting (e-voting) as one of the important forms of electronic democracy in terms of its security aspects. The author aims to answer questions about the security of electronic voting as an additional form of participation in elections and referenda.

Keywords: Electronic democracy, electronic participation, electronic voting, security of e-voting, ICT.

3847 Studies on Properties of Knowledge Dependency and Reduction Algorithm in Tolerance Rough Set Model

Authors: Chen Wu, Lijuan Wang

Abstract:

The relationships among tolerance classes, indispensable attributes and knowledge dependency in the rough set model with a tolerance relation are explored. After giving definitions and concepts of knowledge dependency and the knowledge dependency degree for incomplete information systems in the tolerance rough set model, distinguishing whether or not the decision attribute contains missing values, it is proved that complete knowledge dependency preserves the reflexivity, transitivity, augmentation, decomposition and merge laws. Knowledge dependency degrees (as opposed to complete knowledge dependency) satisfy only some of these laws under the transitivity, augmentation and decomposition operations. An algorithm for attribute reduction in an incomplete decision table is designed, and its correctness is checked with an example.
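
The tolerance relation for an incomplete information system can be sketched as follows: two objects are tolerant on an attribute set when every attribute value either matches or is missing ('*'). The tiny decision table below is hypothetical.

```python
def tolerant(x, y, attrs):
    """True if objects x and y agree on every attribute in attrs,
    treating the missing value '*' as compatible with anything."""
    return all(x[a] == y[a] or x[a] == '*' or y[a] == '*' for a in attrs)

def tolerance_classes(objects, attrs):
    """For each object, the set of objects tolerant with it (not a partition)."""
    return {i: {j for j, y in enumerate(objects) if tolerant(x, y, attrs)}
            for i, x in enumerate(objects)}

# Hypothetical incomplete decision table: condition attributes a1, a2, decision d.
table = [
    {'a1': 'high', 'a2': 'yes', 'd': 1},
    {'a1': 'high', 'a2': '*',   'd': 1},
    {'a1': 'low',  'a2': 'no',  'd': 0},
    {'a1': '*',    'a2': 'no',  'd': 0},
]
print(tolerance_classes(table, ['a1', 'a2']))
```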

Keywords: Incomplete information system, rough set, tolerance relation, knowledge dependence, attribute reduction.

3846 Multiparametric Optimization of Water Treatment Process for Thermal Power Plants

Authors: B. Mukanova, N. Glazyrina, S. Glazyrin

Abstract:

The problem of optimizing the technological process of water treatment for thermal power plants is considered in this article. The problem is multiparametric in nature. To optimize the process, namely to reduce the amount of wastewater, a new technology was developed to reuse such water, and a mathematical model of this wastewater-reuse technology was developed. Optimization parameters were determined. The model consists of a material balance equation, an equation describing the kinetics of ion exchange for the non-equilibrium case, and an equation for the ion exchange isotherm. The material balance equation includes a nonlinear term that depends on the kinetics of ion exchange. The direct problem of calculating the impurity concentration at the outlet of the water treatment plant was solved numerically, using an implicit point-to-point difference scheme. The inverse problem was formulated as the determination of the parameters of the mathematical model of the water treatment plant operating under non-equilibrium conditions, and it was solved. Based on the calculation results, the start time of the filter regeneration process was determined, as well as the duration of the regeneration process and the amount of regeneration and wash water. Multiparametric optimization of the water treatment process for thermal power plants allowed the amount of wastewater to be decreased by 15%.

Keywords: Direct problem, multiparametric optimization, optimization parameters, water treatment.

3845 Perceived Risks in Business-to-Consumer Online Contracts: An Empirical Study in Saudi Arabia

Authors: Shaya Alshahrani

Abstract:

Perceived risks play a major role in consumers' intentions, behaviors, attitudes, and decisions about online shopping in the KSA. This paper empirically investigates the influence of six perceived risk dimensions on Saudi consumers: product risk, information risk, financial risk, privacy and security risk, delivery risk, and terms and conditions risk. To ensure the success of this study, a random survey was distributed to capture consumers' perceived risk and to enable generalization of the results. Data were collected from 323 respondents in the Kingdom of Saudi Arabia (KSA): 50 who had never shopped online and 273 who had done so. The results indicated that all six risks influenced the respondents' perceptions of online shopping. The non-online shoppers perceived financial and delivery risks as the most significant barriers to online shopping, followed closely by performance, information, and privacy and security risks; terms and conditions risk was perceived as less significant. The online consumers considered delivery and performance risks to be the most significant influences on Internet shopping, followed closely by information and terms and conditions risks; financial and privacy and security risks were perceived as less significant. This paper argues that there is an urgent need for adequate legal solutions to address the problems identified in this study. Such solutions may enhance consumer trust in the KSA online market, increase consumers' intentions regarding online shopping, and improve consumer protection.

Keywords: Perceived risk, consumer protection, online shopping, Saudi Arabia, online contracts, e-commerce.

3844 DWT Based Image Steganalysis

Authors: Indradip Banerjee, Souvik Bhattacharyya, Gautam Sanyal

Abstract:

Steganalysis has become a challenging and attractive research interest with the development of information hiding techniques. It is the procedure of detecting hidden information in a stego object created by a known steganographic algorithm. In this paper, a novel feature-based image steganalysis technique is proposed. Various statistical moments are used along with some similarity metrics. The proposed steganalysis technique is designed based on transformation in four wavelet domains: Haar, Daubechies, Symlets and Biorthogonal. Each domain is subjected to various classifiers, namely K-nearest-neighbor, K* classifier, locally weighted learning, naive Bayes classifier, neural networks, decision trees and support vector machines. The experiments are performed on a large set of pictures that are freely available in image databases. The system also predicts different message length definitions.
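
A minimal sketch of the feature-extraction step is shown below: a single-level 2-D DWT in each of the four wavelet families named above, with statistical moments of each sub-band concatenated into a feature vector. The specific Symlet and Biorthogonal wavelet names and the test image are illustrative assumptions.

```python
import numpy as np
import pywt
from scipy.stats import skew, kurtosis

def dwt_moment_features(image, wavelets=("haar", "db4", "sym4", "bior1.3")):
    """Single-level 2-D DWT per wavelet family; mean, variance, skewness and
    kurtosis of each sub-band are concatenated into one feature vector."""
    feats = []
    for w in wavelets:
        cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), w)
        for band in (cA, cH, cV, cD):
            v = band.ravel()
            feats += [v.mean(), v.var(), skew(v), kurtosis(v)]
    return np.array(feats)

# Hypothetical cover image; a real steganalyser would compare such vectors
# for cover vs. stego images using the classifiers listed in the abstract.
img = np.random.default_rng(0).integers(0, 256, size=(128, 128))
print(dwt_moment_features(img).shape)   # 4 wavelets x 4 bands x 4 moments
```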

Keywords: Steganalysis, Moments, Wavelet Domain, KNN, K*, LWL, Naive Bayes Classifier, Neural networks, Decision trees, SVM.

3843 Podcasting as an Instructional Method: Case Study of a School Psychology Class

Authors: Jeff A. Tysinger, Dawn P. Tysinger

Abstract:

There has been considerable growth in online learning, and researchers continue to explore the impact of various methods of delivery. Podcasting is a popular method for sharing information. The purpose of this study was to examine student motivation and the perceived acquisition of knowledge in an online environment for a skill-based class. Twenty-five students in a school psychology graduate class completed a pretest and posttest examining podcast use and familiarity. In addition, at the completion of the course they were administered a modified version of the Instructional Materials Motivation Survey, and its four subscales (attention, relevance, confidence, and satisfaction) were examined. Results indicated that students are motivated, perceive podcasts as positive instructional tools, and are successful in acquiring the needed information. Additional benefits of using podcasts and recommendations for school psychology training are discussed.

Keywords: Motivation, online learning, pedagogy, podcast.

3842 Interactive Garments: Flexible Technologies for Textile Integration

Authors: Anupam Bhatia

Abstract:

A review of the literature and the practical work done in the field of e-textiles shows that applications of wearable technologies have grown steadily in the military, medical, industrial and sports fields, whereas fashion is at a loss to know how to treat this technology and bring it to market. The purpose of this paper is to understand the practical issues of integrating electronics in garments (cutting patterns for mass production, maintaining the basic properties of textiles, and the daily maintenance of garments) that hinder the wide adoption of interactive fabric technology within fashion and leisure wear. To understand these practical hindrances, an experimental and laboratory approach is taken. "Techno Meets Fashion" has been an interactive fashion project in which sensor technologies have been embedded in textiles, resulting in a set of ensembles such as light-emitting garments, sound-sensing garments, proximity garments, shape-memory garments, etc. Smart textiles, especially in the form of textile interfaces, are drastically underused in fashion and other lifestyle product design. Clothing and some other textile products must be washable, which subjects the interactive elements to water and chemical immersion, physical stress, and extreme temperatures; the current state of the art tends to be too fragile for this treatment. The process of mass producing traditional textiles also becomes difficult for interactive textiles, as cutting patterns from larger rolls of cloth and sewing them together to make garments breaks and reforms electronic connections in an uncontrolled manner. Because of this, interactive fabric elements are integrated by hand into textiles produced by standard methods. The Arduino has certainly made embedding electronics into textiles much easier than before; even so, electronics are not yet integral to daily-wear garments. Soft and flexible MEMS interfaces (micro-sensors and micro-actuators) can be an option to make this possible by blending electronics into e-textiles in a way that is seamless and still retains the functions of the circuits as well as the garment. Smart clothes, which simultaneously offer challenging design and utility value, can only be mass produced if the demands of the body are taken care of, i.e. protection, anthropometry, ergonomics of human movement, and thermo-physiological regulation.

Keywords: Ambient Intelligence, Proximity Sensors, Shape Memory Materials, Sound sensing garments, Wearable Technology.
