Search results for: Student satisfaction data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8003

5273 Design and Performance Analysis of One Dimensional Zero Cross-Correlation Coding Technique for a Fixed Wavelength Hopping SAC-OCDMA

Authors: Satyasen Panda, Urmila Bhanja

Abstract:

This paper presents a spectral amplitude coded optical code-division multiple-access (SAC-OCDMA) code with a zero cross-correlation property, the New Zero Cross Correlation (NZCC) code, designed to minimize Multiple Access Interference (MAI); it is found to be more scalable than the other existing SAC-OCDMA codes. The NZCC code is constructed from an address segment and a data segment. In this work, the proposed NZCC code is implemented in an optical system using the OptiSystem software for the SAC-OCDMA scheme. The main contribution of the proposed NZCC code is its zero cross-correlation, which reduces both MAI and Phase Induced Intensity Noise (PIIN). The proposed code offers minimum cross-correlation, flexibility in selecting the code parameters, and support for a large number of users, combined with a high data rate and longer fiber length. Simulation results reveal that the optical code-division multiple-access system based on the proposed NZCC code accommodates the maximum number of simultaneous users with higher data-rate transmission, lower Bit Error Rates (BER) and longer transmission distance without signal quality degradation, compared to the existing SAC-OCDMA codes.

Keywords: Cross Correlation, Optical Code Division Multiple Access, Spectral Amplitude Coding Optical Code Division Multiple Access, Multiple Access Interference, Phase Induced Intensity Noise, New Zero Cross Correlation code.
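
A minimal sketch of the zero cross-correlation property is given below: each user is assigned a disjoint block of spectral chips, so the in-phase cross-correlation between any two code words is zero. This toy construction only illustrates the property and is not the authors' address/data-segment design.

```python
import numpy as np

def zcc_code(users, weight):
    """Build a trivial zero cross-correlation code matrix:
    each user occupies `weight` spectral chips that no other user uses."""
    length = users * weight                       # total code length
    code = np.zeros((users, length), dtype=int)
    for u in range(users):
        code[u, u * weight:(u + 1) * weight] = 1  # disjoint chip block per user
    return code

C = zcc_code(users=4, weight=3)
# In-phase cross-correlation between every pair of distinct users must be zero.
for i in range(len(C)):
    for j in range(len(C)):
        if i != j:
            assert np.dot(C[i], C[j]) == 0
print(C)
```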

5272 Collaborative Stylistic Group Project: A Drama Practical Analysis Application

Authors: Omnia F. Elkommos

Abstract:

In the course of teaching stylistics to undergraduate students of the Department of English Language and Literature, Faculty of Arts and Humanities, the linguistic toolkit of theories proves useful for a better understanding of the different literary genres: poetry, drama, and short stories. In the present paper, a model for teaching stylistics is compiled and suggested. It is a collaborative group-project technique for use in undergraduate classes of diverse specialisms (Literature, Linguistics and Translation tracks). Students are first introduced to the different linguistic tools and theories suitable for each literary genre. The second step is to apply these linguistic tools to texts. Students are required to watch video performances of the poems or play, for example, and to search the internet for interpretations of the texts by other authorities, using a template (prepared by the researcher) with guided questions that lead them through their analysis. Finally, a practical analysis is written up using the practical analysis essay template (also prepared by the researcher). In line with collaborative learning, all the steps include student-centered activities that address differentiation and take the three specialisms into account. In selecting the proper tools, in the actual application, and in the analysis discussion, students are given tasks that require their collaboration. They also work in small groups, and the groups collaborate in seminars and group discussions. At the end of the course/module, students present their work collaboratively and reflect and comment on their learning experience. The module/course uses a drama play that lends itself to the task: 'The Bond' by Amy Lowell and Robert Frost. The project results in an interpretation of its theme, characterization and plot. The linguistic tools are drawn from pragmatics and discourse analysis, among others.

Keywords: Applied linguistic theories, collaborative learning, cooperative principle, discourse analysis, drama analysis, group project, online acting performance, pragmatics, speech act theory, stylistics, technology enhanced learning.

5271 CFD of Oscillating Airfoil Pitch Cycle by using PISO Algorithm

Authors: Muhammad Amjad Sohail, Rizwan Ullah

Abstract:

This paper presents a CFD analysis of an oscillating airfoil during a pitch cycle. Unsteady subsonic flow is simulated for the pitching airfoil at a Mach number of 0.283 and a Reynolds number of 3.45 million. Turbulence effects are considered by using the k-ω SST turbulence model. A two-dimensional unsteady compressible Navier-Stokes code with a two-equation turbulence model and PISO pressure-velocity coupling is used. A pressure-based implicit solver with first-order implicit unsteady formulation is employed. The simulated pitch-cycle results are compared with the available experimental data and show good agreement. Aerodynamic characteristics during pitch cycles have been studied and validated.

Keywords: Angle of attack, centre of pressure, subsonic flow, pitching moment coefficient, turbulence model.

5270 Appreciating, Interpreting and Understanding Posters via Levels of Visual Literacy

Authors: Mona Masood, Zakiah Zain

Abstract:

This study was conducted in Malaysia to discover how meaning and appreciation were construed among 35 Form Five students. Panofsky's theory was employed to discover the levels of reasoning among students when various types of posters were displayed. The independent variables were posters that carried explicit and implicit meanings; the moderating variable was the students' visual literacy level, while the dependent variable was the implicit interpretation level. One-way ANOVA was applied for the data analysis. The data showed that, before students were exposed to Panofsky's theory, there were differences in thinking between boys, who did not think abstractly or implicitly, and girls. The study showed that students' visual literacy with posters depended on the use of visual texts and illustration. The paper further discusses how posters with text only tend to be too abstract, as opposed to posters combining visuals and text.

Keywords: explicit visual, implicit visual, visual interpretation, visual literacy
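
As a minimal illustration of the analysis reported above, the sketch below runs a one-way ANOVA on hypothetical interpretation scores for three poster types; the group names and values are invented for illustration and are not the study's data.

```python
from scipy import stats

# Hypothetical implicit-interpretation scores for three poster conditions.
text_only        = [2, 3, 2, 4, 3, 2, 3]
visual_only      = [4, 5, 4, 3, 5, 4, 4]
visual_plus_text = [5, 6, 5, 6, 4, 5, 6]

# One-way ANOVA: is at least one group mean different?
f_stat, p_value = stats.f_oneway(text_only, visual_only, visual_plus_text)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: poster type affects implicit interpretation scores.")
```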

5269 Enhanced Disk-Based Databases Towards Improved Hybrid In-Memory Systems

Authors: Samuel Kaspi, Sitalakshmi Venkatraman

Abstract:

In-memory database systems are becoming popular due to the availability and affordability of sufficiently large RAM and processors in modern high-end servers with the capacity to manage large in-memory database transactions. While fast and reliable in-memory systems are still being developed to overcome cache misses, CPU/IO bottlenecks and distributed transaction costs, disk-based data stores still serve as the primary persistence. In addition, with the recent growth in multi-tenancy cloud applications and associated security concerns, many organisations consider the trade-offs and continue to require the fast and reliable transaction processing of disk-based database systems as an available choice. For these organisations, the only way of increasing throughput is by improving the performance of disk-based concurrency control. This warrants a hybrid database system with the ability to selectively apply enhanced disk-based data management within the context of in-memory systems, which would help improve overall throughput. The general view is that in-memory systems substantially outperform disk-based systems. We question this assumption and examine how a modified variation of access invariance, which we call enhanced memory access (EMA), can be used to allow very high levels of concurrency in the pre-fetching of data in disk-based systems. We demonstrate how this prefetching in disk-based systems can yield close to in-memory performance, which paves the way for improved hybrid database systems. This paper proposes a novel EMA technique and presents a comparative study between disk-based EMA systems and in-memory systems running on hardware configurations of equivalent power in terms of the number of processors and their speeds. The results of the experiments conducted clearly substantiate that, when used in conjunction with all concurrency control mechanisms, EMA can increase the throughput of disk-based systems to levels quite close to those achieved by in-memory systems. The promising results of this work show that enhanced disk-based systems facilitate improved hybrid data management within the broader context of in-memory systems.

Keywords: Concurrency control, disk-based databases, in-memory systems, enhanced memory access (EMA).

5268 Performance Analysis of Wireless Ad-Hoc Network Based on EDCA IEEE802.11e

Authors: Shah Ahsanuzzaman Md. Tariq, Fabrizio Granelli

Abstract:

IEEE 802.11e is the enhanced version of the IEEE 802.11 MAC dedicated to providing Quality of Service (QoS) in wireless networks. It supports QoS through service differentiation and prioritization mechanisms: data traffic receives different priority based on its QoS requirements. Fundamentally, applications are divided into four Access Categories (AC). Each AC has its own buffer queue and behaves as an independent backoff entity, and every frame with a specific traffic priority is assigned to one of these access categories. IEEE 802.11e EDCA (Enhanced Distributed Channel Access) is designed to enhance the IEEE 802.11 DCF (Distributed Coordination Function) mechanisms by providing a distributed access method that can support service differentiation among different classes of traffic. The performance of the IEEE 802.11e MAC layer with different ACs is evaluated to understand the actual benefits deriving from the MAC enhancements.

Keywords: 802.11e, fairness, enhanced distributed channel access, access categories, quality of service.
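
To make the differentiation mechanism concrete, the sketch below draws contention parameters and backoff slots for the four access categories. The AIFSN/CW values are the commonly cited EDCA defaults (for aCWmin = 15, aCWmax = 1023) and should be treated as illustrative rather than as the configuration used in the paper.

```python
import random

# Typical default EDCA parameters per Access Category (illustrative values).
EDCA = {
    "AC_BK": {"AIFSN": 7, "CWmin": 15, "CWmax": 1023},  # background
    "AC_BE": {"AIFSN": 3, "CWmin": 15, "CWmax": 1023},  # best effort
    "AC_VI": {"AIFSN": 2, "CWmin": 7,  "CWmax": 15},    # video
    "AC_VO": {"AIFSN": 2, "CWmin": 3,  "CWmax": 7},     # voice
}

def backoff_slots(ac, retries=0):
    """Draw a random backoff for one access category (higher priority -> smaller CW)."""
    p = EDCA[ac]
    cw = min((p["CWmin"] + 1) * 2 ** retries - 1, p["CWmax"])
    return p["AIFSN"] + random.randint(0, cw)

for ac in EDCA:
    print(ac, [backoff_slots(ac) for _ in range(5)])
```

Higher-priority categories wait fewer slots on average, which is the source of the service differentiation evaluated in the paper.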

5267 Dynamic Clustering using Particle Swarm Optimization with Application in Unsupervised Image Classification

Authors: Mahamed G.H. Omran, Andries P Engelbrecht, Ayed Salman

Abstract:

A new dynamic clustering approach (DCPSO), based on Particle Swarm Optimization, is proposed and applied to unsupervised image classification. The proposed approach automatically determines the "optimum" number of clusters and simultaneously clusters the data set with minimal user interference. The algorithm starts by partitioning the data set into a relatively large number of clusters to reduce the effects of initial conditions. Using binary particle swarm optimization, the "best" number of clusters is selected. The centers of the chosen clusters are then refined via the K-means clustering algorithm. The experiments conducted show that the proposed approach generally finds the "optimum" number of clusters on the tested images.

Keywords: Clustering Validation, Particle Swarm Optimization, Unsupervised Clustering, Unsupervised Image Classification.
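
A compact sketch of the DCPSO idea is given below, under the assumption that the fitness of a binary particle (a mask over a pool of candidate cluster centres) can be measured with the silhouette index; the original work uses its own validity measure and image data, so this is an illustration of the mechanism rather than a reimplementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3)).astype(float)     # stand-in for image pixel features
X[:100] += 4; X[100:200] -= 4                   # three loose groups

POOL = 10                                       # initial "large" number of candidate clusters
pool_centers = KMeans(n_clusters=POOL, n_init=5, random_state=0).fit(X).cluster_centers_

def fitness(mask):
    """Validity of the clustering induced by the selected candidate centres."""
    if mask.sum() < 2:
        return -1.0
    labels = np.argmin(np.linalg.norm(X[:, None] - pool_centers[mask == 1], axis=2), axis=1)
    if len(np.unique(labels)) < 2:
        return -1.0
    return silhouette_score(X, labels)

# Binary PSO over which candidate centres to keep.
P, ITER = 12, 30
pos = rng.integers(0, 2, size=(P, POOL))
vel = rng.normal(size=(P, POOL))
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmax()].copy()
for _ in range(ITER):
    r1, r2 = rng.random((P, POOL)), rng.random((P, POOL))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = (rng.random((P, POOL)) < 1 / (1 + np.exp(-vel))).astype(int)  # sigmoid -> binary
    f = np.array([fitness(p) for p in pos])
    better = f > pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmax()].copy()

# Refine the selected centres with K-means, as in the final DCPSO step.
k_opt = int(gbest.sum())
refined = KMeans(n_clusters=k_opt, init=pool_centers[gbest == 1], n_init=1).fit(X)
print("selected number of clusters:", k_opt)
```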

5266 Entropy based Expeditive Methodology for Rating Curves Assessment

Authors: D. Mirauda, M. Greco, P. Moscarelli

Abstract:

River flow forecasting is crucial for improving management policies aimed at the proper use of water resources and for combining prevention and defense actions against environmental degradation. The difficulties encountered during field activities encourage the development and implementation of operational computation and measurement methods that reduce the time needed for data acquisition and processing while maintaining a good level of accuracy. The aim of the present work is therefore to test a new entropy-based expeditive methodology for the evaluation of rating curves on three gauged sections with different geometric and morphological characteristics. The methodology requires the choice of only three verticals along the measurement section and the sampling of only the maximum velocity. The results underline that, in most conditions, the rating curves obtained can replace those built with classic methodologies, thus simplifying the procedures of data monitoring and calculation.

Keywords: gauged station, entropic approach, expeditive methodology, rating curves.
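
Entropy-based discharge estimation typically relies on Chiu's relation between the cross-sectional mean velocity and the sampled maximum velocity, u_mean = Φ(M)·u_max with Φ(M) = e^M/(e^M − 1) − 1/M. The sketch below applies this relation with an assumed entropy parameter M and an illustrative stage-area table; the numerical values are hypothetical and are not the paper's data.

```python
import numpy as np

def phi(M):
    """Chiu's entropy ratio between mean and maximum velocity."""
    return np.exp(M) / (np.exp(M) - 1.0) - 1.0 / M

def discharge(u_max, area, M=2.1):
    """Expeditive discharge estimate from the sampled maximum velocity [m/s]
    and the flow area [m^2]; M is the entropy parameter of the section (assumed)."""
    return phi(M) * u_max * area

# Hypothetical stage (m), flow area (m^2) and sampled maximum velocity (m/s).
stages  = np.array([0.8, 1.2, 1.6, 2.0])
areas   = np.array([12.0, 20.0, 29.0, 40.0])
u_maxes = np.array([0.9, 1.3, 1.7, 2.0])

Q = discharge(u_maxes, areas)
for h, q in zip(stages, Q):
    print(f"stage {h:.1f} m -> Q ~ {q:.1f} m^3/s")   # points of the rating curve
```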

5265 Remote Monitoring and Control System of Potentiostat Based on the Internet of Things

Authors: Liang Zhao, Guangwen Wang, Guichang Liu

Abstract:

The constant potentiostat is an important component of pipeline anti-corrosion (cathodic protection) systems in the chemical industry. Based on Internet of Things (IoT) technology, Programmable Logic Controller (PLC) technology and database technology, this paper presents a remote monitoring and management system for constant potentiostats. Remote monitoring and remote adjustment of the working status of the constant potentiostat are realized. The system offers real-time data display, historical data query, alarm push management and user permission management, and supports both Web access and mobile client application (APP) access. Test results from an actual engineering project show the stability of the system, which can be widely used in cathodic protection systems.

Keywords: Internet of Things, pipe corrosion protection, potentiostat, remote monitoring.
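
The paper describes a PLC-plus-database architecture; as a loosely related illustration of how a potentiostat reading could be pushed to a remote monitoring service, the sketch below posts a JSON sample to an HTTP endpoint. The endpoint, field names and values are assumptions for illustration only and do not reflect the authors' implementation.

```python
import time
import requests

ENDPOINT = "https://monitor.example.com/api/potentiostat/01"   # hypothetical server

def read_potentiostat():
    """Placeholder for the PLC/instrument read-out (values are invented)."""
    return {"potential_V": -0.85, "current_A": 1.72, "status": "OK"}

for _ in range(3):                 # a few samples for the demo; a real client runs continuously
    sample = read_potentiostat()
    sample["timestamp"] = time.time()
    try:
        requests.post(ENDPOINT, json=sample, timeout=5)   # push reading to the server
    except requests.RequestException as exc:
        print("upload failed, will retry:", exc)          # e.g. queue locally, raise alarm
    time.sleep(10)                                        # sampling interval
```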

5264 Injury Prevention among Construction Workers: A Case Study on Iranian Steel Bar Bending Workers

Authors: S. Behnam Asl, H. Sadeghi Naeini, L. Sadat Ensaniat, R. Khorshidian, S. Alipour, S. Behnam Asl

Abstract:

Nowadays the construction industry is growing, especially in developing countries, and Iran also plays a considerable role in this industry, with a corresponding burden of worker disorders. Work-related musculoskeletal disorders (WMSDs) account for 7% of all diseases in society and impose considerable limitations. One of the main factors leading to WMSDs is awkward posture. Steel bar bending is one of the most common tasks among construction workers. This case study was conducted to identify the major tasks of bar benders and the most important related risk factors. The study was carried out among twenty volunteer workers (18-45 years) at construction sites with fewer than six floors in two regions of the Tehran municipality. The data were gathered through in-depth observation, interviews and a questionnaire, and postural analysis was done with the OWAS method. In another part of the study, the Nordic Musculoskeletal Questionnaire (NMQ) was used to gather data on the psychosocial effects of work-related disorders. Our findings show that 64% of workers were not aware of work risks, and about 59% of workers had trouble in their wrists and hands, especially those engaged in steel bar bending. Low back pain was prevalent in 46% of cases. Based on the gathered data and results, awkward postures and the duration of long-term tasks are the main risk factors for WMSDs among construction workers, so work-rest schedules and tool design should be considered to create ergonomic conditions for these workers.

Keywords: Bar benders, construction workers, musculoskeletal disorders (WMSDs), OWAS method.

5263 Risk Factors’ Analysis on Shanghai Carbon Trading

Authors: Zhaojun Wang, Zongdi Sun, Zhiyuan Liu

Abstract:

First, the carbon trading price and trading volume in Shanghai are transformed with the Fourier transform, and the frequency response diagram is obtained. The frequency response diagram is then analyzed and a Blackman filter is designed. The Blackman filter is used for filtering, and the time-domain and frequency-response diagrams of the carbon trading series are obtained. After wavelet analysis, the carbon trading data are processed to obtain average values over 5, 10, 20, 30 and 60 days, respectively. Finally, these data are used as input to a Back Propagation Neural Network model for prediction.

Keywords: Shanghai carbon trading, carbon trading price, carbon trading volume, wavelet analysis, BP neural network model.
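
A minimal sketch of this preprocessing chain is shown below: an FFT of a synthetic price series, a Blackman-windowed low-pass filter, rolling averages over the listed windows, and their use as features for a small neural network. The synthetic data, window length and network settings are assumptions for illustration, not the Shanghai market data or the paper's model configuration.

```python
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
price = pd.Series(40 + np.cumsum(rng.normal(0, 0.5, 600)))    # synthetic daily carbon price

# 1) Frequency response via FFT (used to inspect dominant cycles / choose a cut-off).
spectrum = np.abs(np.fft.rfft(price - price.mean()))
freqs = np.fft.rfftfreq(len(price), d=1.0)
print("dominant period ~", round(1 / freqs[spectrum[1:].argmax() + 1], 1), "days")

# 2) Blackman-windowed low-pass filter (window length is an assumption).
win = np.blackman(31)
smoothed = np.convolve(price, win / win.sum(), mode="same")

# 3) Rolling averages over 5, 10, 20, 30 and 60 days as features.
features = pd.concat(
    {f"ma{w}": pd.Series(smoothed).rolling(w).mean() for w in (5, 10, 20, 30, 60)},
    axis=1,
).dropna()
target = price.shift(-1).loc[features.index].dropna()          # next-day price
features = features.loc[target.index]

# 4) Back-propagation neural network for prediction.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(features.values[:-50], target.values[:-50])
print("test R^2:", model.score(features.values[-50:], target.values[-50:]))
```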

5262 Renewable Energy System Eolic-Photovoltaic for the Touristic Center La Tranca-Chordeleg in Ecuador

Authors: Christian Castro Samaniego, Daniel Icaza Alvarez, Juan Portoviejo Brito

Abstract:

In this research work, hybrid wind-photovoltaic systems (SHEF) were considered as renewable energy sources that use wind energy and solar radiation to generate electrical energy. The feasibility of a wind-photovoltaic hybrid generation system was analyzed for the La Tranca tourist viewpoint of the Chordeleg canton in Ecuador. The research process consisted of the collection of data on solar radiation, temperature and wind speed, among others, by means of a meteorological station. Simulations were carried out in MATLAB/Simulink based on a mathematical model. Finally, the theoretical radiation-power curves were compared with the measurements made at the site.

Keywords: Hybrid system, wind turbine, modeling, simulation, validation, experimental data, panel, Ecuador.
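
The mathematical model itself is not given in the abstract; the sketch below shows the standard first-order expressions often used in such feasibility studies, a wind-turbine power term P = ½ρAC_p v³ capped at rated power, and a PV output proportional to irradiance with a temperature derating. All parameter values are assumptions for illustration.

```python
import numpy as np

RHO = 1.225            # air density [kg/m^3]
ROTOR_AREA = 12.6      # swept area of a small turbine [m^2] (assumed)
CP = 0.40              # power coefficient (assumed)
P_RATED_WIND = 3000.0  # rated turbine power [W] (assumed)

PV_AREA = 20.0         # total panel area [m^2] (assumed)
PV_EFF = 0.17          # panel efficiency at STC (assumed)
TEMP_COEFF = -0.004    # power derating per deg C above 25 C (assumed)

def wind_power(v):
    """Turbine output [W] from wind speed v [m/s], capped at rated power."""
    return np.minimum(0.5 * RHO * ROTOR_AREA * CP * v**3, P_RATED_WIND)

def pv_power(g, t_cell):
    """PV output [W] from irradiance g [W/m^2] and cell temperature [deg C]."""
    return PV_AREA * PV_EFF * g * (1 + TEMP_COEFF * (t_cell - 25.0))

# Example hourly measurements (hypothetical weather-station values).
wind_speed = np.array([3.0, 5.5, 7.0, 9.5])
irradiance = np.array([200.0, 600.0, 850.0, 400.0])
cell_temp  = np.array([18.0, 30.0, 38.0, 27.0])

total = wind_power(wind_speed) + pv_power(irradiance, cell_temp)
print("hybrid output [W]:", np.round(total, 1))
```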

5261 Wireless Sensor Networks for Swiftlet Farms Monitoring

Authors: Al-Khalid Othman, Wan A. Wan Zainal Abidin, Kee M. Lee, Hushairi Zen, Tengku. M. A. Zulcaffle, Kuryati Kipli

Abstract:

This paper provides an in-depth study of a Wireless Sensor Network (WSN) application to monitor and control swiftlet habitats. A complete system is designed and developed, comprising the hardware design of the nodes, Graphical User Interface (GUI) software, the sensor network, and interconnectivity for remote data access and management. A system architecture is proposed to address the requirements for habitat monitoring. Such an application-driven design also identifies important areas of further work in data sampling, communications and networking. For this monitoring system, MTS400 sensor nodes, IRIS and MicaZ radio transceivers, and a USB-interfaced gateway base station from Crossbow (Xbow) Technology are employed. The GUI of this monitoring system is written in Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) along with the Xbow Technology drivers provided by National Instruments. As a result, this monitoring system is capable of collecting data and presenting it in both tables and waveform charts for further analysis. The system is also able to send notification messages by email, provided Internet connectivity is available, whenever changes in the habitat occur at remote sites (swiftlet farms). Other functions implemented in this system are a database for record keeping and management purposes and remote access through the internet using LogMeIn software. Finally, this research concludes that a WSN for monitoring swiftlet habitat can be effectively used to monitor and manage the swiftlet farming industry in Sarawak.

Keywords: Swiftlet, WSN, Habitat Monitoring, Networking.

5260 Selecting Materialized Views Using Two-Phase Optimization with Multiple View Processing Plan

Authors: Jiratta Phuboon-ob, Raweewan Auepanwiriyakul

Abstract:

A data warehouse (DW) is a system whose value lies in supporting decision-making through querying. Queries to a DW are critical with regard to their complexity and length: they often access millions of tuples and involve joins between relations and aggregations. Materialized views can provide better performance for DW queries. However, these views have a maintenance cost, so materializing all views is not possible. An important challenge of the DW environment is materialized view selection, because of the trade-off between query performance and view maintenance cost. Therefore, in this paper we introduce a new approach aimed at solving this challenge, based on Two-Phase Optimization (2PO), which is a combination of Simulated Annealing (SA) and Iterative Improvement (II), with the use of a Multiple View Processing Plan (MVPP). Our experiments show that our method provides a further improvement in terms of query processing cost and view maintenance cost.

Keywords: Data warehouse, materialized views, view selection problem, two-phase optimization.
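
The sketch below illustrates the optimization phase in a very reduced form: a simulated annealing search over a binary vector that says which candidate views to materialize, with a cost made up of query processing cost plus maintenance cost. The cost figures and candidate views are invented, and the iterative improvement phase of 2PO and the MVPP structure are omitted for brevity.

```python
import math
import random

random.seed(0)
N_VIEWS = 8
# Hypothetical per-view figures: savings on query processing if materialized,
# and the maintenance cost paid for keeping the view up to date.
query_saving = [random.uniform(20, 120) for _ in range(N_VIEWS)]
maintenance  = [random.uniform(10, 90) for _ in range(N_VIEWS)]
BASE_QUERY_COST = 600.0

def total_cost(selection):
    """Query processing cost (base minus savings) plus maintenance of selected views."""
    saved = sum(s for s, keep in zip(query_saving, selection) if keep)
    maint = sum(m for m, keep in zip(maintenance, selection) if keep)
    return max(BASE_QUERY_COST - saved, 0.0) + maint

# Simulated annealing over binary selections.
state = [random.random() < 0.5 for _ in range(N_VIEWS)]
best, best_cost = state[:], total_cost(state)
T = 100.0
while T > 0.1:
    neighbour = state[:]
    flip = random.randrange(N_VIEWS)
    neighbour[flip] = not neighbour[flip]          # toggle one view
    delta = total_cost(neighbour) - total_cost(state)
    if delta < 0 or random.random() < math.exp(-delta / T):
        state = neighbour
        if total_cost(state) < best_cost:
            best, best_cost = state[:], total_cost(state)
    T *= 0.95                                      # cooling schedule

print("materialize views:", [i for i, keep in enumerate(best) if keep])
print("total cost:", round(best_cost, 1))
```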

5259 Paradigm and Paradox: Knowledge Management and Business Ethics

Authors: A. Evans, M. McKinley

Abstract:

Knowledge management (KM) is generally considered to be a positive process in an organisation, facilitating opportunities to achieve competitive advantage via better quality information handling, compilation of expert know-how and rapid response to fluctuations in the business environment. The KM paradigm as portrayed in the literature informs the processes that can increase intangible assets so that corporate knowledge is preserved. However, in some instances, knowledge management exists in a universe of dynamic tension among the conflicting needs to respect privacy and intellectual property (IP), to guard against data theft, to protect national security and to stay within the law. While the knowledge management literature focuses on the bright side of the paradigm, there is also a different side in which knowledge is distorted, suppressed or misappropriated due to personal or organisational motives (the paradox). This paper describes the ethical paradoxes that occur within the taxonomy and deontology of knowledge management and suggests that recognising both the promises and pitfalls of KM requires wisdom.

Keywords: business ethics, data, knowledge, knowledge management, privacy, protection.

5258 Mining and Visual Management of XML-Based Image Collections

Authors: Khalil Shihab, Nida Al-Chalabi

Abstract:

This article describes Uruk, the virtual museum of Iraq that we developed for visual exploration and retrieval of image collections. The system largely exploits the loosely-structured hierarchy of XML documents, which provides a useful representation for storing semi-structured or unstructured data that does not easily fit into existing databases. The system offers users the capability to mine and manage XML-based image collections through a web-based Graphical User Interface (GUI). Typically, in an interactive session with the system, the user can browse a visual structural summary of the XML database in order to select interesting elements. Using this intermediate result, queries combining structure and textual references can be composed and presented to the system. After query evaluation, the full set of answers is presented in a visual and structured way.

Keywords: Data-centric XML, graphical user interfaces, information retrieval, case-based reasoning, fuzzy sets
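
As a small illustration of querying a loosely-structured XML image collection, the sketch below builds a toy catalogue and combines a structural condition (artifacts that carry a period element) with a textual reference, using Python's standard ElementTree API. The element and attribute names are hypothetical, not Uruk's schema.

```python
import xml.etree.ElementTree as ET

# A toy XML-based image collection (structure and field names are invented).
doc = ET.fromstring("""
<museum>
  <artifact id="U001">
    <title>Clay tablet with cuneiform script</title>
    <period>Uruk IV</period>
    <image file="u001.jpg" width="1024" height="768"/>
  </artifact>
  <artifact id="U002">
    <title>Cylinder seal</title>
    <image file="u002.jpg" width="800" height="600"/>
  </artifact>
</museum>
""")

# Structural + textual query: artifacts that have a <period> element
# and whose title mentions "tablet".
for artifact in doc.findall("artifact"):
    period = artifact.find("period")
    title = artifact.findtext("title", default="")
    if period is not None and "tablet" in title.lower():
        image = artifact.find("image")
        print(artifact.get("id"), "->", image.get("file"), period.text)
```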

5257 A Comparison of the Sum of Squares in Linear and Partial Linear Regression Models

Authors: Dursun Aydın

Abstract:

In this paper, the linear regression model is estimated by the ordinary least squares method and the partially linear regression model is estimated by the penalized least squares method using a smoothing spline. The differences and similarities in the sums of squares of the linear regression and partial linear regression (semi-parametric regression) models are then investigated. It is shown that the sums of squares in linear regression reduce to the corresponding sums of squares in the partial linear regression model. Furthermore, we indicate that the various sums of squares in linear regression are analogous to different deviance statements in partial linear regression. In addition, the coefficient of determination derived for the linear regression model is easily generalized to the coefficient of determination of the partial linear regression model. To this end, two different applications are presented: the claims are supported with a simulation and a real data example.

Keywords: Partial Linear Regression Model, Linear Regression Model, Residuals, Deviance, Smoothing Spline.
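
A minimal numerical companion to this comparison is sketched below: an ordinary least squares fit with its SST = SSR + SSE decomposition and R², next to a smoothing-spline fit whose residual sum of squares plays the analogous role for the nonparametric part of the partially linear model. The simulated data and smoothing parameter are illustrative only.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(42)
x = np.sort(rng.uniform(0, 10, 120))
y = 1.5 + 0.8 * x + np.sin(x) + rng.normal(0, 0.4, x.size)   # linear + smooth component

# Ordinary least squares fit of the purely linear model y = b0 + b1*x.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
sst = np.sum((y - y.mean()) ** 2)
ssr = np.sum((y_hat - y.mean()) ** 2)
sse = np.sum((y - y_hat) ** 2)
print("OLS:    SST = SSR + SSE ->", round(sst, 2), "=", round(ssr + sse, 2))
print("OLS:    R^2 =", round(ssr / sst, 3))

# Penalized (smoothing spline) fit, standing in for the nonparametric part
# of the partially linear model; its residual SS plays the role of a deviance.
spline = UnivariateSpline(x, y, s=len(x) * 0.2)
sse_spline = np.sum((y - spline(x)) ** 2)
print("Spline: residual SS =", round(sse_spline, 2))
print("Spline: generalized R^2 =", round(1 - sse_spline / sst, 3))
```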

5256 Life Cycle Datasets for the Ornamental Stone Sector

Authors: Isabella Bianco, Gian Andrea Blengini

Abstract:

The environmental impact related to ornamental stones (such as marbles and granites) is widely debated. Starting from the industrial revolution, continuous improvements of machinery led to a higher exploitation of this natural resource and to more international interaction between markets. As a consequence, the environmental impact of the extraction and processing of stones has increased. Nevertheless, compared with other building materials, ornamental stones are generally more durable, natural, and recyclable. From the scientific point of view, studies on stone life cycle sustainability have been carried out, but these are often partial or not very significant because of the high percentage of approximations and assumptions in the calculations. This is due to the lack, in life cycle databases (e.g. Ecoinvent, Thinkstep, and ELCD), of datasets on the specific technologies employed in the stone production chain. For example, databases do not contain information about diamond wires, chains or explosives, materials commonly used in quarries and transformation plants. The project presented in this paper aims to populate life cycle databases with data on specific stone processes. To this goal, the methodology follows the standardized approach of Life Cycle Assessment (LCA), according to the requirements of UNI 14040-14044 and to the International Reference Life Cycle Data System (ILCD) Handbook guidelines of the European Commission. The study analyses the processes of the entire production chain (from-cradle-to-gate system boundaries), including the extraction of benches, the cutting of blocks into slabs/tiles and the surface finishing. Primary data have been collected in Italian quarries and transformation plants which use technologies representative of the current state of the art. Since the technologies vary according to the hardness of the stone, the case studies comprise both soft stones (marbles) and hard stones (gneiss). In particular, data about energy, materials and emissions were collected in the marble basins of Carrara and in the Beola and Serizzo basins located in the province of Verbano Cusio Ossola. Data were then elaborated through appropriate software to build a life cycle model. The model was realized setting free parameters that allow an easy adaptation to specific productions. Through this model, the study aims to boost the direct participation of stone companies and encourage the use of the LCA tool to assess and improve the environmental sustainability of the stone sector. At the same time, the realization of accurate Life Cycle Inventory data aims at making ILCD-compliant datasets of the most significant processes and technologies related to the ornamental stone sector available to researchers and stone experts.

Keywords: LCA datasets, life cycle assessment, ornamental stone, stone environmental impact.

5255 Detection of Action Potentials in the Presence of Noise Using Phase-Space Techniques

Authors: Christopher Paterson, Richard Curry, Alan Purvis, Simon Johnson

Abstract:

Emerging bio-engineering fields such as brain-computer interfaces, neuroprosthesis devices, and the modeling and simulation of neural networks have led to increased research activity in algorithms for the detection, isolation and classification of Action Potentials (AP) from noisy data trains. Current techniques in the field of 'unsupervised, no-prior-knowledge' biosignal processing include energy operators, wavelet detection and adaptive thresholding. These tend to bias towards larger AP waveforms, APs may be missed due to deviations in spike shape and frequency, and correlated noise spectra can cause false detections. Such algorithms also tend to suffer from large computational expense. A new signal detection technique based upon the ideas of phase-space diagrams and trajectories is proposed, which uses a delayed copy of the AP to highlight discontinuities relative to the background noise. This idea has been used to create algorithms that are computationally inexpensive and address the above problems. Distinct APs have been picked out and manually classified from real physiological data recorded from a cockroach. To facilitate testing of the new technique, an Auto-Regressive Moving Average (ARMA) noise model has been constructed based upon the background noise of the recordings. Together with the AP classification, this model enables the generation of realistic neuronal data sets at arbitrary signal-to-noise ratios (SNR).

Keywords: Action potential detection, Low SNR, Phase space diagrams/trajectories, Unsupervised/no-prior knowledge.
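
A simplified version of the delayed-copy idea is sketched below: the signal is embedded against a delayed copy of itself, and samples whose phase-space point lies far from the background-noise cloud are flagged as candidate action potentials. The synthetic signal, delay and threshold are assumptions for illustration, not the paper's algorithm parameters.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000
signal = rng.normal(0, 0.2, n)                     # background noise
for start in (400, 1100, 1650):                    # insert three crude spikes
    signal[start:start + 20] += np.hanning(20) * 2.0

delay = 5                                          # samples (assumed embedding delay)
x = signal[delay:]                                 # current sample
x_del = signal[:-delay]                            # delayed copy

# Distance of each phase-space point (x, x_delayed) from the origin;
# background noise forms a small cloud, spikes trace large excursions.
radius = np.sqrt(x**2 + x_del**2)
threshold = 5 * np.median(radius)                  # robust, noise-based threshold
detections = np.flatnonzero(radius > threshold) + delay

# Collapse consecutive indices into spike events.
events = [d for i, d in enumerate(detections) if i == 0 or d - detections[i - 1] > 30]
print("detected spike onsets near:", events)
```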

5254 Understanding Factors Influencing E-Government Implementation in Saudi Arabia from an Organizational Perspective

Authors: M. Alassim, M. Alfayad, E. Abbott-Halpin

Abstract:

The purpose of this paper is to explore the organizational factors influencing the implementation of the e-government project within the public sector in Saudi Arabia. This project (also known as the Yesser programme) was established in Saudi Arabia in 2005 to control the e-government transformation process. The aims of the project are to provide a collaborative environment for government organizations to implement e-government and increase effectiveness and efficiency within the public sector. This paper sheds light on the organizational factors that have delayed implementation and achievement of the government’s vision and plans for Yesser. A qualitative approach was employed to understand those factors, by conducting a series of interviews with government officials for the data collection required. The analysis of the data uncovered seven organizational factors that are needed to advance implementation of the e-government project in Saudi Arabia and other similar states.

Keywords: E-government, e-transformation, ICT, Saudi Arabia, Yesser.

5253 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools

Authors: M. Kaya, M. Eris

Abstract:

Internet use, intelligent communication tools, and social media have all become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use increases crimes committed in the digital environment. Therefore, digital forensics, which deals with the various crimes committed in digital environments, has become an important research topic. It is within the scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc., and to report whether it contains any crime-related elements. Many software and hardware tools have been developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all the data present in the digital evidence that match specified criteria and presenting it to the investigator (e.g. text files, files starting with the letter A, etc.). Digital forensics experts then carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner; moreover, because the outcome depends on the examiner's experience, relevant elements in different cases may be overlooked. In this study, a hash-based matching and digital evidence evaluation method is proposed, which aims to automatically classify evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human errors.

Keywords: Block matching, digital evidence, hash list.
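
A minimal sketch of hash-based block matching is given below: an evidence image is read in fixed-size blocks, each block is hashed, and the hashes are compared against a list of known-relevant block hashes. The block size, file name and hash list are assumptions for illustration; forensic tools typically work on evidence container formats rather than raw files.

```python
import hashlib

BLOCK_SIZE = 4096  # bytes per block (an assumption; tools often use sector-aligned sizes)

def block_hashes(path, block_size=BLOCK_SIZE):
    """Yield (offset, sha256-hex) for each fixed-size block of an evidence image."""
    with open(path, "rb") as f:
        offset = 0
        while True:
            block = f.read(block_size)
            if not block:
                break
            yield offset, hashlib.sha256(block).hexdigest()
            offset += len(block)

def match_known_blocks(image_path, known_hashes):
    """Return offsets in the image whose block hash appears in the known-hash list."""
    return [off for off, h in block_hashes(image_path) if h in known_hashes]

# Demo: build a small fake "evidence image" containing one known block.
known_block = b"\x42" * BLOCK_SIZE
with open("evidence_demo.dd", "wb") as f:
    f.write(b"\x00" * BLOCK_SIZE + known_block + b"\xff" * BLOCK_SIZE)

known_hashes = {hashlib.sha256(known_block).hexdigest()}   # the "hash list"
print("matches at offsets:", match_known_blocks("evidence_demo.dd", known_hashes))
```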

5252 Removal of Malachite Green from Aqueous Solution using Hydrilla verticillata -Optimization, Equilibrium and Kinetic Studies

Authors: R. Rajeshkannan, M. Rajasimman, N. Rajamohan

Abstract:

In this study, the sorption of Malachite Green (MG) on Hydrilla verticillata biomass, a submerged aquatic plant, was investigated in a batch system. The effects of operating parameters such as temperature, adsorbent dosage, contact time, adsorbent size, and agitation speed on the sorption of Malachite Green were analyzed using response surface methodology (RSM). According to the ANOVA results, the proposed quadratic model for the central composite design (CCD) fitted the experimental data well enough to be used to navigate the design space. The optimum sorption conditions were determined as temperature 43.5 °C, adsorbent dosage 0.26 g, contact time 200 min, adsorbent size 0.205 mm (65 mesh), and agitation speed 230 rpm. The Langmuir and Freundlich isotherm models were applied to the equilibrium data. The maximum monolayer coverage capacity of Hydrilla verticillata biomass for MG was found to be 91.97 mg/g at an initial pH of 8.0, indicating that this is the optimum initial pH for sorption. The external and intra-particle diffusion models were also applied to the sorption data, and it was found that both external diffusion and intra-particle diffusion contribute to the actual sorption process. The pseudo-second-order kinetic model described the MG sorption process with a good fit.

Keywords: Response surface methodology, Hydrilla verticillata, malachite green, adsorption, central composite design
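
For readers who want to reproduce the isotherm step, the sketch below fits the Langmuir and Freundlich models to equilibrium data with non-linear least squares. The equilibrium concentrations and uptakes in the example are hypothetical, not the measurements reported in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qm, kl):
    """Langmuir isotherm: qe = qm*KL*Ce / (1 + KL*Ce)."""
    return qm * kl * ce / (1.0 + kl * ce)

def freundlich(ce, kf, n):
    """Freundlich isotherm: qe = KF * Ce^(1/n)."""
    return kf * ce ** (1.0 / n)

# Hypothetical equilibrium data: dye concentration Ce (mg/L) and uptake qe (mg/g).
ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 150.0])
qe = np.array([22.0, 38.0, 55.0, 72.0, 84.0, 90.0])

(qm, kl), _ = curve_fit(langmuir, ce, qe, p0=[90.0, 0.05])
(kf, n), _ = curve_fit(freundlich, ce, qe, p0=[10.0, 2.0])

print(f"Langmuir:   qm = {qm:.1f} mg/g, KL = {kl:.3f} L/mg")
print(f"Freundlich: KF = {kf:.1f}, n = {n:.2f}")
```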

5251 Estimation of Crustal Thickness within the Sokoto Basin North-Western Nigeria Using Bouguer Gravity Anomaly Data

Authors: T. T. Olugbenga, A. I. Augie

Abstract:

This research proposes an interpretation of Bouguer gravity anomaly data over parts of the Sokoto basin for the estimation of crustal thickness. The study area is bounded between latitudes 11°00′0″N and 13°00′0″N and longitudes 4°00′0″E and 6°00′0″E, covering Koko, Jega, B/Kebbi, Argungu, Lema, Bodinga, Tamgaza, Gunmi, Daki Takwas, Dange, Sokoto, Ilella, T/Mafara, Anka, Maru, Gusau, K/Namoda, and Sabon Birni within Sokoto, Kebbi and Zamfara states, respectively. The established map of the study area was digitized in X, Y and Z format using the Excel software package, and the digitized data were processed using Surfer version 13 software. The Moho and Conrad depths, i.e. the crustal thicknesses determined from their relationship with the Bouguer gravity anomaly, were estimated as 35 to 37 km and 19 to 21 km, respectively. The crustal region has been categorized into a crustal thinning zone, the region with high gravity anomaly values due to its greater geothermal energy, and a crustal thickening zone, the region with low anomaly values due to its lower geothermal energy. Birnin Kebbi, Jega and Sokoto were identified as regions of hydrocarbon potential, with an estimated crustal thickness of 35 km; this crustal thickening results in low but sufficient geothermal energy to decompose organic matter within the region to form hydrocarbons.

Keywords: Bouguer gravity anomaly, crustal thickness, geothermal energy, hydrocarbons, Moho and Conrad Depths.

5250 Energy Efficient Transmission of Image over DWT-OFDM System

Authors: Lakshmi Pujitha Dachuri, Nalini Uppala

Abstract:

In many applications, retransmission of lost packets is not permitted. OFDM is a multi-carrier modulation scheme with excellent performance that allows overlapping in the frequency domain, and with OFDM multipath effects can be handled by relatively simple DSP algorithms.

In this paper, an image frame is compressed using the DWT, and the compressed data are arranged in data vectors, each with an equal number of coefficients. These vectors are quantized and binary coded to get the bit streams, which are then packetized and intelligently mapped to the OFDM system. Based on one-bit channel state information at the transmitter, the descriptions, in order of descending priority, are assigned to the currently good channels, so that poorer sub-channels can only affect the less important data vectors. We consider only one-bit channel state information available at the transmitter, indicating only whether each sub-channel is good or bad: for a good sub-channel, the instantaneous received power should be greater than a threshold Pth; otherwise, the sub-channel is in a fading state and considered bad for that batch of coefficients. In order to reduce the system power consumption, the descriptions mapped onto the bad sub-channels are dropped at the transmitter. The binary channel state information thus gives an opportunity to map the bit streams intelligently and to save a reasonable amount of power. Using MATLAB simulation, we analyze the performance of the proposed scheme in terms of system energy saving without compromising the received quality in terms of peak signal-to-noise ratio.

Keywords: Binary channel state, Channel state feedback, DWT-OFDM system, Energy saving, Fading broadcast channel.
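
A highly simplified sketch of the mapping step is shown below: a 2-D DWT compresses an image, the coefficients are grouped into equal-length vectors in order of importance, and vectors are assigned only to sub-channels whose one-bit channel state is "good" (received power above a threshold). The wavelet, threshold and channel model are assumptions for illustration; PyWavelets (pywt) provides the wavedec2 transform used here.

```python
import numpy as np
import pywt

rng = np.random.default_rng(3)
image = rng.integers(0, 256, size=(64, 64)).astype(float)   # stand-in for an image frame

# 2-D DWT compression: keep the coarse approximation and the coarsest detail level.
coeffs = pywt.wavedec2(image, wavelet="haar", level=3)
flat = np.concatenate([coeffs[0].ravel()] + [d.ravel() for d in coeffs[1]])

# Split coefficients into equal-length descriptions, most important first.
VEC_LEN = 64
n_vec = len(flat) // VEC_LEN
descriptions = flat[:n_vec * VEC_LEN].reshape(n_vec, VEC_LEN)

# One-bit channel state per sub-channel: good if received power exceeds a threshold.
N_SUBCH = n_vec
power = rng.exponential(scale=1.0, size=N_SUBCH)   # Rayleigh-fading power (illustrative)
P_TH = 0.5
good = np.flatnonzero(power > P_TH)

# Map descriptions in priority order onto good sub-channels; drop the rest
# at the transmitter to save energy.
transmitted = {int(ch): descriptions[i] for i, ch in enumerate(good) if i < n_vec}
dropped = n_vec - len(transmitted)
print(f"{len(transmitted)} descriptions sent on good sub-channels, {dropped} dropped")
```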

5249 Agro-Morphological Characterization of Vicia faba L. Accessions in the Kingdom of Saudi Arabia

Authors: Zia Amjad, Salem S. Alghamdi

Abstract:

The study was conducted at the student educational farm of the College of Food and Agriculture in the Kingdom of Saudi Arabia. The aim of the study was to characterize 154 Vicia faba L. accessions using agro-morphological traits based on the International Union for the Protection of New Varieties of Plants (UPOV) and International Board for Plant Genetic Resources (IBPGR) descriptors. This research is significant as it contributes to the understanding of the genetic diversity and potential yield of V. faba in Saudi Arabia. In the study, 24 agro-morphological characters, 11 quantitative and 13 qualitative, were observed for genetic variation. The results were analyzed using multivariate analysis, i.e. principal component analysis (PCA). The first six principal components (PCs) had eigenvalues greater than one and accounted for 72% of the available V. faba genetic diversity; the first three components each explained more than 10% of the genetic diversity, namely 22.36%, 15.86% and 10.89%, respectively. PCA distributed the V. faba accessions into different groups based on their performance for the characters under observation. PC-1, which represented 22.36% of the genetic diversity, was positively associated with stipule spot pigmentation, intensity of streaks, pod degree of curvature and, to some extent, 100-seed weight. PC-2 covered 15.86% of the genetic diversity and showed positive associations with average seed weight per plant, pod length, number of seeds per plant, 100-seed weight, stipule spot pigmentation, intensity of streaks (as in PC-1) and, to some extent, pod degree of curvature and number of pods per plant. PC-3 revealed 10.89% of the genetic diversity and expressed positive associations with number of pods per plant and number of leaflets per plant. This study contributes to the understanding of the genetic diversity and potential yield of V. faba in the Kingdom of Saudi Arabia. By establishing a core collection of V. faba, the research provides a valuable resource for future conservation and utilization of this crop worldwide.

Keywords: Agro-morphological characterization, genetic diversity, core collection, PCA, Vicia faba L.
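
The multivariate step can be reproduced in a few lines; the sketch below standardizes a small trait matrix and reports explained variance and loadings with scikit-learn's PCA. The trait names and values are simulated placeholders, not the measurements of the 154 accessions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# Simulated data: 154 accessions x 11 quantitative traits (placeholder values).
traits = ["pod_length", "seeds_per_plant", "pods_per_plant", "seed_weight_100",
          "plant_height", "leaflets_per_plant", "days_to_flowering",
          "days_to_maturity", "branches_per_plant", "pod_curvature", "seed_yield"]
X = rng.normal(size=(154, len(traits)))

X_std = StandardScaler().fit_transform(X)      # PCA on standardized traits
pca = PCA()
scores = pca.fit_transform(X_std)

explained = pca.explained_variance_ratio_ * 100
print("variance explained by first 6 PCs (%):", np.round(explained[:6], 2))

# Loadings of the first component show which traits drive PC-1.
for trait, loading in zip(traits, pca.components_[0]):
    print(f"PC-1 loading for {trait}: {loading:+.2f}")
```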

5248 Investigation on Fluid Flow Characteristics of the Orifice in Nuclear Power Plant

Authors: Nam-Seok Kim, Sang-Kyu Lee, Byung-Soo Shin, O-Hyun Keum

Abstract:

This paper presents a methodology for investigating the flow characteristics near an orifice plate using a commercial computational fluid dynamics code. The flow near the orifice plate, which is located in the auxiliary feedwater system, was modelled with three different grid levels and four different types of Reynolds-Averaged Navier-Stokes (RANS) equations with proper near-wall treatment. The results from the CFD code were compared with experimental data in terms of the differential pressure through the orifice plate. In this preliminary study, the Realizable k-ε and Reynolds stress models with enhanced wall treatment were found suitable for analysing the flow characteristics near the orifice plate, and the results showed good agreement with the experimental data.

Keywords: Auxiliary Feedwater, Computational Fluid Dynamics, Orifice, Nuclear Power Plant
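
The comparison quantity, the differential pressure across an orifice, can also be estimated from the standard orifice equation Q = C_d·A_t·sqrt(2ΔP / (ρ(1 − β⁴))). The sketch below inverts this relation for ΔP; the pipe dimensions, discharge coefficient and flow rate are assumed values, not the plant data used in the paper.

```python
import math

def orifice_dp(q, d_pipe, d_orifice, rho=998.0, cd=0.61):
    """Differential pressure [Pa] across an orifice plate for volumetric flow q [m^3/s].
    Based on Q = Cd*At*sqrt(2*dP/(rho*(1-beta^4))); Cd = 0.61 is a typical value."""
    beta = d_orifice / d_pipe
    a_t = math.pi * d_orifice**2 / 4.0
    return 0.5 * rho * (q * math.sqrt(1.0 - beta**4) / (cd * a_t)) ** 2

# Assumed conditions (illustrative only).
q = 0.02          # m^3/s
d_pipe = 0.10     # m
d_orifice = 0.05  # m

dp = orifice_dp(q, d_pipe, d_orifice)
print(f"estimated differential pressure: {dp/1000:.1f} kPa")   # compare with CFD result
```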

5247 Energy Efficient Clustering and Data Aggregation in Wireless Sensor Networks

Authors: Surender Kumar Soni

Abstract:

Wireless Sensor Networks (WSNs) are wireless networks consisting of a number of tiny, low-cost and low-power sensor nodes that monitor physical phenomena such as temperature, pressure, vibration, landslide detection, presence of an object, etc. The major limitation of these networks is the use of non-rechargeable batteries with a limited power supply, and the main cause of energy consumption in a WSN is the communication subsystem. This paper presents an efficient grid-formation/clustering strategy known as Grid-based level Clustering and Aggregation of Data (GCAD). The proposed clustering strategy is simple and scalable and uses a low duty-cycle approach to keep non-CH nodes in sleep mode, thus reducing energy consumption. Simulation results demonstrate that the proposed GCAD protocol performs better on various performance metrics.

Keywords: Ad hoc network, Cluster, Grid base clustering, Wireless sensor network.
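
The sketch below shows the core of such a grid-based scheme in a few lines: nodes are binned into grid cells, the node with the highest residual energy in each cell becomes cluster head (CH), and the remaining nodes are marked for a low duty cycle. The field size, grid size and energy values are assumptions; CH rotation, aggregation and routing are omitted.

```python
import numpy as np

rng = np.random.default_rng(11)
N_NODES, FIELD, CELL = 100, 100.0, 25.0            # 100 nodes on a 100 m field, 4x4 grid

xy = rng.uniform(0, FIELD, size=(N_NODES, 2))      # node positions
energy = rng.uniform(0.5, 2.0, size=N_NODES)       # residual energy [J] (illustrative)

# Assign each node to a grid cell.
cells = (xy // CELL).astype(int)
cell_ids = cells[:, 0] * int(FIELD // CELL) + cells[:, 1]

cluster_heads = {}
for cid in np.unique(cell_ids):
    members = np.flatnonzero(cell_ids == cid)
    cluster_heads[int(cid)] = int(members[np.argmax(energy[members])])  # max-energy node

sleeping = [i for i in range(N_NODES) if i not in cluster_heads.values()]
print(f"{len(cluster_heads)} cluster heads elected; "
      f"{len(sleeping)} non-CH nodes run a low duty cycle (mostly asleep)")
```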

5246 Joint Training Offer Selection and Course Timetabling Problems: Models and Algorithms

Authors: Gianpaolo Ghiani, Emanuela Guerriero, Emanuele Manni, Alessandro Romano

Abstract:

In this article, we deal with a variant of the classical course timetabling problem that has practical application in many areas of education. In particular, in this paper we are interested in high school remedial courses. The purpose of such courses is to provide under-prepared students with the skills necessary to succeed in their studies; a student might be under-prepared in an entire course, or only in a part of it. The limited availability of funds, as well as the limited amount of time and teachers at their disposal, often requires schools to choose which courses and/or which teaching units to activate. Thus, schools need to model the training offer and the related timetabling, with the goal of ensuring the highest possible teaching quality while meeting the above-mentioned financial, time and resource constraints. Moreover, there are prerequisites between the teaching units that must be satisfied. We first present a Mixed-Integer Programming (MIP) model to solve this problem to optimality. However, the presence of many peculiar constraints inevitably increases the complexity of the mathematical model. Thus, solving it through a general-purpose solver is practical for small instances only, while solving real-life-sized instances of the model requires specific techniques or heuristic approaches. For this purpose, we also propose a heuristic approach, in which we make use of a fast constructive procedure to obtain a feasible solution. To assess our exact and heuristic approaches we perform extensive computational experiments on both real-life instances (obtained from a high school in Lecce, Italy) and randomly generated instances. Our tests show that the MIP model is never solved to optimality, with an average optimality gap of 57%. On the other hand, the heuristic algorithm is much faster (in about 50% of the considered instances it converges in approximately half of the time limit) and in many cases achieves an improvement on the objective function value obtained by the MIP model. Such an improvement ranges between 18% and 66%.

Keywords: Heuristic, MIP model, Remedial course, School, Timetabling.
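
The sketch below shows the flavour of such a formulation on a toy instance, using the PuLP modelling library: binary variables select which teaching units to activate and when to schedule them, subject to a budget, one available teacher per time slot, and a prerequisite between two units. All data, weights and constraint choices are invented for illustration and are far simpler than the paper's model.

```python
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

units = ["math_basics", "math_advanced", "english_A", "english_B"]
slots = ["Mon", "Wed", "Fri"]
quality = {"math_basics": 8, "math_advanced": 6, "english_A": 7, "english_B": 5}
cost = {"math_basics": 3, "math_advanced": 4, "english_A": 3, "english_B": 2}
BUDGET = 9

prob = LpProblem("training_offer_and_timetable", LpMaximize)
x = {(u, s): LpVariable(f"x_{u}_{s}", cat=LpBinary) for u in units for s in slots}
y = {u: LpVariable(f"y_{u}", cat=LpBinary) for u in units}      # unit activated?

prob += lpSum(quality[u] * y[u] for u in units)                 # maximise teaching quality
prob += lpSum(cost[u] * y[u] for u in units) <= BUDGET          # limited funds
for u in units:
    prob += lpSum(x[u, s] for s in slots) == y[u]               # activated units get one slot
for s in slots:
    prob += lpSum(x[u, s] for u in units) <= 1                  # one teacher available per slot
prob += y["math_advanced"] <= y["math_basics"]                  # prerequisite between units

prob.solve(PULP_CBC_CMD(msg=False))
chosen = [(u, s) for (u, s), var in x.items() if var.value() == 1]
print("activated and scheduled:", chosen)
```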

5245 Identifying Business Opportunities Based on Patent and Trademark Portfolios: A Technology-Based Service Industry Case

Authors: Mingook Lee, Sungjoo Lee

Abstract:

As technology-based service industries grow rapidly worldwide, companies are recognizing the importance of market preoccupancy and have made an effort to capture a large market to gain the upper hand. To this end, a focus on patents can be used to determine the properties of a technology, as well as to capture advantages in technical skills, in comparison with the firm's competitors. However, technology-based services largely depend not only on their technological value but also on their economic value, due to the recognized worth that is passed on to a plurality of users. Thus, it is important to determine whether there are any competitors in the target areas and what services they provide in each field. Despite this importance, little effort has been made to systematically benchmark competitors in order to identify business opportunities. Thus, this study aims not only to identify the position of each technology-centered service company in complex market dynamics, but also to discover new business opportunities. For this, we consider both technology and market environments simultaneously by utilizing patent data as a representative proxy for technology and trademark data as an index of a firm's target goods and services. Theoretically, this is one of the earliest attempts to combine patent data and trademark data to analyze corporate strategies. In practice, the research results are expected to be used as a decision criterion to diagnose the economic value that companies can obtain by entering the market, as well as the technological value to be passed on to their customers. Thus, the proposed approach can be useful for supporting effective technology and business strategies in a firm.

Keywords: Business opportunity, patent, portfolio analysis, trademark.

5244 Fusion of Colour and Depth Information to Enhance Wound Tissue Classification

Authors: Darren Thompson, Philip Morrow, Bryan Scotney, John Winder

Abstract:

Patients with diabetes are susceptible to chronic foot wounds which may be difficult to manage and slow to heal. Diagnosis and treatment currently rely on the subjective judgement of experienced professionals. An objective method of tissue assessment is required. In this paper, a data fusion approach was taken to wound tissue classification. The supervised Maximum Likelihood and unsupervised Multi-Modal Expectation Maximisation algorithms were used to classify tissues within simulated wound models by weighting the contributions of both colour and 3D depth information. It was found that, at low weightings, depth information could show significant improvements in classification accuracy when compared to classification by colour alone, particularly when using the maximum likelihood method. However, larger weightings were found to have an entirely negative effect on accuracy.

Keywords: Classification, data fusion, diabetic foot, stereophotogrammetry, tissue colour.
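
The weighting idea can be illustrated compactly: colour and depth features are concatenated with the depth channel scaled by a weight w, and a classifier based on per-class Gaussian models (scikit-learn's quadratic discriminant analysis, which behaves like a Gaussian maximum-likelihood classifier) is trained for each w. The synthetic features and weights are placeholders, not the study's simulated wound models or algorithms.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
N = 600
labels = rng.integers(0, 3, N)                    # three tissue classes (illustrative)
colour = rng.normal(loc=labels[:, None] * 2.0, scale=1.0, size=(N, 3))   # RGB-like features
depth = rng.normal(loc=labels[:, None] * 0.5, scale=1.0, size=(N, 1))    # depth feature

for w in (0.1, 0.3, 0.5, 1.0, 2.0):               # weight on the depth contribution
    fused = np.hstack([colour, w * depth])        # simple feature-level fusion
    X_tr, X_te, y_tr, y_te = train_test_split(fused, labels, test_size=0.3, random_state=0)
    clf = QuadraticDiscriminantAnalysis().fit(X_tr, y_tr)
    print(f"depth weight {w:.1f}: accuracy = {clf.score(X_te, y_te):.3f}")
```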
