Search results for: analysis of scientific data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13610

12230 Impacts of E-Learning on Educational Policy: Policy of Sensitization and Training in E-Learning in Saudi Arabia

Authors: Layla Albdr

Abstract:

Saudi Arabia instituted the policy of sensitizing and training stakeholders for e-learning, and the policy has seen wide adoption in many institutions. However, e-learning is still at an early stage and needs time to develop to the level seen in the US and the UK. The majority of higher education institutions in Saudi Arabia have adopted e-learning as an alternative to traditional methods to advance education. Nevertheless, effective implementation of the policy of sensitization and training of stakeholders for e-learning has not been attained because of various challenges. The objectives were to determine the challenges and opportunities of the e-learning policy of sensitization and training of stakeholders in Saudi Arabia's higher education, and to examine whether the policy of sensitizing and training stakeholders will help promote the implementation of e-learning in institutions. The study employed a descriptive research design based on qualitative analysis. The researcher recruited 295 students and 60 academic staff from four Saudi Arabian universities to participate in the study. An online questionnaire was used to collect the data. The data were then analyzed and reported both quantitatively and qualitatively. The analysis provided an in-depth understanding of the opportunities and challenges of e-learning policy in Saudi Arabian universities. The main internal challenge identified was the lack of educators' interest in adopting the policy, while the external challenges were the lack of ICT infrastructure and Internet connectivity. The study recommends encouraging, sensitizing, and training all stakeholders to address these challenges and adopt the policy.

Keywords: e-learning, educational policy, Saudi Arabian higher education, policy of sensitization and training

12229 Evaluation of Solid Phase Micro-extraction with Standard Testing Method for Formaldehyde Determination

Authors: Y. L. Yung, Kong Mun Lo

Abstract:

In this study, solid phase micro-extraction (SPME) was optimized to improve the sensitivity and accuracy of formaldehyde determination for plywood panels. Further work was carried out to compare the newly developed technique with the existing method, in which formaldehyde collected in desiccators is reacted with acetyl acetone reagent (DC-AA). In SPME, formaldehyde was first derivatized with O-(2,3,4,5,6-pentafluorobenzyl)-hydroxylamine hydrochloride (PFBHA), and analysis was then performed by gas chromatography in combination with mass spectrometry (GC-MS). SPME data for various wood species gave satisfactory results, with relative standard deviations (RSDs) in the range of 3.1-10.3%. The SPME results were also well correlated with the DC-AA values, giving a correlation coefficient (RSQ) of 0.959. Quantitative analysis of formaldehyde by SPME is an alternative for the wood industry with great potential.

Keywords: Formaldehyde, GC-MS, plywood, SPME

12228 A Mobile Agent-based Clustering Data Fusion Algorithm in WSN

Authors: Xiangbin Zhu, Wenjuan Zhang

Abstract:

In wireless sensor networks, mobile agent technology is used for data fusion. We design a node clustering algorithm based on node residual energy and the results of partial fusion, and we optimize the routing of the mobile agent within each cluster to further reduce the amount of data transferred. Experiments show that using mobile agents in the intra-cluster fusion process reduces the path loss to some extent.
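
A minimal sketch (not from the paper) of the kind of residual-energy-based cluster-head election described above; the node fields `id`, `pos`, `energy` and the function names are hypothetical, introduced purely for illustration.

```python
import random

def elect_cluster_heads(nodes, num_clusters):
    """Pick the highest-residual-energy nodes as cluster heads (illustrative only)."""
    # Sort nodes by residual energy, highest first.
    ranked = sorted(nodes, key=lambda n: n["energy"], reverse=True)
    heads = ranked[:num_clusters]
    clusters = {h["id"]: [] for h in heads}
    # Assign every remaining node to the geographically nearest cluster head.
    for n in ranked[num_clusters:]:
        nearest = min(heads, key=lambda h: dist(h["pos"], n["pos"]))
        clusters[nearest["id"]].append(n["id"])
    return clusters

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

nodes = [{"id": i, "pos": (random.random(), random.random()),
          "energy": random.uniform(0.2, 1.0)} for i in range(30)]
print(elect_cluster_heads(nodes, num_clusters=3))
```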

Keywords: wireless sensor networks, data fusion, mobile agent

12227 Making Data Structures and Algorithms more Understandable by Programming Sudoku the Human Way

Authors: Roelien Goede

Abstract:

Data Structures and Algorithms is a module in most Computer Science or Information Technology curricula. It is one of the modules students most often identify as being difficult. This paper demonstrates how programming a solution for Sudoku can make abstract concepts more concrete. The paper relates concepts of a typical Data Structures and Algorithms module to a step-by-step solution for Sudoku solved in a human way, as opposed to a computer-oriented solution.
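
As an illustration of the "human way" of solving, the sketch below fills only "naked singles" (cells with exactly one legal candidate), a technique human solvers use; the grid representation (a 9x9 list of integers with 0 for blanks) and the function names are assumptions, not the paper's code.

```python
def candidates(grid, r, c):
    """Digits that can legally go in cell (r, c) of a 9x9 grid (0 = empty)."""
    row = set(grid[r])
    col = {grid[i][c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    box = {grid[i][j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
    return set(range(1, 10)) - row - col - box

def fill_naked_singles(grid):
    """Repeatedly fill cells that have exactly one candidate, as a person would."""
    progress = True
    while progress:
        progress = False
        for r in range(9):
            for c in range(9):
                if grid[r][c] == 0:
                    cand = candidates(grid, r, c)
                    if len(cand) == 1:
                        grid[r][c] = cand.pop()
                        progress = True
    return grid
```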

Keywords: Data Structures, Algorithms, Sudoku, Object-Oriented Programming, Programming Teaching, Education.

12226 Virtual Speaking Head for Hearing Impaired Students

Authors: Eva Pajorová, Ladislav Hluchý

Abstract:

The developed tool is one of a set of system tools intended to ease access to various scientific areas and to support real-time interactive learning between a lecturer and hearing-impaired students. The lecturer is not required to know Sign Language (SL); instead, the software translates regular speech into SL, which is then transferred to the student. In the other direction, the student's questions (in SL) are translated and transferred to the lecturer as text or speech. The presented tool is one of these tools: it is used for developing correct speech visemes as the basis of a total-communication method for hearing-impaired students.

Keywords: Hearing-impaired people, sign language, communication methods.

12225 House Indoor Thermal and Health Conditions with Different Passive Designs

Authors: Bin Su

Abstract:

Given the Auckland climate, passive building design focuses mainly on improving winter indoor thermal and health conditions. Based on field-study data of indoor air temperature and relative humidity measured close to the ceiling and the floor of an insulated Auckland townhouse, with and without a whole-home mechanical ventilation system, this study analyses the variation of indoor microclimate data when the mechanical ventilation system is or is not used, in order to evaluate winter indoor thermal and health conditions and to inform future house design with a mechanical ventilation system.

Keywords: House ventilation, indoor thermal condition, indoor health condition, passive design.

12224 The Experimental and Numerical Analysis of a Lightpipe using a Simulation Software

Authors: M. Paroncini, F. Corvaro, G. Nardini, S. Pistolesi

Abstract:

A lightpipe is a pipe or duct with an approximately 99 percent specularly reflective mirror surface that is used to transmit daylight from the outside into a building. Lightpipes are commonly used for daylighting in residential, industrial and commercial buildings. This paper examines the performance of a lightpipe installed in a laboratory (3 m x 2.6 m x 3 m) without windows. The aim is to analyse the luminous intensity distribution for several sky/sun conditions. The lightpipe was monitored during the year 2006. The lightpipe is 1 m long, and the diameter of the top collector and of the internal diffuser device is 0.25 m. The laboratory has seven illuminance sensors: one external sensor is located on the roof of the laboratory, and six internal sensors are connected to a data acquisition system. The internal sensors are positioned under the internal diffusive device at a height of 0.85 m from the floor to simulate a working plane. The numerical data are obtained through simulation software. This paper shows the comparison between the experimental and numerical results concerning the behavior of the lightpipe.

Keywords: Daylighting, Desktop Radiance, Lightpipe.

12223 Transmit Sub-aperture Optimization in MSTA Ultrasound Imaging Method

Authors: Yuriy Tasinkevych, Ihor Trots, Andrzej Nowicki, Marcin Lewandowski

Abstract:

The paper presents the optimization problem for the multi-element synthetic transmit aperture (MSTA) method in ultrasound imaging applications. The optimal choice of the transmit aperture size is made as a trade-off between the lateral resolution, penetration depth and frame rate. Results of the analysis obtained by a developed optimization algorithm are presented. Maximum penetration depth and the best lateral resolution at given depths are chosen as the optimization criteria. The optimization algorithm was tested using synthetic aperture data of point reflectors simulated by the Field II program for Matlab® for a 5 MHz 128-element linear transducer array with 0.48 mm pitch. The visualization of experimentally obtained synthetic aperture data of a tissue-mimicking phantom and in vitro measurements of beef liver are also shown. The data were obtained using the SonixTOUCH Research system equipped with a 4 MHz 128-element linear transducer with 0.3 mm element pitch, 0.28 mm element width and 70% fractional bandwidth, excited by a one-cycle sine pulse burst at the transducer's center frequency.

Keywords: synthetic aperture method, ultrasound imaging, beamforming.

12222 Construction Of Decentralized Lifetime Maximizing Tree for Data Aggregation in Wireless Sensor Networks

Authors: Deepali Virmani, Satbir Jain

Abstract:

To meet the demands of wireless sensor networks (WSNs), where data are usually aggregated at a single source prior to transmission to any distant user, there is a need to establish a tree structure inside any given event region. In this paper, a novel technique to create such a tree is proposed. The tree preserves energy and maximizes the lifetime of event sources while they are constantly transmitting for data aggregation. The term Decentralized Lifetime Maximizing Tree (DLMT) is used to denote this tree. In DLMT, nodes with higher energy tend to be chosen as data-aggregating parents, so that the time to detect the first broken tree link can be extended and less energy is involved in tree maintenance. By constructing the tree in this way, the protocol is able to reduce the frequency of tree reconstruction, minimize the amount of data loss, minimize the delay during data collection, and preserve energy.
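
A rough, hypothetical sketch of the energy-aware parent-selection idea: the tree is grown greedily from the root, and a node's parent is the already-attached neighbor with the highest residual energy. The data structures and names are invented for illustration; this is not the DLMT protocol itself.

```python
import heapq

def build_energy_aware_tree(nodes, links, root):
    """Grow an aggregation tree from `root`, preferring high-energy parents.

    nodes: dict node_id -> residual energy; links: set of (u, v) pairs.
    """
    neighbors = {u: set() for u in nodes}
    for u, v in links:
        neighbors[u].add(v)
        neighbors[v].add(u)

    parent = {root: None}
    # Max-heap keyed on the candidate parent's residual energy (negated).
    frontier = [(-nodes[root], root, n) for n in neighbors[root]]
    heapq.heapify(frontier)
    while frontier:
        _, p, child = heapq.heappop(frontier)
        if child in parent:
            continue
        parent[child] = p          # higher-energy nodes become aggregating parents
        for n in neighbors[child]:
            if n not in parent:
                heapq.heappush(frontier, (-nodes[child], child, n))
    return parent

nodes = {0: 1.0, 1: 0.7, 2: 0.9, 3: 0.4}
links = {(0, 1), (0, 2), (1, 3), (2, 3)}
print(build_energy_aware_tree(nodes, links, root=0))
```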

Keywords: branch energy, decentralized, energy level, lifetime, tree energy, wireless sensor networks.

12221 Three-Dimensional Numerical Investigation for Reinforced Concrete Slabs with Opening

Authors: Abdelrahman Elsehsah, Hany Madkour, Khalid Farah

Abstract:

This article presents a 3-D modified non-linear elastic model in the strain space. The Helmholtz free energy function is introduced together with a dissipation potential surface in the space of thermodynamic conjugate forces, and the constitutive equation and the damage evolution are derived. The modified damage model has been examined to model the nonlinear behavior of reinforced concrete (RC) slabs with an opening. A parametric study was carried out to investigate the impact of different factors on the behavior of RC slabs: the opening area, the opening shape, the location of the opening, and the thickness of the slab. The numerical results have been compared with experimental data from the literature. Finally, the model showed its ability to be applied to the structural analysis of RC slabs.

Keywords: 3-D numerical analysis, damage mechanics, RC slab with opening.

12220 Effects of Data Correlation in a Sparse-View Compressive Sensing Based Image Reconstruction

Authors: Sajid Abbas, Joon Pyo Hong, Jung-Ryun Lee, Seungryong Cho

Abstract:

Computed tomography and laminography are heavily investigated in a compressive sensing based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Researchers are actively working on optimizing compressive sensing based iterative image reconstruction algorithms to obtain better quality images. However, the effects of the sampled data's properties on the quality of the reconstructed image, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigate the effects of two data properties, sampling density and data incoherence, on the image reconstructed by conventional computed laminography and by a recently proposed method called the spherical sinusoidal scanning scheme. We found that, in a compressive sensing based image reconstruction framework, the image quality mainly depends upon data incoherence when the data are uniformly sampled.
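
A small sketch of one common way to quantify data incoherence, the mutual coherence of a sensing matrix, computed here with NumPy; this generic measure is an assumption standing in for whatever incoherence metric the paper actually uses.

```python
import numpy as np

def mutual_coherence(A):
    """Largest absolute normalized inner product between distinct columns of A.

    Lower coherence generally favors compressive-sensing recovery.
    """
    A = A / np.linalg.norm(A, axis=0, keepdims=True)   # unit-norm columns
    G = np.abs(A.T @ A)                                # column Gram matrix
    np.fill_diagonal(G, 0.0)
    return G.max()

rng = np.random.default_rng(0)
dense = rng.standard_normal((64, 256))   # densely sampled random projections
sparse_view = dense[::4]                 # a sparse-view subset of the same rows
print(mutual_coherence(dense), mutual_coherence(sparse_view))
```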

Keywords: Computed tomography, computed laminography, compressive sensing, low-dose.

12219 Automatic Threshold Search for Heat Map Based Feature Selection: A Cancer Dataset Analysis

Authors: Carlos Huertas, Reyes Juarez-Ramirez

Abstract:

Public health is one of the most critical issues today; therefore, there is great interest in improving technologies in the area of disease detection. With machine learning and feature selection, it has been possible to aid the diagnosis of several diseases such as cancer. In this work, we present an extension to the Heat Map Based Feature Selection algorithm; this modification allows automatic threshold parameter selection, which helps improve generalization performance on high-dimensional data such as mass spectrometry. We have performed a comparative analysis using multiple cancer datasets against the well-known Recursive Feature Elimination algorithm and our original proposal; the results show improved classification performance that is very competitive with current techniques.
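
A generic sketch of automatic threshold search for a filter-style feature selector: features are scored by a simple between-class separation score (an assumed stand-in for the paper's heat-map score), and each candidate threshold is evaluated by cross-validated accuracy. Purely illustrative, not the authors' algorithm.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

def auto_threshold_selection(X, y, thresholds):
    """Sweep candidate score thresholds and keep the one with the best CV accuracy."""
    score = np.abs(X[y == 0].mean(axis=0) - X[y == 1].mean(axis=0)) / (X.std(axis=0) + 1e-9)
    best = (None, -np.inf)
    for t in thresholds:
        mask = score >= t
        if not mask.any():
            continue
        clf = make_pipeline(StandardScaler(), LinearSVC(dual=False))
        acc = cross_val_score(clf, X[:, mask], y, cv=5).mean()
        if acc > best[1]:
            best = (t, acc)
    return best

X, y = load_breast_cancer(return_X_y=True)
print(auto_threshold_selection(X, y, thresholds=np.linspace(0.1, 2.0, 20)))
```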

Keywords: Feature selection, mass spectrometry, biomarker discovery, cancer.

12218 Statistical Estimation of Spring-back Degree Using Texture Database

Authors: Takashi Sakai, Shinsaku Kikuta, Jun-ichi Koyama

Abstract:

Using a texture database, a statistical estimation of spring-back was conducted in this study. Both the spring-back in bending deformation and the experimental data related to crystal orientation show significant dispersion; therefore, a probabilistic statistical approach was established for the proper quantification of these values. Correlation was examined among the parameters F(x) of spring-back, F(x) of the buildup fraction of three orientations after 92° bending, and F(x) of the as-received part, on the basis of the three-parameter Weibull distribution. The resulting spring-back estimation using a texture database yielded excellent estimates compared with experimental values.
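
For reference, the three-parameter Weibull distribution function F(x) used in such analyses is commonly written as follows (notation assumed here: m is the shape parameter, η the scale parameter, γ the location parameter):

```latex
F(x) = 1 - \exp\!\left[ -\left( \frac{x - \gamma}{\eta} \right)^{m} \right], \qquad x \ge \gamma
```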

Keywords: Bending, Spring-back, Database, Crystallographic Orientation, Texture, SEM-EBSD, Weibull distribution, Statistical analysis.

12217 Real-Time Data Stream Partitioning over a Sliding Window in Real-Time Spatial Big Data

Authors: Sana Hamdi, Emna Bouazizi, Sami Faiz

Abstract:

In recent years, real-time spatial applications, like location-aware services and traffic monitoring, have become more and more important. Such applications result in dynamic environments where data as well as queries are continuously moving. As a result, there is a tremendous amount of real-time spatial data generated every day. The growth of the data volume seems to outpace the advance of our computing infrastructure. For instance, in real-time spatial Big Data, users expect to receive the results of each query within a short time period regardless of the load on the system. But with the huge amount of real-time spatial data generated, system performance degrades rapidly, especially in overload situations. To solve this problem, we propose the use of data partitioning as an optimization technique. Traditional horizontal and vertical partitioning can increase the performance of the system and simplify data management, but they remain insufficient for real-time spatial Big Data: they cannot deal with real-time and stream queries efficiently. Thus, in this paper, we propose a novel data partitioning approach for real-time spatial Big Data named VPA-RTSBD (Vertical Partitioning Approach for Real-Time Spatial Big Data). This contribution is an implementation of the Matching algorithm for traditional vertical partitioning. We first find the optimal attribute sequence by means of the Matching algorithm. Then, we propose a new cost model for database partitioning, for keeping the amount of data in each partition within a balanced limit and for providing parallel execution guarantees for the most frequent queries. VPA-RTSBD aims to obtain a real-time partitioning scheme and deals with stream data. It improves the performance of query execution by maximizing the degree of parallel execution. This contributes to QoS (Quality of Service) improvement in real-time spatial Big Data, especially with a huge volume of stream data. The performance of our contribution is evaluated via simulation experiments. The results show that the proposed algorithm is both efficient and scalable, and that it outperforms comparable algorithms.
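
A toy sketch of the kind of attribute grouping that a Hamming distance over query-access vectors enables in vertical partitioning: attributes with similar usage are placed next to each other so they can end up in the same fragment. The greedy ordering and the usage table below are invented illustrations, not the paper's Matching algorithm.

```python
def hamming(u, v):
    """Number of positions where two attribute-usage vectors differ."""
    return sum(a != b for a, b in zip(u, v))

def order_attributes(usage):
    """Greedily order attributes so that similarly used ones are adjacent."""
    remaining = list(usage)
    order = [remaining.pop(0)]
    while remaining:
        last = order[-1]
        nxt = min(remaining, key=lambda a: hamming(usage[last], usage[a]))
        remaining.remove(nxt)
        order.append(nxt)
    return order

usage = {                       # hypothetical query-access patterns (1 = query uses attribute)
    "id":    [1, 1, 1, 1],
    "x":     [1, 1, 0, 0],
    "y":     [1, 1, 0, 0],
    "speed": [0, 0, 1, 1],
    "time":  [0, 0, 1, 1],
}
print(order_attributes(usage))
```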

Keywords: Real-Time Spatial Big Data, Quality Of Service, Vertical partitioning, Horizontal partitioning, Matching algorithm, Hamming distance, Stream query.

12216 Viral Advertising: Popularity and Willingness to Share among the Czech Internet Population

Authors: Martin Klepek

Abstract:

This paper presents the results of primary quantitative research on viral advertising, with a focus on popularity and willingness to share viral video among the Czech Internet population. It starts with a brief theoretical discussion of viral advertising, which is used for comparison with the results. To collect data, an online questionnaire survey was given to 384 respondents. Statistics utilized in this research included frequency, percentage, correlation and Pearson's Chi-square test. Data were evaluated using SPSS software. The analysis disclosed high popularity of viral advertising videos among the Czech Internet population but implies lower willingness to share them. A significant relationship between the likability of the viral video technique and the age of the viewer was found.
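
A minimal illustration of the Pearson chi-square test of independence mentioned above, run with SciPy on an invented age-by-likability contingency table (the counts are not the study's data):

```python
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = age group, columns = liked / disliked the viral video.
table = [
    [120, 40],   # 18-29
    [80,  60],   # 30-44
    [40,  44],   # 45+
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```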

Keywords: Internet advertising, Internet population, promotion, marketing communication, viral advertising, viral video.

12215 Prediction of Soil Hydraulic Conductivity from Particle-Size Distribution

Authors: A.F. Salarashayeri, M. Siosemarde

Abstract:

Hydraulic conductivity is an important parameter for predicting the movement of water, and of contaminants dissolved in the water, through the soil. Hydraulic conductivity is measured on soil samples in the laboratory, and tests are sometimes carried out in the field. Hydraulic conductivity has been related to soil particle diameter by a number of investigators. In this study, 25 sets of soil samples with sand texture were used. The results show approximate success in predicting hydraulic conductivity from particle diameter data. The following relationship was obtained from multiple linear regression on the data (R² = 0.52): where d10, d50 and d60 are the soil particle diameters (mm) such that 10%, 50% and 60% of all soil particles, respectively, are finer (smaller) by weight, and Ks, the saturated hydraulic conductivity, is expressed in m/day. The results of the regression analysis showed that d10 plays a more significant role with respect to Ks and has been named the effective parameter in the Ks calculation.
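
A small sketch showing how such a multiple linear regression of Ks on d10, d50 and d60 can be fitted by least squares; the sample values below are invented and the fitted coefficients are not those of the paper (whose equation is not reproduced in this listing).

```python
import numpy as np

# Hypothetical sandy-soil samples: d10, d50, d60 in mm and measured Ks in m/day.
d10 = np.array([0.10, 0.15, 0.20, 0.12, 0.18, 0.25])
d50 = np.array([0.30, 0.40, 0.55, 0.35, 0.50, 0.60])
d60 = np.array([0.35, 0.48, 0.62, 0.42, 0.58, 0.70])
Ks  = np.array([4.0, 7.5, 12.0, 5.5, 10.0, 15.5])

# Multiple linear regression Ks = b0 + b1*d10 + b2*d50 + b3*d60 via least squares.
X = np.column_stack([np.ones_like(d10), d10, d50, d60])
coef, *_ = np.linalg.lstsq(X, Ks, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((Ks - pred) ** 2) / np.sum((Ks - Ks.mean()) ** 2)
print("coefficients:", coef, " R^2:", round(r2, 3))
```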

Keywords: hydraulic conductivity, particle diameter, particle-size distribution and soil

12214 Experiment and Simulation of Laser Effect on Thermal Field of Porcine Liver

Authors: K. Ting, K. T. Chen, Y. L. Su, C. J. Chang

Abstract:

In medical therapy, lasers have been widely used for cosmetic, tumor and other treatments. During laser irradiation, thermal damage may be caused by excessive laser exposure. Thus, the establishment of a complete thermal analysis model is clinically helpful to physicians as reference data. In this study, porcine liver, in place of human tissue, was subjected to laser irradiation to establish experimental data on the surface thermal field and the thermal damage region under different conditions of power, laser irradiation time, and distance between the laser and the porcine liver. In the experiments, the surface temperature distribution of the porcine liver was measured by an infrared thermal imager. In the simulation part, the Pennes bioheat transfer equation was solved with the software SYSWELD, which is applied in welding processes. A double ellipsoid function as the laser source term is considered for the first time in the prediction of the surface thermal field and internal tissue damage. The simulation results are compared with the experimental data to validate the mathematical model established herein.
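
For reference, the Pennes bioheat transfer equation solved in such simulations is commonly written as below (symbol meanings assumed here: ρ, c, k are tissue density, specific heat and thermal conductivity; ρ_b, c_b, ω_b are blood density, specific heat and perfusion rate; T_a is arterial blood temperature; Q_m is metabolic heat generation; Q_laser is the external laser source term):

```latex
\rho c \,\frac{\partial T}{\partial t}
  = \nabla \cdot (k\,\nabla T)
  + \rho_{b} c_{b}\, \omega_{b}\,(T_{a} - T)
  + Q_{m} + Q_{\mathrm{laser}}
```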

Keywords: laser, infrared thermal imager, bio-heat transfer, double ellipsoid function.

12213 The System Architecture of the Open European Nephrology Science Centre

Authors: G. Lindemann, D. Schmidt, T. Schrader, M. Beil, T. Schaaf, H.-D. Burkhard

Abstract:

The amount and heterogeneity of data in biomedical research, notably in interdisciplinary research, require new methods for the collection, presentation and analysis of information. Important data from laboratory experiments as well as patient trials are available but come from distributed resources. The Charite Medical School in Berlin has established, together with the German Research Foundation (DFG), a new information service center for kidney diseases and transplantation (Open European Nephrology Science Centre - OpEN.SC). The system is based on a service-oriented architecture (SOA) with main and auxiliary modules arranged in four layers. To improve the reuse and efficient arrangement of the services, the functionalities are described as business processes using the standardised Business Process Execution Language (BPEL).

Keywords: Software development management, business data processing, knowledge-based systems in medicine

12212 Optimizing Spatial Trend Detection By Artificial Immune Systems

Authors: M. Derakhshanfar, B. Minaei-Bidgoli

Abstract:

Spatial trends are among the valuable patterns in geo-databases. They play an important role in data analysis and knowledge discovery from spatial data. A spatial trend is a regular change of one or more non-spatial attributes when moving spatially away from a start object. Spatial trend detection is a graph-search problem; therefore, heuristic methods can be a good solution. The artificial immune system (AIS) is a particular method for searching and optimizing. AIS is a novel evolutionary paradigm inspired by the biological immune system. Models based on immune system principles, such as the clonal selection theory, the immune network model or the negative selection algorithm, have been finding increasing applications in science and engineering. In this paper, we develop a novel immunological algorithm based on the clonal selection algorithm (CSA) for spatial trend detection. We create a neighborhood graph and neighborhood paths, and then select spatial trends whose affinity to the antibody is high. In an evolutionary process with the artificial immune algorithm, the affinity of low-affinity trends is increased through mutation until the stopping condition is satisfied.
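
A minimal clonal selection loop (select, clone, mutate, re-select) is sketched below to illustrate the CSA idea. The real-valued candidate encoding, Gaussian mutation and affinity function are placeholders, not the spatial-trend representation used in the paper.

```python
import random

def clonal_selection(population, affinity, n_select=5, n_clones=10,
                     mutation_rate=0.2, generations=50):
    """Return the best candidate found by a basic clonal selection loop."""
    def mutate(cand):
        return [g + random.gauss(0, mutation_rate) for g in cand]

    for _ in range(generations):
        # Select the highest-affinity antibodies and clone them with mutation.
        best = sorted(population, key=affinity, reverse=True)[:n_select]
        clones = [mutate(b) for b in best for _ in range(n_clones)]
        # Keep the fittest antibodies among parents and mutated clones.
        population = sorted(best + clones, key=affinity, reverse=True)[:len(population)]
    return population[0]

pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
print(clonal_selection(pop, affinity=lambda c: -sum(x * x for x in c)))
```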

Keywords: Spatial Data Mining, Spatial Trend Detection, Heuristic Methods, Artificial Immune System, Clonal Selection Algorithm (CSA)

12211 Data Mining for Cancer Management in Egypt Case Study: Childhood Acute Lymphoblastic Leukemia

Authors: Nevine M. Labib, Michael N. Malek

Abstract:

Data Mining aims at discovering knowledge from data and presenting it in a form that is easily comprehensible to humans. One useful application in Egypt is cancer management, especially the management of Acute Lymphoblastic Leukemia (ALL), which is the most common type of cancer in children. This paper discusses the process of designing a prototype that can help in the management of childhood ALL, which has great significance in the health care field. Besides, it has a social impact by helping decrease the rate of infection in children in Egypt. It also provides valuable information about the distribution and segmentation of ALL in Egypt, which may be linked to possible risk factors. Undirected knowledge discovery is used since, in the case of this research project, there is no target field, as the data provided are mainly subjective. This is done in order to quantify the subjective variables. Therefore, the computer is asked to identify significant patterns in the provided medical data about ALL. This is achieved by collecting the data necessary for the system, determining the data mining technique to be used, and choosing the most suitable implementation tool for the domain. The research makes use of a data mining tool, Clementine, to apply the Decision Trees technique. We feed it with data extracted from real-life cases taken from specialized cancer institutes. Relevant medical case details, such as patient medical history and diagnosis, are analyzed, classified, and clustered in order to improve disease management.
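
An illustrative stand-in for the decision-tree step: the paper uses the Clementine tool on real patient records, whereas the sketch below uses scikit-learn on synthetic data purely to show the technique and the kind of human-readable rules it produces.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic stand-in for patient records (real case data are not available here).
X, y = make_classification(n_samples=300, n_features=8, n_informative=4,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print("test accuracy:", tree.score(X_test, y_test))
print(export_text(tree))   # human-readable decision rules
```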

Keywords: Data Mining, Decision Trees, Knowledge Discovery, Leukemia.

12210 Quality of Concrete of Recent Development Projects in Libya

Authors: Mohamed .S .Alazhari, Milad. M. Al Shebani

Abstract:

Numerous concrete structure projects are currently running in Libya as part of a US$50 billion government funding programme. The quality of the concrete used in 20 different construction projects was assessed, based mainly on the concrete compressive strength achieved. The projects are scattered all over the country and are at various levels of completeness. For most of these projects, the concrete compressive strength was obtained from test results of a 150 mm standard cube mold. Statistical analysis of the collected concrete compressive strengths reveals that the data in general followed a normal distribution pattern. The study covers comparison and assessment of concrete quality aspects such as: quality control, strength range, data standard deviation, data scatter, and the ratio of minimum strength to design strength. Site quality control for these projects ranged from very good to poor according to the ACI214 criteria [1]. The ranges (Rg) of the strength (maximum strength minus minimum strength) divided by the average strength are from 34% to 160%. Data scatter is measured as the range (Rg) divided by the standard deviation (σ) and is found to be 1.82 to 11.04, indicating that the range is ±3σ. International construction companies working in Libya follow different assessment criteria for concrete compressive strength in the absence of a unified national procedure. The study reveals that assessments of concrete quality conducted by these construction companies usually meet their adopted (internal) standards, but sometimes fail to meet internationally known standard requirements. The assessment of concrete presented in this paper is based on ACI and British standards and on proposed Libyan concrete strength assessment criteria.
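
A tiny sketch of the two indicators quoted above, the strength range as a percentage of the mean and the range in multiples of the standard deviation, computed on invented cube strengths:

```python
import statistics

def quality_indicators(strengths):
    """Return (range / mean * 100, range / standard deviation) for a set of cube strengths (MPa)."""
    rg = max(strengths) - min(strengths)
    return rg / statistics.mean(strengths) * 100, rg / statistics.stdev(strengths)

# Made-up cube strengths for one project, purely for illustration.
print(quality_indicators([28.5, 31.0, 25.4, 33.2, 29.8, 27.1]))
```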

Keywords: Acceptance criteria, Concrete, Compressive strength, quality control

12209 Analyzing Current Transformers Saturation Characteristics for Different Connected Burden Using LabVIEW Data Acquisition Tool

Authors: D. Subedi, S. Pradhan

Abstract:

Current transformers are an integral part of the power system because they provide a proportional, safe amount of current for protection and measurement applications. However, when the power system experiences an abnormal situation leading to a huge current flow, this huge current is proportionally injected into the protection and metering circuit. Since the protection and metering equipment is designed to withstand only a certain amount of current with respect to time, these high currents pose a risk to people and equipment. Therefore, during such instances, the CT saturation characteristics have a huge influence on the safety of both people and equipment and on the reliability of the protection and metering system. This paper shows the effect of burden on the accuracy limiting factor / instrument security factor of current transformers and the change in the saturation characteristics of the CTs. The response of the CT to varying levels of overcurrent at different connected burdens will be captured using the data acquisition software LabVIEW. Analysis is done on the real-time data gathered using LabVIEW. The variation of current transformer saturation characteristics with changes in burden will be discussed.

Keywords: Accuracy limiting factor, burden, current transformer, instrument security factor, saturation characteristics.

12208 Corporate Cultures Management towards the Retention of Employees: Case Study Company in Thailand

Authors: Duangsamorn Rungsawanpho

Abstract:

The objective of this paper is to explore corporate culture management as a determinant of employee retention in a company in Thailand. This study uses a mixed-methods methodology, with data collected using questionnaires and in-depth interviews. The statistics used for data analysis were percentage, mean, standard deviation, and inferential statistics. The results show that corporate culture management works for any organization, but it depends on the business and the industry, because the situations or circumstances that corporate executives encounter differ. The findings explain that the company's employees attribute value-oriented achievement to the corporate culture, and that international relations are perceived as most valuable for their organizations. In addition, we found that employee participation can be interpreted as a positive example: many employees feel that they are part of management because their opinions or ideas related to their work are taken into account.

Keywords: Corporate culture, employee retention, retention of employees, management approaches.

12207 A Data Warehouse System to Help Assist Breast Cancer Screening in Diagnosis, Education and Research

Authors: Souâd Demigha

Abstract:

Early detection of breast cancer is considered a major public health issue. Breast cancer screening is not generalized to the entire population due to a lack of resources, staff and appropriate tools. Systematic screening can result in a volume of data which cannot be managed by present computer architecture, either in terms of storage capabilities or in terms of exploitation tools. We propose in this paper to design and develop a data warehouse system in radiology-senology (DWRS). The aim of such a system is, on the one hand, to support this important volume of information coming from multiple sources of data and images and, on the other hand, to help assist breast cancer screening in diagnosis, education and research.

Keywords: Breast cancer screening, data warehouse, diagnosis, education, research.

12206 Performance Comparison of Parallel Sorting Algorithms on the Cluster of Workstations

Authors: Lai Lai Win Kyi, Nay Min Tun

Abstract:

Sorting has received the most attention among all computational tasks over the past years because sorted data are at the heart of many computations. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. Many parallel sorting algorithms have been investigated for a variety of parallel computer architectures. In this paper, three parallel sorting algorithms have been implemented and compared in terms of their overall execution time. The algorithms implemented are the odd-even transposition sort, parallel merge sort and parallel rank sort. A Cluster of Workstations, or Windows Compute Cluster, has been used to compare the implemented algorithms. The C# programming language is used to develop the sorting algorithms, and the MPI (Message Passing Interface) library has been selected to establish the communication and synchronization between processors. The time complexity of each parallel sorting algorithm is also mentioned and analyzed.
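
For reference, a sequential sketch of the odd-even transposition comparison pattern that the parallel version distributes across MPI processes (each process holding a block and exchanging boundary values on alternating phases); the code below is illustrative and is not the paper's C# implementation.

```python
def odd_even_transposition_sort(a):
    """Sort a list in place using n phases of odd/even neighbor comparisons."""
    n = len(a)
    for phase in range(n):
        start = phase % 2        # even phase: pairs (0,1),(2,3),...; odd phase: (1,2),(3,4),...
        for i in range(start, n - 1, 2):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

print(odd_even_transposition_sort([5, 1, 4, 2, 8, 0, 3]))
```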

Keywords: Cluster of Workstations, Parallel sorting algorithms, performance analysis, parallel computing and MPI.

12205 Data Security in a DApp Twitter Alike on Web 3.0 With Blockchain Based Technology

Authors: Vishal Awasthi, Tanya Soni, Vigya Awasthi, Swati Singh, Shivali Verma

Abstract:

There is a growing demand for a network that grants a high level of data security and confidentiality. For this reason, the semantic web was introduced, which allows data to be shared and reused across applications while safeguarding users' privacy, and users will regain control of their data. The earlier Web 1.0 and Web 2.0 versions were built on client-server architecture, in which there was a risk of data theft and unconsented sale of user data. A decentralized version, known as Web 3.0, which is mostly built on blockchain technology, was introduced to resolve these issues. Recent research focuses on blockchain technology and deals with the privacy, security, transparency, and innovation of decentralized applications (DApps), e.g. a Twitter clone or a WhatsApp clone. In this paper, the Twitter Alike built on the Ethereum blockchain replaces traditional techniques with improved latency, throughput, and data ownership. The central principle of this DApp is a smart contract implemented using Solidity, which is an object-oriented, high-level language. Consequently, this will provide better quality of service, high data security, and integrity for both present and future internet technologies.

Keywords: Blockchain, DApps, Ethereum, Semantic Web, Smart Contract, Solidity.

12204 Analysis of the Physical Behavior of Library Users in Reading Rooms through GIS: A Case Study of the Central Library of Tehran University

Authors: R. Pournaghi

Abstract:

Considering the significance of measuring the daily use of study space in libraries in order to develop and reorganize the space and enhance its efficiency, the current study aimed to apply GIS to analyze the study halls of the Central Library and Document Center of Tehran University in order to determine how study desks and chairs were used by the students. The study used a combination of a survey-descriptive method and a system-design method. The survey-descriptive method was used to gather the required data. The system-design method was used for implementing and entering data into ArcGIS, analyzing the data, and displaying the results on the maps of the study halls of the library. The spatial database of the use of the study halls was designed around the extent to which the space was occupied by library users, with the maps of the study halls of the Central Library of Tehran University as the case study. The results showed that Abooreyhan hall had the highest occupancy rate of desks and chairs compared to the other halls. The Hall of Science and Technology, with an average occupancy rate of 0.39 for the tables, represented the lowest number of users, and Rashid al-Din's hall and the Science and Technology hall, with an average occupancy rate of 0.40, had the lowest number of users for seats. In this study, the comparison of the space occupied at different periods in the mornings, afternoons and evenings, and across several months, was performed through GIS. This system analyzed the spatial relationships effectively and efficiently. The output of this study can be used by administrators and librarians to determine the exact extent of use of the equipment of the study halls, and librarians can use the output maps to design the library space more efficiently.

Keywords: Geospatial Information System, Spatial analysis, Reading Room, Academic libraries, Library’s User, Central Library of Tehran University.

12203 A Numerical Model for Simulation of Blood Flow in Vascular Networks

Authors: Houman Tamaddon, Mehrdad Behnia, Masud Behnia

Abstract:

An accurate study of blood flow is associated with an accurate vascular pattern and the geometrical properties of the organ of interest. Due to the complexity of vascular networks and poor accessibility in vivo, it is challenging to reconstruct the entire vasculature of any organ experimentally. The objective of this study is to introduce an innovative approach for the reconstruction of a full vascular tree from available morphometric data. Our method consists of implementing morphometric data on those parts of the vascular tree that are smaller than the resolution of medical imaging methods. This technique reconstructs the entire arterial tree down to the capillaries. Vessels greater than 2 mm are obtained from direct volume and surface analysis using contrast-enhanced computed tomography (CT). Vessels smaller than 2 mm are reconstructed from available morphometric and distensibility data and rearranged by applying Murray's law. Implementing the morphometric data to reconstruct the branching pattern and applying Murray's law to every vessel bifurcation simultaneously lead to an accurate vascular tree reconstruction. The reconstruction algorithm generates the full arterial tree topography down to the first capillary bifurcation. The geometry of each order of the vascular tree is generated separately to minimize the construction and simulation time. The node-to-node connectivity, along with the diameter and length of every vessel segment, is established, and order numbers, according to the diameter-defined Strahler system, are assigned. During the simulation, we used the averaged flow rate for each order to predict the pressure drop, and once the pressure drop is predicted, the flow rate is corrected to match the computed pressure drop for each vessel. The final results for three cardiac cycles are presented and compared to clinical data.
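
For reference, Murray's law applied at each bifurcation states that the cube of the parent vessel radius equals the sum of the cubes of the daughter radii, which follows from flow being proportional to the cube of the radius (notation below is assumed: r_p parent radius, r_d1 and r_d2 daughter radii, Q volumetric flow rate):

```latex
r_{p}^{3} = r_{d_1}^{3} + r_{d_2}^{3}, \qquad Q \propto r^{3}
```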

Keywords: Blood flow, Morphometric data, Vascular tree, Strahler ordering system.

12202 The Efficacy of Neurological Impress Method and Repeated Reading on Reading Fluency of Children with Learning Disabilities in Oyo State, Nigeria

Authors: A. O. Oladele

Abstract:

The purpose of this study was to determine the effectiveness of the neurological impress method and the repeated reading technique on the reading fluency of children with learning disabilities. Thirty primary four pupils in three public primary schools participated in the study. There were two experimental groups and a control group. The research employed a 3 by 2 factorial matrix, and the participants were taught for one session. Two hypotheses were formulated to guide the research. A t-test was used to analyse the data gathered, and the analysis revealed that pupils exposed to the two treatment strategies showed improvement in their reading fluency. It is recommended that the two strategies used in the study be applied to intervene in reading fluency problems in children with learning disabilities.

Keywords: Learning disabilities, neurological impress method, repeated reading, reading fluency.

12201 Predicting Groundwater Areas Using Data Mining Techniques: Groundwater in Jordan as Case Study

Authors: Faisal Aburub, Wael Hadi

Abstract:

Data mining is the process of extracting useful or hidden information from a large database. The extracted information can be used to discover relationships among features, where data objects are grouped according to logical relationships, or to assign unseen objects to one of the predefined groups. In this paper, we investigate four well-known data mining algorithms in order to predict groundwater areas in Jordan. These algorithms are Support Vector Machines (SVMs), Naïve Bayes (NB), K-Nearest Neighbor (kNN) and Classification Based on Association Rule (CBA). The experimental results indicate that the SVM algorithm outperformed the other algorithms in terms of classification accuracy, precision and F1 evaluation measures, using the datasets of groundwater areas collected from the Jordanian Ministry of Water and Irrigation.
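
A hedged sketch of the kind of classifier comparison described above, using scikit-learn on synthetic data because the Ministry dataset is not public; CBA is omitted since it has no standard scikit-learn implementation. This illustrates the evaluation protocol (accuracy, precision, F1 via cross-validation), not the paper's results.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_validate
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for the groundwater dataset, purely for illustration.
X, y = make_classification(n_samples=500, n_features=10, n_informative=5,
                           random_state=1)
models = {"SVM": SVC(), "NB": GaussianNB(), "kNN": KNeighborsClassifier()}
for name, model in models.items():
    cv = cross_validate(model, X, y, cv=5,
                        scoring=["accuracy", "precision", "f1"])
    print(name,
          round(cv["test_accuracy"].mean(), 3),
          round(cv["test_precision"].mean(), 3),
          round(cv["test_f1"].mean(), 3))
```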

Keywords: Classification, data mining, evaluation measures, groundwater.
