Search results for: Applications of Computer Science.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4028

248 Design and Implementation of a Counting and Differentiation System for Vehicles through Video Processing

Authors: Derlis Gregor, Kevin Cikel, Mario Arzamendia, Raúl Gregor

Abstract:

This paper presents a self-sustaining mobile system for counting and classifying vehicles through video processing. It proposes a counting and classification algorithm divided into four steps that can be executed multiple times in parallel on a single board computer (SBC), such as the Raspberry Pi 2, so that it can run in real time. The first step of the proposed algorithm limits the zone of the image that will be processed. The second step detects the moving objects using a background subtraction (BGS) algorithm based on the Gaussian Mixture Model (GMM), together with a shadow removal algorithm using physics-based features, followed by morphological operations. In the third step, vehicle detection is performed using edge detection algorithms, and vehicles are tracked with Kalman filters. The last step of the proposed algorithm registers each passing vehicle and classifies it according to its area. A self-sustaining setup is proposed, powered by batteries and photovoltaic solar panels, with data transmission over GPRS (General Packet Radio Service), eliminating the need for external cabling and facilitating deployment and relocation to any site where the system may operate. The self-sustaining trailer allows vehicles to be counted and classified in specific zones with difficult access.
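
As a rough illustration of steps 2 and 4, the following sketch uses OpenCV's GMM-based background subtractor. The ROI coordinates, area thresholds, and input path are illustrative assumptions rather than values from the paper; shadow handling relies on MOG2's built-in shadow flag instead of the authors' physics-based method, and vehicle tracking (step 3) is omitted, so repeated detections of one vehicle are not de-duplicated.

```python
# Minimal sketch of the detection/classification pipeline (assumed values).
import cv2

cap = cv2.VideoCapture("traffic.mp4")            # assumed input video
bgs = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
counts = {}

while True:
    ok, frame = cap.read()
    if not ok:
        break
    roi = frame[200:400, 0:640]                  # step 1: limit the processed zone
    mask = bgs.apply(roi)                        # step 2: GMM background subtraction
    mask[mask == 127] = 0                        # MOG2 marks shadows as 127; drop them
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                            cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:                           # step 4: classify by blob area
        area = cv2.contourArea(c)
        if area > 5000:
            label = "truck/bus"
        elif area > 1500:
            label = "car"
        elif area > 400:
            label = "motorcycle"
        else:
            continue                             # noise
        counts[label] = counts.get(label, 0) + 1

cap.release()
print(counts)
```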

Keywords: Intelligent transportation systems, object detection, video processing, road traffic, vehicle counting, vehicle classification.

247 Personas Help Understand Users’ Needs, Goals and Desires in an Online Institutional Repository

Authors: Maha ALjohani, James Blustein

Abstract:

Communicating users' needs, goals and problems helps designers and developers overcome the challenges faced by end users. Personas are used to represent end users' needs. In our research, creating personas allowed the following questions to be answered: Who are the potential user groups? What do they want to achieve by using the service? What are the problems that users face? What should the service provide to them? To develop realistic personas, we conducted a focus group discussion with undergraduate and graduate students and also interviewed a university librarian; four personas resulted from this process. The personas were created to help evaluate the Institutional Repository, which is based on the DSpace system. The profiles helped to communicate users' needs, abilities, tasks, and problems, and the task scenarios used in the heuristic evaluation were based on these personas. We used the personas to create focused task scenarios for a heuristic evaluation of the system interface, to ensure that it met users' needs, goals, problems and desires. In this paper, we present the process we used to create the personas that led to the task scenarios used in the heuristic evaluation, as a follow-up study of the DSpace university repository.

Keywords: Heuristic Evaluation, Institutional Repositories, User Experience, Human Computer Interaction, User Profiles, Personas, Task Scenarios, Heuristics.

246 A Study of Two Disease Models: With and Without Incubation Period

Authors: H. C. Chinwenyi, H. D. Ibrahim, J. O. Adekunle

Abstract:

The incubation period is defined as the time from infection with a microorganism to the development of symptoms. In this research, two disease models, one with an incubation period and another without, were studied. The study involves the use of a mathematical model with a single incubation period. Tests for the existence and stability of the disease-free and endemic equilibrium states were carried out for both models. The fourth-order Runge-Kutta method was used to solve both models numerically. Finally, a computer program in MATLAB was developed to run the numerical experiments. From the results, we are able to show that the endemic equilibrium state of the model with incubation period is locally asymptotically stable, whereas the endemic equilibrium state of the model without incubation period is unstable under certain conditions on the given model parameters. It was also established that the disease-free equilibrium states of the models with and without incubation period are locally asymptotically stable. Furthermore, results from numerical experiments using empirical data obtained from the Nigeria Centre for Disease Control (NCDC) showed that the overall infected population for the model with incubation period is higher than that without incubation period. We also established from the results that as the transmission rate from the susceptible to the infected population increases, the peak values of the infected population for the model with incubation period decrease and are always less than those for the model without incubation period.
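
A minimal numerical sketch of the two models under stated assumptions: an SIR-type model (no incubation) and an SEIR-type model (with incubation) integrated by the fourth-order Runge-Kutta method. The rates and initial conditions are illustrative, not the NCDC-fitted values, and Python stands in for the paper's MATLAB program.

```python
import numpy as np

beta, sigma, gamma = 0.4, 0.2, 0.1   # transmission, incubation-exit, recovery rates (assumed)

def sir(t, y):                        # model without incubation period
    S, I, R = y
    return np.array([-beta * S * I, beta * S * I - gamma * I, gamma * I])

def seir(t, y):                       # model with an exposed (incubating) class
    S, E, I, R = y
    return np.array([-beta * S * I, beta * S * I - sigma * E,
                     sigma * E - gamma * I, gamma * I])

def rk4(f, y0, t0, t1, n):
    """Classical fourth-order Runge-Kutta integration with n fixed steps."""
    h = (t1 - t0) / n
    t, y = t0, np.asarray(y0, dtype=float)
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

print(rk4(sir, [0.99, 0.01, 0.0], 0.0, 160.0, 1600))        # S, I, R at t = 160
print(rk4(seir, [0.99, 0.0, 0.01, 0.0], 0.0, 160.0, 1600))  # S, E, I, R
```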

Keywords: Asymptotic stability, incubation period, Routh-Hurwitz criterion, Runge-Kutta method.

245 On The Analysis of a Compound Neural Network for Detecting Atrio Ventricular Heart Block (AVB) in an ECG Signal

Authors: Salama Meghriche, Amer Draa, Mohammed Boulemden

Abstract:

Heart failure is the most common cause of death nowadays, but if medical help is given promptly, the patient's life may be saved in many cases. Numerous heart diseases can be detected by analyzing electrocardiograms (ECG). Artificial Neural Networks (ANN) are computer-based expert systems that have proved to be useful in pattern recognition tasks. ANN can be used in different phases of the decision-making process, from classification to diagnostic procedures. This work presents a review followed by a novel method. The purpose of the review is to assess the evidence of healthcare benefits involving the application of artificial neural networks to the clinical functions of diagnosis, prognosis and survival analysis on ECG signals. The developed method is based on a compound neural network (CNN) that classifies ECGs as normal or as carrying an atrioventricular heart block (AVB). This method uses three different feed-forward multilayer neural networks. A single output unit encodes the probability of AVB occurrence. A value between 0 and 0.1 is the desired output for a normal ECG; a value between 0.1 and 1 indicates an occurrence of an AVB. The results show that this compound network performs well in detecting AVBs, with a sensitivity of 90.7% and a specificity of 86.05%. The accuracy value is 87.9%.
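
A minimal sketch of the decision rule and the reported metrics, with the compound network abstracted away: `outputs` stands in for the single output unit's predictions on a labelled test set, and the 0.1 cut-off follows the rule stated above.

```python
import numpy as np

outputs = np.array([0.05, 0.72, 0.08, 0.93, 0.40])  # assumed network outputs in [0, 1]
labels  = np.array([0,    1,    0,    1,    0])     # 1 = AVB, 0 = normal (assumed)

pred = (outputs >= 0.1).astype(int)                 # below 0.1 -> normal, else AVB

tp = np.sum((pred == 1) & (labels == 1))
tn = np.sum((pred == 0) & (labels == 0))
fp = np.sum((pred == 1) & (labels == 0))
fn = np.sum((pred == 0) & (labels == 1))

sensitivity = tp / (tp + fn)            # paper reports 90.7% on its test set
specificity = tn / (tn + fp)            # paper reports 86.05%
accuracy = (tp + tn) / len(labels)      # paper reports 87.9%
print(sensitivity, specificity, accuracy)
```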

Keywords: Artificial neural networks, Electrocardiogram (ECG), Feed-forward multilayer neural network, Medical diagnosis, Pattern recognition, Signal processing.

244 Bee Parameter Determination via Weighted Centroid Modified Simplex and Constrained Response Surface Optimisation Methods

Authors: P. Luangpaiboon

Abstract:

Various intelligences and inspirations have been adopted into the iterative searching processes called meta-heuristics. They intelligently perform exploration and exploitation in the solution domain space, aiming to seek near-optimal solutions efficiently. In this work, the bee algorithm, inspired by the natural foraging behaviour of honey bees, was adapted to find near-optimal solutions for a transportation management problem, dynamic multi-zone dispatching. This problem accounts for uncertain and changing customer demand. In striving to remain competitive, a transportation system should therefore be flexible in order to cope with changes in customer demand in terms of inbound and outbound goods and with technological innovations. To maintain a higher service level at lower cost via the minimal imbalance scenario, rearrangement penalties for the areas in each zone, including time periods, are also included. However, the performance of the algorithm depends on appropriate parameter settings, which need to be determined and analysed before implementation. Bee parameters are determined through the linear constrained response surface optimisation method (LCRSOM) and the weighted centroid modified simplex method (WCMSM). Experimental results were analysed in terms of the best solutions found so far, the mean and standard deviation of the imbalance values, and the convergence of the solutions obtained. It was found that the results obtained from the LCRSOM were better than those using the WCMSM. However, the average execution time of an experimental run using the LCRSOM was longer than with the WCMSM. Finally, a recommendation of proper level settings of bee parameters for some selected problem sizes is given as a guideline for future applications.
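
For readers unfamiliar with the algorithm being tuned, the sketch below shows a generic Bees Algorithm with the parameters (scout bees n, selected sites m, elite sites e, recruited bees nep/nsp, neighbourhood size ngh) that the LCRSOM and WCMSM procedures would set; the objective function is a toy stand-in, not the paper's dispatching imbalance measure.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):                     # toy stand-in for the imbalance measure
    return float(np.sum(x ** 2))

def bees(n=30, m=10, e=3, nep=7, nsp=3, ngh=0.5, dim=5, iters=100):
    sites = rng.uniform(-5, 5, (n, dim))                 # initial scout bees
    for _ in range(iters):
        sites = sites[np.argsort([objective(s) for s in sites])]
        improved = []
        for i in range(m):                               # neighbourhood search
            recruits = nep if i < e else nsp             # more bees at elite sites
            patch = sites[i] + rng.uniform(-ngh, ngh, (recruits, dim))
            best = min(patch, key=objective)
            improved.append(best if objective(best) < objective(sites[i])
                            else sites[i])
        scouts = rng.uniform(-5, 5, (n - m, dim))        # remaining bees scout anew
        sites = np.vstack([improved, scouts])
    return min(sites, key=objective)

print(objective(bees()))              # near-optimal value found
```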

Keywords: Meta-heuristic, Bee Algorithm, Dynamic Multi-Zone Dispatching, Linear Constrained Response Surface Optimisation Method, Weighted Centroid Modified Simplex Method

243 Development System for Emotion Detection Based on Brain Signals and Facial Images

Authors: Suprijanto, Linda Sari, Vebi Nadhira, IGN. Merthayasa, Farida I.M.

Abstract:

Detection of human emotions has many potential applications. One such application is to quantify audience attentiveness in order to evaluate the acoustic quality in a concert hall. The subjective audio preference reported by the audience is used. To obtain a fair evaluation of acoustic quality, this research proposes a system for multimodal emotion detection: one modality based on brain signals measured using electroencephalography (EEG), and the second modality based on sequences of facial images. In the experiment, a customized audio signal consisting of normal and disordered sounds was prepared. This audio signal was played in order to stimulate positive/negative emotional feedback from volunteers. EEG signals from the temporal lobes, i.e. T3 and T4, were used to measure the brain response, and sequences of facial images were used to monitor facial expression while the volunteers heard the audio signal. From the EEG signal, features were extracted from changes in the brain waves, particularly in the alpha and beta bands. Features of facial expression were extracted based on analysis of motion images. We implement an advanced optical flow method to detect the most active facial muscles from the neutral to other emotional expressions, represented in vector flow maps. To reduce the complexity of detecting the emotional state, the vector flow maps are transformed into a compass mapping that represents the major directions and velocities of facial movement. The results showed that the power of the beta wave increases when the disordered sound stimulation is given; however, each volunteer gave different emotional feedback. Based on the features derived from the facial images, the optical flow compass mapping is promising as additional information for making decisions about emotional feedback.
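
A minimal sketch of the facial-image modality under stated assumptions: dense Farneback optical flow between two consecutive face frames, reduced to a compass mapping by accumulating flow magnitude into eight direction sectors. The frame sources and the eight-sector granularity are illustrative, not the paper's exact construction.

```python
import cv2
import numpy as np

prev = cv2.imread("face_t0.png", cv2.IMREAD_GRAYSCALE)   # assumed face crops
curr = cv2.imread("face_t1.png", cv2.IMREAD_GRAYSCALE)

flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])   # per-pixel magnitude, angle

# Compass mapping: accumulate flow magnitude into 8 direction sectors.
sector = (ang / (2 * np.pi / 8)).astype(int) % 8
compass = np.array([mag[sector == s].sum() for s in range(8)])
dominant = int(np.argmax(compass))    # dominant direction of facial movement
print(compass, dominant)
```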

Keywords: Multimodal Emotion Detection, EEG, Facial Image, Optical Flow, compass mapping, Brain Wave

242 Computer Study of Cluster Mechanism of Anti-greenhouse Effect

Authors: A. Galashev

Abstract:

Absorption spectra of infra-red (IR) radiation of the disperse water medium absorbing the most important greenhouse gases, CO2, N2O, CH4, C2H2, C2H6, have been calculated by the molecular dynamics method. Loss of absorbing ability upon the formation of clusters, due to a reduction of the number of centers interacting with IR radiation, results in an anti-greenhouse effect. Absorption of O3 molecules by the (H2O)50 cluster is investigated during its interaction with Cl- ions. The splitting of an ozone molecule into atoms near the cluster surface was observed. Interaction of the water cluster with Cl- ions increases the integrated intensity of the IR emission spectra and substantially reduces the corresponding characteristic of the Raman spectrum. The relative integrated intensity of absorption of IR radiation by small water clusters was determined. The dependence of mass on altitude was determined for monomer vapor, clusters, droplets, crystals, and the total moisture. The anti-greenhouse effect of clusters was defined as the difference between the increases in the average global temperature of the Earth caused by absorption of IR radiation by the free water molecules that form clusters and by the clusters themselves. The greenhouse effect caused by clusters amounts to 0.53 K, while the anti-greenhouse one equals 1.14 K. The increase of the concentration of CO2 in the atmosphere does not always correlate with the amplification of the greenhouse effect.

Keywords: Greenhouse gases, infrared absorption and Raman spectra, molecular dynamics method, water clusters.

241 A Generic Middleware to Instantly Sync Intensive Writes of Heterogeneous Massive Data via Internet

Authors: Haitao Yang, Zhenjiang Ruan, Fei Xu, Lanting Xia

Abstract:

Industry data centers often need to sync data changes reliably and instantly from a large set of heterogeneous autonomous relational databases accessed via the not-so-reliable Internet, for which a practical generic sync middleware of low maintenance and operation costs is much wanted. To meet this demand, this paper presents a generic sync middleware system (GSMS), which has been developed, applied and optimized since 2006, holding the principles, or advantages, that it must be SyncML-compliant and transparent to data application layer logic without referring to implementation details of the databases synced, that it does not rely on the host operating systems deployed, and that its construction is lightweight and hence of low cost. Against these hard commitments, in this paper we stress the significant optimization breakthrough of GSMS: a sync delay well below a fraction of a millisecond per record synced. As a persuasive example, a series of extreme tests of GSMS sync performance was conducted, in which the source relational database underwent a broad range of write loads (from one thousand to one million intensive writes within a few minutes). All these tests showed that the performance of GSMS is competent and smooth even under extreme write loads.

Keywords: Heterogeneous massive data, instantly sync intensive writes, Internet generic middleware design, optimization.

240 Simulation of the Pedestrian Flow in the Tawaf Area Using the Social Force Model

Authors: Zarita Zainuddin, Kumatha Thinakaran, Mohammed Shuaib

Abstract:

In today's world, the number of vehicles on the road is increasing. This causes more people to choose walking instead of traveling by vehicle. Thus, proper planning of pedestrian paths is important to ensure the safety of pedestrians in a walking area. Crowd dynamics studies pedestrian behavior and models pedestrian movement to ensure safety along walking paths. To date, many models have been designed to ease pedestrian movement. The Social Force Model is widely used among researchers, as it is simpler and provides better simulation results. We discuss the problem regarding the ritual of circumambulating the Ka'aba (Tawaf), where the entrances to this area are usually congested, a situation that worsens during the Hajj season. We use the computer simulation model SimWalk, which is based on the Social Force Model, to simulate the movement of pilgrims in the Tawaf area. We first discuss the effect of unidirectional and bidirectional flows at the gates. We then restrict certain gates to serve as entrances only and others as exits only. From the simulations, we study the effect of the distance of the entrances from the beginning line on the time pilgrims take to circumambulate the Ka'aba. We distribute the pilgrims evenly among the different entrances so that congestion at the entrances can be reduced. We also discuss various locations and designs of barriers at the exits and their effect on the time taken for the pilgrims to exit the Tawaf area.
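
SimWalk itself is a commercial tool, but the Social Force Model it builds on can be sketched compactly: each pedestrian relaxes toward a desired velocity aimed at a goal while being repelled exponentially by neighbours. The parameter values below are common illustrative choices, not SimWalk's internal settings.

```python
import numpy as np

tau, A, B = 0.5, 2.0, 0.3        # relaxation time (s), repulsion strength/range (assumed)
v0, dt = 1.3, 0.05               # desired speed (m/s), time step (s)

pos = np.array([[0.0, 0.0], [1.0, 0.2], [0.5, -0.3]])   # pedestrian positions (m)
vel = np.zeros_like(pos)
goal = np.array([10.0, 0.0])                            # a common exit point

for _ in range(400):
    to_goal = goal - pos
    desired = v0 * to_goal / np.linalg.norm(to_goal, axis=1, keepdims=True)
    drive = (desired - vel) / tau                       # driving force term
    rep = np.zeros_like(pos)
    for i in range(len(pos)):                           # pairwise social repulsion
        for j in range(len(pos)):
            if i != j:
                d = pos[i] - pos[j]
                dist = np.linalg.norm(d)
                rep[i] += A * np.exp(-dist / B) * d / dist
    vel += (drive + rep) * dt
    pos += vel * dt

print(pos)                        # pedestrians advanced toward the goal
```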

Keywords: circumambulation, Ka'aba, pedestrian flow, SFM, Tawaf, entrance, exit

239 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models

Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu

Abstract:

A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping, and disaster management. To express the Earth's surface as a mathematical model, an infinite number of point measurements would be needed. Since this is impossible, points at regular intervals are measured to characterize the Earth's surface, and a DTM of the Earth is generated. Hitherto, classical measurement techniques and photogrammetry have been in widespread use for the construction of DTMs. At present, RADAR, LiDAR, and stereo satellite images are also used. In recent years, especially because of its advantages, airborne Light Detection and Ranging (LiDAR) has seen increased use in DTM applications: a 3D point cloud is created with LiDAR technology by obtaining numerous point data. More recently, developments in image mapping methods and the use of unmanned aerial vehicles (UAVs) for photogrammetric data acquisition have increased DTM generation from image-based point clouds. The accuracy of a DTM depends on various factors such as the data collection method, the distribution of elevation points, the point density, the properties of the surface, and the interpolation method. In this study, a random data reduction method is evaluated for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) is reduced to a series of subsets by using a random algorithm, representing 75, 50, 25 and 5% of the original data set. Over the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original image-based point cloud data set is compared with DTMs interpolated from the reduced data sets by the Kriging interpolation method. The results show that the random data reduction method can reduce image-based point cloud data sets to a 50% density level while still maintaining the quality of the DTM.
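
The random reduction step itself can be stated precisely in a few lines; the sketch below uses a synthetic point cloud (the paper's data come from UAV photogrammetry, and each subset would subsequently be interpolated into a DTM by Kriging).

```python
import numpy as np

rng = np.random.default_rng(42)
points = rng.uniform(0, 1000, (100_000, 3))   # synthetic x, y, z points (m)

subsets = {}
for pct in (75, 50, 25, 5):                   # density levels from the study
    k = int(len(points) * pct / 100)
    idx = rng.choice(len(points), size=k, replace=False)   # random reduction
    subsets[pct] = points[idx]                # each subset then goes to Kriging

for pct, sub in subsets.items():
    print(f"{pct:>3}% subset: {len(sub)} points")
```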

Keywords: DTM, unmanned aerial vehicle, UAV, random, Kriging.

238 Numerical Solution of Transient Natural Convection in Vertical Heated Rectangular Channel between Two Vertical Parallel MTR-Type Fuel Plates

Authors: Djalal Hamed

Abstract:

The aim of this paper is to perform, by means of the finite volume method, a numerical solution of transient natural convection in a narrow rectangular channel between two vertical parallel Material Testing Reactor (MTR)-type fuel plates subjected to a heat flux with a cosine shape, in order to determine the margin of nuclear core power at which the natural convection cooling mode can ensure safe core cooling, where the cladding temperature should not reach a specific safety limit (90 °C). For this purpose, a computer program was developed to determine the principal parameters related to nuclear core safety, such as the temperature distribution in the fuel plate and in the coolant (light water) as a function of the reactor core power. From the results obtained, we note that the core power should not exceed 400 kW in order to ensure safe passive removal of residual heat from the nuclear core by the upward natural convection cooling mode.
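
As a much-reduced illustration of the finite volume approach, the sketch below marches an explicit 1-D energy balance for the coolant along the channel under a cosine-shaped wall heat flux. The geometry, properties, and power level are illustrative assumptions, not the MTR core data of the paper; the full model also resolves the plate temperature and the buoyancy-driven flow.

```python
import numpy as np

L, n = 0.6, 60                        # heated length (m), control volumes
dz = L / n
rho, cp = 998.0, 4182.0               # water density (kg/m^3), heat capacity (J/kg.K)
m_dot, A_flow, P_heat = 0.02, 2.5e-4, 0.11   # flow (kg/s), flow area (m^2), heated perimeter (m)
q0 = 4.0e4                            # assumed peak heat flux (W/m^2)

z = (np.arange(n) + 0.5) * dz
q = q0 * np.cos(np.pi * (z - L / 2) / L)     # cosine axial heat flux shape

T, T_in, dt = np.full(n, 20.0), 20.0, 0.01   # initial/inlet temperature (deg C), step (s)
for _ in range(20_000):               # explicit finite-volume time marching
    Tf = np.concatenate(([T_in], T))         # upwind face temperatures
    adv = m_dot * cp * (Tf[:-1] - Tf[1:])    # enthalpy in minus out, per volume
    src = q * P_heat * dz                    # wall heat addition, per volume
    T += dt * (adv + src) / (rho * cp * A_flow * dz)

print(f"outlet coolant temperature = {T[-1]:.1f} deg C")
```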

Keywords: Buoyancy force, friction force, friction factor, finite volume method, transient natural convection, thermal hydraulic analysis, vertical heated rectangular channel.

237 MATLAB-based System for Centralized Monitoring and Self Restoration against Fiber Fault in FTTH

Authors: Mohammad Syuhaimi Ab-Rahman, Boonchuan Ng, Kasmiran Jumari

Abstract:

This paper presents a MATLAB-based system named Smart Access Network Testing, Analyzing and Database (SANTAD), designed for in-service transmission surveillance and self-restoration against fiber faults in fiber-to-the-home (FTTH) access networks. The developed program is installed with the optical line terminal (OLT) at the central office (CO) to monitor the status of the network and detect any fiber fault that occurs downstream from the CO towards residential customer locations. SANTAD is interfaced with an optical time domain reflectometer (OTDR) to accumulate every network testing result and display it on a single computer screen for further analysis. The program identifies and presents the parameters of each optical fiber line, such as the line's status (working or non-working condition), the magnitude of loss at each point, the failure location, and other details shown on the OTDR's screen. The failure status is delivered to field engineers for prompt action, while traffic on the failed line is diverted to a protection line to keep it flowing continuously. This approach has bright prospects for improving survivability and reliability as well as increasing efficiency and monitoring capabilities in FTTH.

Keywords: MATLAB, SANTAD, in-service transmission surveillance, self restoration, fiber fault, FTTH

236 Local Buckling of Web-Core and Foam-Core Sandwich Panels

Authors: Ali N. Suri, Ahmad A. Al-Makhlufi

Abstract:

Sandwich construction is widely accepted as a method of construction, especially in the aircraft industry. It is a type of stressed-skin construction formed by bonding two thin faces to a thick core: the faces resist all of the applied edge loads and provide all or nearly all of the required rigidity, while the core spaces the faces to increase the cross-section moment of inertia about the common neutral axis and transmits shear between them, provided a perfect bond between core and faces is made.

Material for the face sheets can be metal or reinforced plastic laminates; core materials can be metallic cores of thin sheets forming corrugations or honeycomb, or non-metallic cores of balsa wood, plastic foams, or honeycomb made of reinforced plastics.

Under in-plane axial loading, web-core and web-foam-core sandwich panels can fail by local buckling of the plates forming the cross section, with a buckling wavelength of the order of the spacing between webs.

In this study, local buckling of web-core and web-foam-core sandwich panels is analyzed for given facing and core materials and given overall panel dimensions, for different combinations of cross-section geometries.

The Finite Strip Method is used for the analysis, and a Fortran-based computer program was developed and used.
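
For orientation, the classical thin-plate estimate that a finite strip analysis refines is sigma_cr = k pi^2 E / (12 (1 - nu^2)) (t/b)^2, with b the spacing between webs. A sketch with assumed material and section numbers:

```python
import math

def local_buckling_stress(E, nu, t, b, k=4.0):
    """Critical elastic buckling stress of a simply supported plate (Pa)."""
    return k * math.pi ** 2 * E / (12 * (1 - nu ** 2)) * (t / b) ** 2

E, nu = 70e9, 0.33        # assumed aluminium face sheet properties
t, b = 1.0e-3, 50e-3      # assumed face thickness and web spacing (m)

print(f"sigma_cr = {local_buckling_stress(E, nu, t, b) / 1e6:.1f} MPa")
```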

Keywords: Local Buckling, Finite Strip, Sandwich panels, Web and foam core.

235 3D Liver Segmentation from CT Images Using a Level Set Method Based on a Shape and Intensity Distribution Prior

Authors: Nuseiba M. Altarawneh, Suhuai Luo, Brian Regan, Guijin Tang

Abstract:

Liver segmentation from medical images poses more challenges than analogous segmentations of other organs. This contribution introduces a liver segmentation method for a series of computed tomography images. Overall, we present a novel method for segmenting the liver by coupling density matching with shape priors. Density matching signifies a tracking method that operates by maximizing the Bhattacharyya similarity measure between the photometric distribution of an estimated image region and a model photometric distribution. Density matching controls the direction of the evolution process and slows down the evolving contour in regions with weak edges. The shape prior improves the robustness of density matching and discourages the evolving contour from exceeding the liver's boundaries in regions with weak boundaries. The model is implemented using a modified distance regularized level set (DRLS) model. The experimental results show that the method achieves satisfactory results. Comparison with the original DRLS model makes it evident that the proposed model is more effective in addressing the over-segmentation problem. Finally, we gauge the performance of our model against metrics comprising accuracy, sensitivity, and specificity.
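
The Bhattacharyya similarity at the heart of the density matching term can be stated compactly; the histograms below are illustrative stand-ins for the photometric distributions of an estimated image region and the model.

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two normalised histograms (1 = identical)."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(np.sqrt(p * q)))

rng = np.random.default_rng(1)
region_hist = np.histogram(rng.normal(120, 15, 5000),
                           bins=64, range=(0, 255))[0].astype(float)
model_hist = np.histogram(rng.normal(125, 15, 5000),
                          bins=64, range=(0, 255))[0].astype(float)

# The contour evolution would move so as to increase this similarity.
print(f"similarity = {bhattacharyya(region_hist, model_hist):.3f}")
```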

Keywords: Bhattacharyya distance, distance regularized level set (DRLS) model, liver segmentation, level set method.

234 Pre-Service Teachers’ Assessment of Information Technology Application to Instruction

Authors: Adesanya Anuoluwapo Olusola

Abstract:

Technology has moved into the classroom, and it is now difficult to talk about achievement in, and attitude to, learning without mentioning it. The use of technology makes learning easy, real and practical, as it motivates learners, sustains their interest and improves their attitude to learning. This study therefore examined pre-service teachers' assessment of the application of information technology to instruction. The use of technology emphasizes and encourages active learning in the classroom. The study involved 100 pre-service teachers in two selected Colleges of Education in Nigeria. Purposive random sampling was used in selecting the participants, and an ex-post facto design, in which there is no manipulation of variables, was adopted. Two valid and reliable instruments were used for data collection: Access to ICT Facilities and Application of ICT. The study established that pre-service teachers have little access to ICT facilities and to the application of ICT in the colleges, apart from those students who have access outside the college. Also, few pre-service teachers used ICT facilities on a weekly or monthly basis. It was concluded that student resource centres and campus-wide wireless connectivity must be established so as to improve and enhance students' achievement in, and attitude to, learning. The time and attention devoted to learning activities, strategic specialized ICT skills and requisite entrepreneurial skills should be increased so that students have easy access to information sources and are able to apply them in the teaching process.

Keywords: Computer, ICT Application, Learning Facilities, Pre-Service Teachers.

233 Dynamic Web-Based 2D Medical Image Visualization and Processing Software

Authors: Abdelhalim N. Mohammed, Mohammed Y. Esmail

Abstract:

In recent decades, medical imaging has been dominated by the use of costly film media for the review and archival of medical investigations; however, due to developments in network technologies and the common acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard, another approach based on the World Wide Web has been produced. Web technologies have been used successfully in telemedicine applications, and here the combination of web technologies with DICOM is used to design a web-based, open-source DICOM viewer. The web server allows query and retrieval of images, and the images are viewed and manipulated inside a web browser without the need to preinstall any software. The dynamic site pages for medical image visualization and processing were created using JavaScript and HTML5. The XAMPP 'apache server' is used to create a local web server for testing and deployment of the dynamic site. The web-based viewer is connected to multiple devices through a local area network (LAN) to distribute the images inside healthcare facilities. The system offers several advantages over ordinary picture archiving and communication systems (PACS): it is easy to install and maintain, it is platform-independent, allowing images to be displayed and manipulated efficiently, and it is user-friendly and easy to integrate with existing systems that already make use of web technologies. A wavelet-based image compression technique is used, in which a 2-D discrete wavelet transform decomposes the image and the wavelet coefficients are thresholded and transmitted with entropy encoding to decrease transmission time, storage cost and capacity. The performance of the compression was estimated by using image quality metrics such as the mean square error (MSE), the peak signal-to-noise ratio (PSNR), and the compression ratio (CR), which reached 83.86% when the 'coif3' wavelet filter was used.
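
A minimal sketch of the compression path under stated assumptions: a 2-D discrete wavelet transform with the 'coif3' filter named above, hard thresholding of the coefficients, and MSE/PSNR as quality metrics. The threshold value and the stand-in image are illustrative; PyWavelets is used here, whereas the deployed system is web-based.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
image = rng.integers(0, 256, (256, 256)).astype(float)  # stand-in for a DICOM slice

coeffs = pywt.wavedec2(image, "coif3", level=3)
arr, slices = pywt.coeffs_to_array(coeffs)
arr[np.abs(arr) < 20.0] = 0.0                 # hard threshold (assumed value)
recon = pywt.waverec2(pywt.array_to_coeffs(arr, slices, output_format="wavedec2"),
                      "coif3")[:256, :256]

mse = np.mean((image - recon) ** 2)
psnr = 10 * np.log10(255 ** 2 / mse)           # peak signal-to-noise ratio (dB)
zeroed = 1 - np.count_nonzero(arr) / arr.size  # fraction of coefficients removed
print(f"MSE={mse:.2f}  PSNR={psnr:.2f} dB  coefficients zeroed={zeroed:.2%}")
```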

Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN.

232 Automatic Segmentation of Dermoscopy Images Using Histogram Thresholding on Optimal Color Channels

Authors: Rahil Garnavi, Mohammad Aldeen, M. Emre Celebi, Alauddin Bhuiyan, Constantinos Dolianitis, George Varigos

Abstract:

Automatic segmentation of skin lesions is the first step towards the development of a computer-aided diagnosis of melanoma. Although numerous segmentation methods have been developed, few studies have focused on determining the most discriminative and effective color space for the melanoma application. This paper proposes a novel automatic segmentation algorithm using color space analysis and clustering-based histogram thresholding, which is able to determine the optimal color channel for segmentation of skin lesions. To demonstrate the validity of the algorithm, it is tested on a set of 30 high-resolution dermoscopy images and a comprehensive evaluation of the results is provided, where borders manually drawn by four dermatologists are compared to the automated borders detected by the proposed algorithm. The evaluation is carried out by applying three previously used metrics of accuracy, sensitivity, and specificity and a new metric of similarity. Through ROC analysis and ranking of the metrics, it is shown that the best results are obtained with the X and XoYoR color channels, which result in an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods, which demonstrates the effectiveness and superiority of the proposed segmentation method.
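
A minimal sketch of thresholding a single colour channel, here the X channel of CIE XYZ, with Otsu's clustering-based threshold standing in for the paper's own thresholding scheme; the input path and the inverted-binary polarity (lesions darker than skin) are assumptions.

```python
import cv2

img = cv2.imread("dermoscopy.jpg")                    # assumed input image
xyz = cv2.cvtColor(img, cv2.COLOR_BGR2XYZ)
x_channel = xyz[:, :, 0]                              # candidate optimal channel

_, lesion_mask = cv2.threshold(x_channel, 0, 255,
                               cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(lesion_mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)  # candidate lesion borders
```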

Keywords: Border detection, Color space analysis, Dermoscopy, Histogram thresholding, Melanoma, Segmentation.

231 3D Rendering of American Sign Language Finger-Spelling: A Comparative Study of Two Animation Techniques

Authors: Nicoletta Adamo-Villani

Abstract:

In this paper we report a study aimed at determining the most effective animation technique for representing ASL (American Sign Language) finger-spelling. Specifically, in the study we compare two commonly used 3D computer animation methods (keyframe animation and motion capture) in order to ascertain which technique produces the most 'accurate', 'readable', and 'close to actual signing' (i.e. realistic) rendering of ASL finger-spelling. To accomplish this goal we developed 20 animated clips of finger-spelled words and designed an experiment consisting of a web survey with rating questions. Seventy-one subjects aged 19-45 participated in the study. Results showed that recognition of the words was correlated with the method used to animate the signs. In particular, the keyframe technique produced the most accurate representation of the signs (i.e., participants were more likely to identify the words correctly in keyframed sequences than in motion-captured ones). Further, findings showed that the animation method had an effect on the reported scores for readability and closeness to actual signing; the estimated marginal mean readability and closeness were greater for keyframed signs than for motion-captured signs. To our knowledge, this is the first study aimed at measuring and comparing the accuracy, readability and realism of ASL animations produced with different techniques.

Keywords: 3D Animation, American Sign Language, Deaf Education, Motion Capture.

230 Facial Expression Phoenix (FePh): An Annotated Sequenced Dataset for Facial and Emotion-Specified Expressions in Sign Language

Authors: Marie Alaghband, Niloofar Yousefi, Ivan Garibay

Abstract:

Facial expressions are important parts of both gesture and sign language recognition systems. Despite recent advances in both fields, annotated facial expression datasets in the context of sign language are still scarce resources. In this manuscript, we introduce an annotated sequenced facial expression dataset in the context of sign language, comprising over 3000 facial images extracted from the daily news and weather forecasts of the public TV station PHOENIX. Unlike the majority of currently existing facial expression datasets, FePh provides sequenced, semi-blurry facial images with different head poses, orientations, and movements. In addition, in the majority of images, the identities are mouthing the words, which makes the data more challenging. To annotate this dataset we consider primary, secondary, and tertiary dyads of the seven basic emotions of "sad", "surprise", "fear", "angry", "neutral", "disgust", and "happy". We also considered the "None" class if the image's facial expression could not be described by any of the aforementioned emotions. Although we provide FePh as a facial expression dataset of signers in sign language, it has wider applications in gesture recognition and Human Computer Interaction (HCI) systems.

Keywords: Annotated Facial Expression Dataset, Sign Language Recognition, Gesture Recognition, Sequenced Facial Expression Dataset.

229 Problems and Prospects of Agricultural Biotechnology in Nigeria’s Developing Economy

Authors: Samson Abayomi Olasoju, Olufemi Adekunle, Titilope Edun, Johnson Owoseni

Abstract:

Science offers opportunities for revolutionizing human activities, enriched by input from scientific research and technology. Biotechnology is a major force for development in developing countries such as Nigeria. It has been found to contribute to solving human problems, like water and food insecurity, that impede national development and threaten peace wherever it is applied. This review identified the problems of agricultural biotechnology in Nigeria. On the part of rural farmers, there is a lack of adequate knowledge or awareness of biotechnology, despite the fact that they constitute the bulk of Nigerian farmers. On the part of the government, the problems include a lack of adequate implementation of government policy on bio-safety and genetically modified products, and inadequate funding of education as well as of research and development of products related to biotechnology. Other problems include inadequate infrastructure (including laboratories), poor funding, and a lack of the national strategies needed for the development and running of agricultural biotechnology. In spite of all the challenges associated with agricultural biotechnology, its prospects still remain great if Nigeria is to meet the food needs of the country's ever-increasing population. The introduction of genetically engineered products will lead to the high productivity needed for commercialization and food security. Crops and livestock resistant to insects, viruses, and other related diseases are another viable area of contribution of biotechnology to agricultural production. In conclusion, agricultural biotechnology will not only ensure food security but will also ensure that local farmers utilize the appropriate technology needed for large-scale production, leading to the prosperity of the farmers and national economic growth, provided the government plays its role of adequate funding and good policy implementation.

Keywords: Biosafety, biotechnology, food security, genetic engineering, genetic modification.

228 Enhanced Disk-Based Databases Towards Improved Hybrid In-Memory Systems

Authors: Samuel Kaspi, Sitalakshmi Venkatraman

Abstract:

In-memory database systems are becoming popular due to the availability and affordability of sufficiently large RAM and processors in modern high-end servers with the capacity to manage large in-memory database transactions. While fast and reliable in-memory systems are still being developed to overcome cache misses, CPU/IO bottlenecks and distributed transaction costs, disk-based data stores still serve as the primary persistence. In addition, with the recent growth in multi-tenancy cloud applications and associated security concerns, many organisations consider the trade-offs and continue to require the fast and reliable transaction processing of disk-based database systems as an available choice. For these organisations, the only way of increasing throughput is by improving the performance of disk-based concurrency control. This warrants a hybrid database system with the ability to selectively apply enhanced disk-based data management within the context of in-memory systems that would help improve overall throughput. The general view is that in-memory systems substantially outperform disk-based systems. We question this assumption and examine how a modified variation of access invariance, which we call enhanced memory access (EMA), can be used to allow very high levels of concurrency in the pre-fetching of data in disk-based systems. We demonstrate how this prefetching in disk-based systems can yield close to in-memory performance, which paves the way for improved hybrid database systems. This paper proposes a novel EMA technique and presents a comparative study between disk-based EMA systems and in-memory systems running on hardware configurations of equivalent power in terms of the number of processors and their speeds. The results of the experiments conducted clearly substantiate that, when used in conjunction with all concurrency control mechanisms, EMA can increase the throughput of disk-based systems to levels quite close to those achieved by in-memory systems. The promising results of this work show that enhanced disk-based systems help improve hybrid data management within the broader context of in-memory systems.

Keywords: Concurrency control, disk-based databases, in-memory systems, enhanced memory access (EMA).

227 Enhancing Cache Performance Based on Improved Average Access Time

Authors: Jasim. A. Ghaeb

Abstract:

A high performance computer includes a fast processor and millions of bytes of memory. During data processing, huge amounts of information are shuffled between the memory and the processor. Because of its small size and its speed, cache has become a common feature of high performance computers. Enhancing cache performance has proved to be essential in speeding up cache-based computers. Most enhancement approaches can be classified as either software based or hardware controlled. The performance of the cache is quantified in terms of the hit ratio or miss ratio. In this paper, we optimize cache performance by enhancing the cache hit ratio. Optimum cache performance is obtained by focusing on a cache hardware modification that quickly rejects the mismatched line tags at the hit-or-miss comparison stage, thus achieving a low hit time for the wanted line in the cache. In the proposed technique, which we call Even-Odd Tabulation (EOT), the cache lines coming from main memory into the cache are classified into two types, even line tags and odd line tags, depending on their Least Significant Bit (LSB). This division is exploited by the EOT technique to reject mismatched line tags in a very short time compared to the time spent by the main comparator in the cache, giving an optimum hit time for the wanted cache line. The high performance of the EOT technique against the familiar FAM mapping technique is shown in the simulation results.
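
A minimal software sketch of the EOT idea under stated assumptions (a toy tag store rather than hardware comparators): tags are split by their least significant bit, so a lookup consults only the matching-LSB half and rejects the other half immediately.

```python
even_tags, odd_tags = set(), set()      # the two EOT tag tables

def insert(tag):
    (odd_tags if tag & 1 else even_tags).add(tag)

def lookup(tag):
    # Quick rejection: only the table whose LSB matches is ever searched,
    # halving the candidate tags before the full comparison.
    table = odd_tags if tag & 1 else even_tags
    return tag in table                 # hit or miss

for t in (0b1010, 0b0111, 0b1100):
    insert(t)
print(lookup(0b0111), lookup(0b0110))   # True False
```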

Keywords: Caches, Cache performance, Hit time, Cache hit ratio, Cache mapping, Cache memory.

226 Ergonomics and Its Applicability in the Design Process in Egypt Challenges and Prospects

Authors: Mohamed Moheyeldin Mahmoud

Abstract:

Egypt suffers from a severe shortage of data and charts concerning the physical dimensions, measurements, qualities and consumer behavior of its population. The shortage of needed information and appropriate methods has forced the Egyptian designer to use foreign standards when designing a product for the Egyptian consumer, which has led to many problems. The urgently needed database concerning the physical specifications and measurements of Egyptian consumers, as well as the need to support the ergonomics courses given in many colleges and institutes with the latest technologies, is stated as the research problem. The methodology used by the researcher is the descriptive analytical method, relying on compiling, comparing and analyzing information and facts in order to arrive at acceptable perceptions, ideas and considerations. The research concludes that: 1. A good interaction relationship between users and products shows the success of the product. 2. An integrated linkage between the most prominent relevant fields of science, especially ergonomics, interaction design and ethnography, should be encouraged, to provide an up-to-date database concerning the nature, specifications and environment of the Egyptian consumer, in order to achieve a higher benefit for both user and product. 3. The Chinese economic policy, based on studying market requirements long before any market activities, should be emulated. 4. Ethnography supports design activities in creating new products or updating existing ones by measuring the compatibility of products with their environment and user expectations. The researcher also recommends establishing joint cooperation between military colleges and sports education institutes on one side and design institutes on the other, to provide an annually updated database concerning specifications of the students of both sexes enrolled in those institutes (height, weight, etc.), providing the industrial designer with the information needed when creating a new product or updating an existing one for that category.

Keywords: Adapt ergonomics, ethnography, interaction design.

225 On Combining Support Vector Machines and Fuzzy K-Means in Vision-based Precision Agriculture

Authors: A. Tellaeche, X. P. Burgos-Artizzu, G. Pajares, A. Ribeiro

Abstract:

One important objective in Precision Agriculture is to minimize the volume of herbicides applied to fields through the use of site-specific weed management systems. In order to reach this goal, two major factors need to be considered: 1) the similar spectral signature, shape and texture of weeds and crops; 2) the irregular distribution of the weeds within the crop field. This paper outlines an automatic computer vision system for the detection and differential spraying of Avena sterilis, a noxious weed growing in cereal crops. The proposed system involves two processes: image segmentation and decision making. Image segmentation combines basic, suitable image processing techniques in order to extract cells from the image as the low-level units. Each cell is described by two area-based attributes measuring the relations between the crops and the weeds. From these attributes, a hybrid decision-making approach determines whether or not a cell must be sprayed. The hybrid approach uses the Support Vector Machines and Fuzzy k-Means methods, combined through fuzzy aggregation theory. This constitutes the main contribution of this paper. The method's performance is compared against other available strategies.
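
A minimal sketch of the fusion step under stated assumptions: an SVM posterior and a fuzzy k-means style membership for the "spray" class are combined by a simple aggregation (their mean, with a 0.5 cut). The features, training data, and aggregation operator are illustrative, not the paper's exact fuzzy aggregation.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(0, 1, (200, 2))                     # two area-based cell attributes
y = (X[:, 0] + X[:, 1] > 0).astype(int)            # 1 = cell should be sprayed (toy labels)

svm = SVC(probability=True).fit(X, y)

# Fuzzy c-means style membership (fuzzifier m = 2) to the "spray" centroid.
c_spray, c_clean = X[y == 1].mean(axis=0), X[y == 0].mean(axis=0)

def fuzzy_membership(x):
    d1 = np.linalg.norm(x - c_spray) + 1e-9
    d0 = np.linalg.norm(x - c_clean) + 1e-9
    return (1 / d1 ** 2) / (1 / d1 ** 2 + 1 / d0 ** 2)

cell = np.array([0.4, 0.3])
score = (svm.predict_proba([cell])[0, 1] + fuzzy_membership(cell)) / 2
print("spray" if score > 0.5 else "do not spray")  # fused decision
```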

Keywords: Fuzzy k-Means, Precision agriculture, Support Vector Machines, Weed detection.

224 Accurate Visualization of Graphs of Functions of Two Real Variables

Authors: Zeitoun D. G., Thierry Dana-Picard

Abstract:

The study of a real function of two real variables can be supported by visualization using a Computer Algebra System (CAS). One type of constraint of such systems is due to the algorithms implemented, which yield continuous approximations of the given function by interpolation. This often masks discontinuities of the function and can produce strange plots that are not compatible with the mathematics. In recent years, point-based geometry has gained increasing attention as an alternative surface representation, both for efficient rendering and for flexible geometry processing of complex surfaces. In this paper we present different artifacts created by mesh surfaces near discontinuities and propose a point-based method that controls and reduces these artifacts. A least squares penalty method for the automatic generation of a mesh that respects the behavior of the chosen function is presented. The special feature of this method is its ability to improve the accuracy of the surface visualization near a set of interior points where the function may be discontinuous. The method is formulated as a minimax problem, and the non-uniform mesh is generated using an iterative algorithm. Results show that for large, poorly conditioned matrices, the new algorithm gives more accurate results than the classical preconditioned conjugate gradient algorithm.
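
The artifact in question is easy to reproduce and to suppress in one dimension; the sketch below masks the samples adjacent to a detected jump so a plotted interpolant does not draw a spurious wall across the discontinuity (a 1-D stand-in for the two-variable surface case, with an assumed jump tolerance).

```python
import numpy as np

x = np.linspace(-2, 2, 400)
y = np.where(x < 0, -1.0, 1.0) + 0.1 * x      # jump discontinuity at x = 0

jump = np.abs(np.diff(y))
mask = np.concatenate([jump > 0.5, [False]])  # flag samples next to a large jump
y_plot = y.copy()
y_plot[mask] = np.nan                         # NaN breaks the drawn curve there

# plt.plot(x, y_plot) would now show two separate branches instead of a
# false vertical segment joining them across x = 0.
```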

Keywords: Function singularities, mesh generation, point allocation, visualization, collocation least squares method, Augmented Lagrangian method, Uzawa's Algorithm, Preconditioned Conjugate Gradient

223 Potential of Irish Orientated Strand Board in Bending Active Structures

Authors: M. Collins, B. O’Regan, T. Cosgrove

Abstract:

To determine the potential of a low-cost Irish engineered timber product to replace high-cost solid timber for use in bending-active structures such as gridshells, a single Irish engineered timber product in the form of orientated strand board (OSB) was selected. A comparative study of OSB and solid timber was carried out to determine the optimum properties that make a material suitable for use in gridshells. Three parameters were identified as relevant to the selection of a material for gridshells: the strength-to-stiffness ratio, the flexural stiffness of commercially available sections, and the variability of material and section properties. It is shown that, when comparing OSB against solid timber, OSB is a more suitable material for gridshells that are at the smaller end of the scale and have tight radii of curvature. Typically, for solid timber, stiffness is used as an indicator of strength, and engineered timber is no different; thus, low flexural stiffness would mean low flexural strength. However, when it comes to bending-active gridshells, OSB offers a significant advantage: by the addition of multiple layers, an increased section size is created, endowing the structure with higher stiffness and higher strength from initially low-stiffness, low-strength material, while still maintaining tight radii of curvature (see the sketch below). This allows OSB to compete with solid timber on large-scale gridshells. Additionally, a preliminary sustainability study using a set of sustainability indicators was carried out to determine the relative sustainability of building a large-scale gridshell in Ireland, with a primary focus on economic viability but with mention also given to social and environmental aspects. For this, the Savill Garden gridshell in the UK was used as the functional unit, with the sustainability of the structural roof skeleton constructed from UK larch solid timber being compared with the same structure using Irish OSB. Although the advantages of using commercially available OSB in a bending-active gridshell are marginal and limited to specific gridshell applications, further study into an optimised engineered timber product is merited.
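
A minimal sketch of the two quantities this comparison turns on: the minimum elastic bend radius R_min = E t / (2 sigma) of a single lath, and the growth of bending stiffness EI when n layers act as one bonded section. The OSB material numbers are illustrative assumptions, not measured values from the study.

```python
def min_bend_radius(E, t, sigma):
    """Smallest radius (m) a lath of thickness t can be bent to elastically."""
    return E * t / (2 * sigma)

def layered_stiffness(E, b, t, n):
    """EI (N.m^2) of n fully bonded layers acting as one section."""
    return E * b * (n * t) ** 3 / 12

E, sigma = 4.0e9, 18e6        # assumed OSB modulus and bending strength (Pa)
b, t = 0.05, 0.011            # assumed lath width and single-layer thickness (m)

print(f"R_min, one layer: {min_bend_radius(E, t, sigma):.2f} m")
print(f"EI one layer: {layered_stiffness(E, b, t, 1):.1f} N.m^2, "
      f"three bonded layers: {layered_stiffness(E, b, t, 3):.1f} N.m^2")
```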

Keywords: Bending-active gridshells, high-end timber structures, low-cost material, sustainability.

222 Reutilization of Organic and Peat Soils by Deep Cement Mixing

Authors: Bee-Lin Tang, Ismail Bakar, Chee - Ming Chan

Abstract:

Limited infrastructure development on peat and organic soils is a serious geotechnical issue common to many countries of the world, especially Malaysia, which has about 1.5 million ha of these problematic soils. These soils have high water content and high organic content, exhibit different mechanical properties, and may also change chemically and biologically with time. Constructing structures on peaty ground involves the risk of ground failure and extreme settlement. Nowadays, much effort is needed to make peatlands usable for construction due to increased land use. The deep mixing method, employing cement as a binder, is generally used as a measure against peaty/organic ground failure. The technique is widely adopted because it can improve the ground considerably in a short period of time. An understanding of the geotechnical properties of these soils, such as shear strength, stiffness and compressibility behavior, is required before construction can continue on them. Therefore, 1-1.5 m peat soil samples from the state of Johor and an organic soil from Melaka, Malaysia, were investigated. Cement was added to the soil in the pre-mixing stage with water-cement ratios in the range 3.5, 7, 14, 140 for peats and 5, 10, 30 for organic soils, essentially to modify the original soil textures and properties. The mixtures, in slurry form, were poured into polyvinyl chloride (PVC) tubes and cured at a room temperature of 25 °C for 7, 14 and 28 days. Laboratory experiments, including unconfined compressive strength and bender element tests, were conducted to monitor the improved strength and stiffness of the 'stabilised mixed soils'. In addition, scanning electron microscope (SEM) observations were made to investigate changes in the microstructure of the stabilised soils and to evaluate the hardening effect of cement on stabilised peat and organic soils. This preliminary effort indicates that pre-mixing peat and organic soils with cement contributes to gaining soil strength and will help engineers establish a new method for improving such problematic ground in practical, long-term applications.

Keywords: peat soils, organic soils, cement stabilisation, strength, stiffness.

221 Information and Communication Technologies in Collaboration Projects via the Internet

Authors: Murat Öztok, Nesrin Özdener

Abstract:

The aim of this study is to determine the basic information and communication technology (ICT) skills that may be needed by students in the 8th grade of primary education for their cooperative project work implemented via the Internet. Within the scope of the study, the curriculum used for the European Computer Driving Licence (ECDL) and the curriculum used in Turkey are also compared in terms of the ICT skills they aim to provide to students. The research population of the study, which used a pre-test/post-test control group experimental model, consisted of 40 students from three different schools. In the first stage of the study, the skills that might be needed by students for cooperative project work implemented via the Internet were determined through examination of completed Comenius, eTwinning and World Links projects. In the second stage, the curricula of the Turkish Ministry of National Education (MEB) and the ECDL were evaluated by seven different teachers in line with these skills. The ECDL and MEB curricula were also compared in terms of their capability to provide the skills needed to implement cooperative projects via the Internet. In line with the findings of the study, the skills that might be needed by students to implement cooperative projects via the Internet were outlined, and a significant difference was established in favor of the ECDL curriculum upon comparison of both curricula in accordance with this outline (U = 50.500; p < 0.05). The findings of the study also suggested that the students had considerable deficiencies in implementing cooperative projects via the Internet without an ICT infrastructure.

Keywords: Collaboration Projects, Comenius, Curriculum, ICT.

220 Corporate Governance and Corporate Social Responsibility: Research on the Interconnection of Both Concepts and Its Impact on Non-Profit Organizations

Authors: Helene Eller

Abstract:

The aim of non-profit organisations (NPOs) is to provide services and goods for their clientele, with profit being a minor objective. With this definition as the basic purpose of doing business, it is obvious that the goal of such an organisation is to serve several bottom lines and not only the financial one. This approach is underpinned by the non-distribution constraint, which means that NPOs are allowed to make profits to a certain extent, but not to distribute them. The advantage is that there are no single shareholders who might have an interest in the prosperity of the organisation: there is no pie to divide. The gained profits remain within the organisation and are reinvested in purposeful projects. Good governance is mandatory to support the aims of NPOs. When looking for a measure of good governance, the principles of corporate governance (CG) come to mind. The purpose of CG is direction and control, and in the field of NPOs, CG is enlarged to consider the relationship to all important stakeholders who have an impact on the organisation. The recognition of more relevant parties than the shareholder is the link to corporate social responsibility (CSR). It supports a broader view of the bottom line: it is no longer enough to know how profits are used but rather how they are made. Besides, CSR addresses the responsibility of organisations for their impact on society. When transferring the concept of CSR to the non-profit area, it becomes obvious that CSR, with its distinctive features, matches the aims of NPOs. As a consequence, NPOs that apply CG also apply CSR to a certain extent. The research is designed as a comprehensive theoretical and empirical analysis. First, the investigation focuses on the theoretical basis of both concepts. Second, the similarities and differences are outlined, and as a result the interconnection of both concepts shows up. The contribution of this research is manifold: the interconnection of both concepts when applied to NPOs has not yet received attention in the literature. CSR and governance as an integrated concept provide many advantages for NPOs compared to for-profit organisations, which must constantly justify the impact they have on society. NPOs, however, integrate economic and social aspects as their starting point. For NPOs, CG is not a mere concept of compliance but rather an enhanced concept integrating many aspects of CSR. There is no "either-or" between the concepts for NPOs.

Keywords: Business ethics, corporate governance, corporate social responsibility, non-profit organisations, stakeholder theory.

219 Influence of Instructors in Engaging Online Graduate Students in Active Learning in the United States

Authors: Ehi E. Aimiuwu

Abstract:

As of 2017, many online learning professionals, institutions, and journals are still wondering how instructors can keep students engaged in the online learning environment to facilitate active learning effectively. The purpose of this qualitative single-case and narrative research is to explore whether online professors understand their role as mentors and facilitators of students' academic success by keeping students engaged in active learning, based on personalized experience in the field. Data collection tools used in the study included the NVivo 12 Plus qualitative software, an interview protocol, a digital audiotape, an observation sheet, and a transcription. Seven online professors in the United States, recruited from LinkedIn and residencies, were interviewed for this study. Eleven online teaching techniques from previous research were used as the study framework. The data analysis process, member checking, and key themes were used to achieve saturation. About 85.7% of the professors agreed on rubrics as the preferred online grading technique. About 57.1% agreed on professors logging in daily, students logging in about 2-5 times weekly, knowing students to increase accountability, email as the preferred communication tool, and computer access for adequate online learning. About 42.9% agreed on the syllabus for clear class expectations, participation to show what has been learned, and energizing students for creativity.

Keywords: Class facilitation, class management, online teaching, online education, pedagogy.
