Search results for: computer node
2072 Analysis of Network Connectivity for Ship-To-Ship Maritime Communication Using IEEE 802.11 on Maritime Environment of Tanjung Perak, Indonesia
Authors: Ahmad Fauzi Makarim, Okkie Puspitorini, Hani'ah Mahmudah, Nur Adi Siswandari, Ari Wijayanti
Abstract:
As a maritime country, Indonesia needs a maritime connectivity solution that can support its maritime communication systems, including communication from harbor to ship and from ship to ship. The many application services required for maritime communication, from safety services to voyage services that support day-to-day voyage activity, demand a high-bandwidth connection. To support government efforts in addressing this problem, this research applies a technology newly deployed in Indonesia, namely IEEE 802.11, to maritime communication. Three outdoor WiFi devices operating at 5.8 GHz are used. The maritime area from Tanjung Perak harbor in Surabaya to Karang Jamuang Island serves as the research location, with ship node placement permitted by Navigation District Class 1. This maritime area consists of the state 1 and state 2 zones, which are narrow areas with an average wave height of 0.7 meter based on data from BMKG Surabaya. Wave height is then used as one of the parameters for analyzing the characteristics of signal propagation over the sea surface, so that the coverage area of the transmitter system can be determined. For the three outdoor WiFi samples used in this research, the coverage of device A is about 2256 meters, device B 4000 meters, and device C 1174 meters. Network connectivity for ship-to-ship communication is then analyzed with the AODV routing algorithm, based on the smallest transmit power value among all nodes within the transmitter coverage.
Keywords: maritime of Indonesia, maritime communications, outdoor wifi, coverage, AODV
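The coverage figures quoted above come from a link-budget analysis over the sea surface. The following is only a rough illustration of that reasoning, assuming a free-space path-loss model and placeholder radio parameters (transmit power, antenna gains, receiver sensitivity); the abstract's actual model also accounts for wave height and the sea-surface propagation characteristics.

```python
import math

def max_free_space_range_m(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                           rx_sensitivity_dbm, freq_hz=5.8e9):
    """Distance at which free-space path loss consumes the whole link margin."""
    link_budget_db = tx_power_dbm + tx_gain_dbi + rx_gain_dbi - rx_sensitivity_dbm
    # Invert FSPL(dB) = 20*log10(4*pi*d*f/c) for d.
    return (3e8 / (4 * math.pi * freq_hz)) * 10 ** (link_budget_db / 20)

# Placeholder radio parameters (assumed, not taken from the paper):
print(round(max_free_space_range_m(27, 15, 15, -90)), "m")
```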
Procedia PDF Downloads 350
2071 FPGA Implementation of Novel Triangular Systolic Array Based Architecture for Determining the Eigenvalues of Matrix
Authors: Soumitr Sanjay Dubey, Shubhajit Roy Chowdhury, Rahul Shrestha
Abstract:
In this paper, we present a novel approach for calculating the eigenvalues of an arbitrary matrix, for the first time on a Field Programmable Gate Array (FPGA), using a Triangular Systolic Array (TSA) architecture. Conventionally, an additional computation unit compliant with the eigenvalue algorithm is required in the architecture, which in turn increases delay and power consumption. Moreover, recently reported works are dedicated only to symmetric matrices or to specific classes of matrices. This work presents an architecture that calculates the eigenvalues of any matrix based on the QR algorithm and is fully implementable on an FPGA. For the implementation of the QR algorithm, we use a TSA architecture that in turn employs the CORDIC (COordinate Rotation DIgital Computer) algorithm to calculate the various trigonometric and arithmetic functions involved in the procedure. The proposed architecture gives an error on the order of 10⁻⁴, consumes 0.598 W of power, and can operate at a frequency of 900 MHz.
Keywords: coordinate rotation digital computer, three angle complex rotation, triangular systolic array, QR algorithm
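For reference, a minimal software sketch of the QR iteration that such an architecture implements is shown below; it uses NumPy's QR factorization in place of the hardware's CORDIC-based Givens rotations and is not a model of the FPGA design itself. The example matrix is an assumption for illustration.

```python
import numpy as np

def qr_eigenvalues(A, iters=500):
    """Unshifted QR iteration: A_{k+1} = R_k @ Q_k converges (for well-behaved
    matrices) to a triangular form whose diagonal holds the eigenvalues."""
    Ak = np.array(A, dtype=float)
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)   # the hardware realizes this step with CORDIC Givens rotations
        Ak = R @ Q
    return np.diag(Ak)

A = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.0],
              [2.0, 0.0, 1.0]])
print(qr_eigenvalues(A))          # compare with np.linalg.eigvals(A)
```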
Procedia PDF Downloads 415
2070 Data Compression in Ultrasonic Network Communication via Sparse Signal Processing
Authors: Beata Zima, Octavio A. Márquez Reyes, Masoud Mohammadgholiha, Jochen Moll, Luca de Marchi
Abstract:
This document presents an approach that uses compressed sensing for signal encoding and information transfer within a guided wave sensor network composed of specially designed frequency steerable acoustic transducers (FSATs). Wave propagation in a damaged plate was simulated using the commercial FEM-based software COMSOL. Guided waves were excited by means of FSATs, which are characterized by the special shape of their electrodes and were modeled using PIC255 piezoelectric material. The special shape of the FSAT allows wave energy to be focused in a particular direction according to the frequency components of its actuation signal, which makes a larger monitored area available. The process begins when an FSAT detects and records a reflection from damage in the structure; this signal is then encoded and prepared for transmission using a combined approach based on Compressed Sensing Matching Pursuit and Quadrature Amplitude Modulation (QAM). Once the signal has been encoded into binary characters, the information is transmitted between the nodes of the network. The message reaches the last node, where it is finally decoded and processed for damage detection and localization. The main aim of the investigation is to determine the location of detected damage using the reconstructed signals. The study demonstrates that the steerable capabilities of FSATs not only facilitate the detection of damage but also permit transmitting the damage information toward a chosen area in a specific direction of the investigated structure.
Keywords: data compression, ultrasonic communication, guided waves, FEM analysis
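A minimal sketch of the matching-pursuit step mentioned above is given below, assuming a random dictionary and a synthetic 3-sparse signal; the QAM modulation stage and the guided-wave dictionary used by the authors are not reproduced here.

```python
import numpy as np

def matching_pursuit(y, D, n_iter=10):
    """Greedy sparse approximation of y over dictionary D (columns = atoms)."""
    residual = y.copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iter):
        correlations = D.T @ residual
        k = np.argmax(np.abs(correlations))        # best-matching atom
        coeffs[k] += correlations[k]
        residual = residual - correlations[k] * D[:, k]
    return coeffs

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)                     # unit-norm atoms
x_true = np.zeros(256)
x_true[[10, 50, 200]] = [1.0, -0.7, 0.4]
y = D @ x_true                                     # stand-in for the recorded signal
x_hat = matching_pursuit(y, D, n_iter=20)
print(np.argsort(-np.abs(x_hat))[:3])              # should (approximately) recover atoms 10, 50, 200
```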
Procedia PDF Downloads 124
2069 Simulation of Optimum Sculling Angle for Adaptive Rowing
Authors: Pornthep Rachnavy
Abstract:
The purpose of this paper is twofold. First, we believe that there is a significant relationship between sculling angle and sculling style in adaptive rowing. Second, we introduce a methodology, namely simulation, to identify the effectiveness of adaptive rowing. For this study we simulate the arms-only single scull of adaptive rowing. The sculling angle that produces the fastest time over 1000 meters was investigated using simulation modeling. A simulation model of the rowing system was developed in the Matlab software package, based on equations of motion that include the many variables involved in moving the boat, such as oar length, blade velocity, and sculling style. The boat speed, power, and energy consumption of the system were computed, and the simulation model can predict the forces acting on the boat. The optimum sculling angle was determined by computer simulation. Inputs to the model are the sculling style of each rower and the sculling angle; the output of the model is the boat velocity over 1000 meters. The present study suggests that the optimum sculling angle depends on the sculling style. The optimum angles for blade entry and release with respect to the perpendicular through the pin are -57.00 and 22.00 degrees for the first style, -57.00 and 22.00 degrees for the second style, -51.57 and 28.65 degrees for the third style, and -45.84 and 34.38 degrees for the fourth style. A theoretical simulation of rowing has been developed and presented. The results suggest that it may be advantageous for rowers to select sculling angles appropriate to their sculling styles, since the optimum sculling angle depends on the sculling style of each rower. The findings of this paper can be summarized in three points: 1. An optimum sculling angle exists in the arms-only single scull of adaptive rowing. 2. The optimum sculling angle depends on the sculling style. 3. Computer simulation of rowing can identify opportunities for improving rowing performance by utilizing the kinematic description of rowing. The freedom to explore alternatives in speed, thrust, and timing with the computer simulation provides the coach with a tool for systematic assessment of rowing technique. In addition, the ability to use the computer to examine the very complex movements of rowing will help both the rower and the coach to conceptualize components of the movement that may previously have been unclear or even undefined.
Keywords: simulation, sculling, adaptive, rowing
Procedia PDF Downloads 465
2068 The Analysis of Personalized Low-Dose Computed Tomography Protocol Based on Cumulative Effective Radiation Dose and Cumulative Organ Dose for Patients with Breast Cancer with Regular Chest Computed Tomography Follow up
Authors: Okhee Woo
Abstract:
Purpose: The aim of this study is to evaluate the 2-year cumulative effective radiation dose and cumulative organ dose from regular follow-up computed tomography (CT) scans in patients with breast cancer and to establish a personalized low-dose CT protocol. Methods and Materials: A retrospective study was performed on patients with breast cancer who were diagnosed and managed consistently on the basis of the routine breast cancer follow-up protocol between 2012-01 and 2016-06. Based on ICRP (International Commission on Radiological Protection) Publication 103, the cumulative effective radiation doses of each patient over the 2-year follow-up were analyzed using commercial radiation dose management software (Radimetrics, Bayer Healthcare). The personalized doses to each organ were analyzed in detail using the Monte Carlo simulation provided by the software. Results: A total of 3822 CT scans on 490 patients was evaluated (age: 52.32±10.69). The mean number of scans per patient was 7.8±4.54. Each patient was exposed to 95.54±63.24 mSv of radiation over 2 years. The cumulative CT radiation dose was significantly higher in patients with lymph node metastasis (p = 0.00). HER-2-positive patients were exposed to more radiation than estrogen or progesterone receptor positive patients (p = 0.00). There was no difference in cumulative effective radiation dose between age groups. Conclusion: Knowing how much radiation a patient has been exposed to is the starting point for managing radiation exposure in patients with long-term CT follow-up. A precise and personalized protocol, as well as iterative reconstruction, may reduce the hazard from unnecessary radiation exposure.
Keywords: computed tomography, breast cancer, effective radiation dose, cumulative organ dose
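The effective dose reported by such software is, by the ICRP 103 definition, a weighted sum of organ equivalent doses, E = Σ_T w_T·H_T. The sketch below illustrates that sum with a small subset of the ICRP 103 tissue weighting factors; the per-organ dose values are placeholders, not results from the study.

```python
# ICRP 103 tissue weighting factors (illustrative subset; the full set sums to 1.0)
TISSUE_WEIGHTS = {"lung": 0.12, "breast": 0.12, "stomach": 0.12,
                  "liver": 0.04, "thyroid": 0.04, "bladder": 0.04}

def effective_dose_msv(organ_doses_msv):
    """E = sum over organs of w_T * H_T, restricted to the organs supplied."""
    return sum(TISSUE_WEIGHTS[organ] * dose
               for organ, dose in organ_doses_msv.items()
               if organ in TISSUE_WEIGHTS)

# Placeholder per-organ equivalent doses (mSv) for a single chest CT scan:
print(effective_dose_msv({"lung": 20.0, "breast": 22.0, "liver": 15.0}))
```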
Procedia PDF Downloads 197
2067 Digital Musical Organology: The Audio Games: The Question of “A-Musicological” Interfaces
Authors: Hervé Zénouda
Abstract:
This article seeks to shed light on an emerging creative field: "audio games," at the crossroads between video games and computer music. Indeed, many applications offering entertaining audio-visual experiences with the objective of musical creation are available today for different platforms (game consoles, computers, cell phones). The originality of this field lies in applying the gameplay of video games to music composition. Thus, composing music using interfaces, and also cognitive logics, that we qualify as "a-musicological" seems to us particularly interesting from the perspective of digital musical organology. This field raises questions about the representation of sound and musical structures and develops new instrumental gestures and strategies of musical composition. In this article we try to define the characteristics of this field by highlighting some historical milestones (abstract cinema, game theory in music, action and graphic scores) as well as the novelties brought by digital technologies.
Keywords: audio-games, video games, computer generated music, gameplay, interactivity, synesthesia, sound interfaces, relationships image/sound, audiovisual music
Procedia PDF Downloads 112
2066 On the Existence of Homotopic Mapping Between Knowledge Graphs and Graph Embeddings
Authors: Jude K. Safo
Abstract:
Knowledge Graphs (KG) and their relation to Graph Embeddings (GE) represent a unique data structure in the landscape of machine learning (relative to image, text and acoustic data). Unlike the latter, GEs are the only data structure sufficient for representing the hierarchically dense, semantic information needed for use-cases like supply chain data and protein folding, where the search space exceeds the limits of traditional search methods (e.g. page-rank, Dijkstra, etc.). While GEs are effective for compressing low-rank tensor data, at scale they begin to introduce a new problem of 'data retrieval', which we also observe in Large Language Models. Notable attempts such as TransE, TransR and other prominent industry standards have shown peak performance just north of 57% on the WN18 and FB15K benchmarks, which is insufficient for practical industry applications; they are also limited in scope to next node/link prediction. Traditional linear methods like Tucker, CP, PARAFAC and CANDECOMP quickly hit memory limits on tensors exceeding 6.4 million nodes. This paper outlines a topological framework for linear mappings between concepts in KG space and GE space that preserve cardinality. Most importantly, we introduce a traceable framework for composing dense linguistic structures and demonstrate its performance on the WN18 benchmark. This model does not rely on Large Language Models (LLM), though the applications are certainly relevant there as well.
Keywords: representation theory, large language models, graph embeddings, applied algebraic topology, applied knot theory, combinatorics
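For context on the baselines cited above, a minimal sketch of the TransE scoring function is shown below: a triple (h, r, t) is scored by how close h + r lies to t in embedding space. The entities, relation, and random vectors are illustrative assumptions only; this is not the paper's homotopic-mapping framework.

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE plausibility score: smaller ||h + r - t|| means a more likely triple."""
    return np.linalg.norm(h + r - t, ord=norm)

rng = np.random.default_rng(0)
dim = 50
entity = {name: rng.standard_normal(dim) for name in ["paris", "france", "tokyo"]}
relation = {"capital_of": rng.standard_normal(dim)}

# With trained embeddings, the true triple should score lower than a corrupted one.
print(transe_score(entity["paris"], relation["capital_of"], entity["france"]))
print(transe_score(entity["tokyo"], relation["capital_of"], entity["france"]))
```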
Procedia PDF Downloads 68
2065 Examining Relationship between Programming Performance, Programming Self Efficacy and Math Success
Authors: Mustafa Ekici, Sacide Güzin Mazman
Abstract:
Programming is one of the abilities in the computer science field that students generally perceive as difficult, and various individual differences have been implicated in success at it. Although several factors that affect programming ability have been identified over the years, there is still no full understanding of why some students learn to program easily and quickly while others find it complex and difficult. Programming self-efficacy and mathematics success are two of the essential individual differences regarded as having an important effect on programming success. This study aimed to identify the relationship between programming performance, programming self-efficacy and mathematics success. The study group consisted of 96 undergraduates from the Department of Econometrics of Uşak University; 38 (39.58%) of the participants are female, while 58 (60.41%) are male. The study was conducted in the Programming I course during the 2014-2015 fall term. Data collection tools comprised programming course final grades, a programming self-efficacy scale and a mathematics achievement test. Data were analyzed through correlation analysis. The results of the study will be reported in the full text of the study.
Keywords: programming performance, self efficacy, mathematic success, computer science
Procedia PDF Downloads 502
2064 Automated Computer-Vision Analysis Pipeline of Calcium Imaging Neuronal Network Activity Data
Authors: David Oluigbo, Erik Hemberg, Nathan Shwatal, Wenqi Ding, Yin Yuan, Susanna Mierau
Abstract:
Introduction: Calcium imaging is an established technique in neuroscience research for detecting activity in neural networks. Bursts of action potentials in neurons lead to transient increases in intracellular calcium that are visualized with fluorescent indicators. Manual identification of cell bodies and their contours by experts typically takes 10-20 minutes per calcium imaging recording. Our aim, therefore, was to design an automated pipeline to facilitate and optimize calcium imaging data analysis. Our pipeline aims to accelerate cell body and contour identification and the production of graphical representations reflecting changes in neuronal calcium-based fluorescence. Methods: We created a Python-based pipeline that uses OpenCV (a computer vision Python package) to accurately (1) detect neuron contours, (2) extract the mean fluorescence within each contour, and (3) identify transient changes in fluorescence due to neuronal activity. The pipeline consists of 3 Python scripts that can all be easily accessed through a Python Jupyter notebook. In total, we tested this pipeline on ten separate calcium imaging datasets from murine dissociated cortical cultures. We then compared our automated pipeline outputs with the outputs of manually labeled data for neuronal cell location and the corresponding fluorescence time series generated by an expert neuroscientist. Results: Our results show that the automated pipeline efficiently pinpoints neuronal cell body locations and contours and provides a graphical representation of neural network metrics accurately reflecting changes in neuronal calcium-based fluorescence. The pipeline detected the shape, area, and location of most neuronal cell body contours by using grayscale image conversion and binary thresholding to allow computer vision to better distinguish between cells and non-cells. Its results were comparable to the manually analyzed results, but with significantly reduced acquisition times of 2-5 minutes per recording versus 10-20 minutes per recording. Based on these findings, our next step is to precisely measure the specificity and sensitivity of the automated pipeline's cell body and contour detection in order to extract more robust neural network metrics and dynamics. Conclusion: Our Python-based pipeline performed automated computer vision-based analysis of calcium imaging recordings from neuronal cell bodies in neuronal cell cultures. Our new goal is to improve cell body and contour detection to produce more robust, accurate neural network metrics and dynamic graphs.
Keywords: calcium imaging, computer vision, neural activity, neural networks
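A minimal sketch of the thresholding, contour-detection, and mean-fluorescence steps described above is given below using OpenCV in Python. The threshold value, minimum contour area, and the synthetic test frame are assumptions for illustration, not parameters from the authors' pipeline; it requires the opencv-python package.

```python
import cv2
import numpy as np

def neuron_mean_fluorescence(frame_gray, threshold=60, min_area=20):
    """Threshold a grayscale calcium-imaging frame, find cell-body contours,
    and return the mean fluorescence inside each detected contour."""
    _, binary = cv2.threshold(frame_gray, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    traces = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue                        # discard specks too small to be cell bodies
        mask = np.zeros(frame_gray.shape, dtype=np.uint8)
        cv2.drawContours(mask, [contour], -1, 255, thickness=cv2.FILLED)
        traces.append(cv2.mean(frame_gray, mask=mask)[0])
    return traces

# Placeholder input: a synthetic frame instead of a real recording.
frame = np.zeros((128, 128), dtype=np.uint8)
cv2.circle(frame, (40, 40), 6, 200, -1)     # one bright "cell body"
print(neuron_mean_fluorescence(frame))
```

Applying the same function to every frame of a recording yields a fluorescence time series per contour, which is the quantity the pipeline compares against the expert annotations.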
Procedia PDF Downloads 82
2063 Computer Simulation Studies of Spinel LiMn₂O₄ Nanotubes
Authors: D. M. Tshwane, R. R. Maphanga, P. E. Ngoepe
Abstract:
Nanostructured materials are attractive candidates for efficient electrochemical energy storage devices because of their unique physicochemical properties. Nanotubes have drawn continuous attention because their electrical, optical and magnetic properties contrast with those of the bulk system, and they have potential applications in optics, electronics and energy storage devices. Introducing nanotube structures as electrode materials represents one of the most attractive strategies for dramatically enhancing battery performance. Spinel LiMn2O4 is the most promising cathode material for Li-ion batteries. In this work, computer simulation methods are used to generate spinel LiMn2O4 nanotubes and investigate their properties. Molecular dynamics simulation is used to probe the local structure of the LiMn2O4 nanotubes and the effect of temperature on these systems. It is found that diameter, Miller indices and size have a direct influence on nanotube morphology. Furthermore, it is noted that stability depends on the surface and the wrapping of the nanotube. The nanotube structures are described using the radial distribution function and XRD patterns, and there is a correlation between the calculated XRD patterns and experimentally reported results.
Keywords: LiMn2O4, li-ion batteries, nanotubes, nanostructures
Procedia PDF Downloads 189
2062 Deployment of Information and Communication Technology (ICT) to Reduce Occurrences of Terrorism in Nigeria
Authors: Okike Benjamin
Abstract:
Terrorism is the use of violence and threats to intimidate or coerce a person, group, society or even a government, especially for political purposes. Terrorism may be a way of resisting government by groups who feel marginalized, or a way of expressing displeasure over the activities of government. On 26th December 2009, the US listed Nigeria as a terrorist nation. Recently, occurrences of terrorism in Nigeria have increased considerably. In Jos, Plateau State, Nigeria, a bomb blast claimed many lives on the eve of Christmas 2010. Similarly, there was another bomb blast in the Mogadishu (Sani Abacha) Barracks Mammy market on the eve of New Year 2011. For some time now, it has no longer been news that bombs explode in some northern parts of Nigeria. About 25 years ago, stopping terrorism in America relied on old-fashioned tools such as strict physical security at vulnerable places, intelligence gathering by government agents or individuals, vigilance on the part of all citizens, and a sense of community in which citizens do what they can to protect each other. Just as technology has been used to improve the way many other things are done, this powerful new weapon called computer technology can be used to detect and prevent terrorism not only in Nigeria but all over the world. This paper examines the possible causes and effects of bomb blasts, which are acts of terrorism, and suggests ways in which Explosive Detection Devices (EDDs) and computer software technology could be deployed to reduce the occurrence of terrorism in Nigeria. This has become all the more necessary with the abduction of over 200 schoolgirls from their hostel in Chibok, Borno State by members of the Boko Haram sect on 14th April 2014. Presently, Barack Obama and other world leaders have sent some of their military personnel to help rescue those innocent schoolgirls, whose only offence is seeking to acquire the western education that the sect strongly believes is forbidden.
Keywords: terrorism, bomb blast, computer technology, explosive detection devices, Nigeria
Procedia PDF Downloads 261
2061 Digital Literacy Skills for Geologist in Public Sector
Authors: Angsumalin Puntho
Abstract:
Disruptive technology has had a great influence on our everyday lives and on the existence of organizations. Geologists in the public sector need to keep up with digital technology and be able to work and collaborate more effectively. The results from SWOT and McKinsey 7S analyses suggest that there are inadequate IT personnel, no individual digital literacy development plan, and a misunderstanding of management policies. The Office of the Civil Service Commission has developed the digital literacy skills that civil servants and government officers should possess in order to work effectively; these consist of nine dimensions, including computer skills, internet skills, cyber security awareness, word processing, spreadsheets, presentation programs, online collaboration, graphics editors and cyber security practices, and six steps of digital literacy development, including self-assessment, an individual development plan, self-learning, a certified test, learning reflection, and practice. Geologists can use digital literacy as a learning tool to develop themselves for better career opportunities.
Keywords: disruptive technology, digital technology, digital literacy, computer skills
Procedia PDF Downloads 116
2060 Influence of Salicylic Acid on Submergence Stress Recovery in Selected Rice Cultivars (Oryza sativa L.)
Authors: Ja’afar U., A. M. Gumi, Salisu N., Obadiah C. D.
Abstract:
Rice is susceptible to flooding due to its semi-aquatic characteristics, which enable it to thrive in wet or submerged environments. The development of aerenchyma allows for oxygen transfer, enabling faster lengthening of submerged shoot organs. Rice's germination and early seedling growth phases are, however, highly intolerant of submersion, which limits survival in low-oxygen environments. The research involved a study of rice plants treated with salicylic acid at different concentrations. Hypo was used for washing, while a reagent was used for the submergence treatment. The plants were waterlogged for 11 days and submerged for 7 days, with control plants receiving distilled water. The study found a significant difference between the control and treated plants of Jirani Zawara, with plants treated with 2 g/L of S.A. showing an increase of 6.00 nodes per plant, and the Faro cultivars having more nodes. Significant differences between the control and treated plants were also found for internode length, with the Jirani Zawara plants showing longer internodes, the Faro cultivar having longer internodes, and the B.G. cultivar having the longest. The Jirani Zawara cultivar treated with 3 g/L of S.A. produced the tallest plants; heights increased from 14.43 cm to 15.50 cm in the S.A.-treated Faro cultivar, and the greatest height recorded was 16.30 cm. The study reveals that salicylic acid significantly enhances the number of nodes, internode length, plant height, and root length in rice cultivars, thereby improving recovery from submergence stress and promoting plant development.
Keywords: rice, submergence, stress, salicylic acid
Procedia PDF Downloads 14
2059 Significance of Personnel Recruitment in Implementation of Computer Aided Design Curriculum of Architecture Schools
Authors: Kelechi E. Ezeji
Abstract:
The inclusion of relevant content in the curricula of architecture schools is vital for the attainment of Computer Aided Design (CAD) proficiency by graduates. Implementing this content involves, among other variables, the presence of competent tutors. Consequently, this study investigated the importance of personnel recruitment for the inclusion of content vital to the implementation of CAD in the curriculum for architecture education, with a view to developing a framework for appropriate implementation of a CAD curriculum. It focused on departments of architecture in universities in south-east Nigeria which have been accredited by the National Universities Commission. A survey research design was employed. Data were obtained from sources within the study area using questionnaires, personal interviews, physical observation/enumeration and examination of institutional documents. A multi-stage stratified random sampling method was adopted: the first stage of stratification involved random sampling of the departments by balloting, and the second stage involved obtaining the respondent population from the number of staff and students of the sample population. The chi-square test for nominal variables and Pearson's product moment correlation test for interval variables were used for data analysis. With p < 0.05, the study found a significant correlation between the number of CAD-literate academic staff and the use of CAD in design studio/assignments, and found that an increase in the overall number of teaching staff significantly affected the total CAD credit units in the curriculum of the department. The implication of these findings is that, for successful implementation leading to the attainment of CAD proficiency, CAD literacy should be a factor in the recruitment of staff and a policy of in-house training should be pursued.
Keywords: computer-aided design, education, personnel recruitment, curriculum
Procedia PDF Downloads 210
2058 Over the Air Programming Method for Learning Wireless Sensor Networks
Authors: K. Sangeeth, P. Rekha, P. Preeja, P. Divya, R. Arya, R. Maneesha
Abstract:
Wireless sensor network (WSN) nodes are small or tiny devices equipped with different sensors that sense physical parameters such as air pressure, temperature, vibration and movement, process these data, and send them to a central data center where decisions are taken. The WSN domain has a wide range of applications, such as monitoring and detecting natural hazards like landslides, forest fires, avalanches and floods, as well as healthcare applications. Because of these varied applications, WSNs are taught at the undergraduate and postgraduate levels in many universities, under departments of computer science. However, the cost and infrastructure required to purchase WSN nodes so that students can gain hands-on expertise with these devices is high. This paper gives an overview of a remote triggered lab that consists of more than 100 WSN nodes and helps students to log in remotely from anywhere in the world using the World Wide Web, configure the nodes, and learn WSN concepts in an intuitive way. It proposes a new approach called over the air programming (OTAP), and describes its internals, which program the 100 nodes simultaneously and allow the results to be viewed without the nodes being physically connected to the computer system, thereby allowing for sparse deployment.
Keywords: WSN, over the air programming, virtual lab, AT45DB
Procedia PDF Downloads 377
2057 Modeling Loads Applied to Main and Crank Bearings in the Compression-Ignition Two-Stroke Engine
Authors: Marcin Szlachetka, Mateusz Paszko, Grzegorz Baranski
Abstract:
This paper discusses AVL EXCITE Designer simulation research into the loads applied to the main and crank bearings of a compression-ignition two-stroke engine. A model of the engine lubrication system was created, covering the part of this system related to the particular nodes of the bearing system, i.e. the connection of the main bearings in the engine block with the crankshaft and the connection of the crank pins with the connecting rod. The analysis focused on the load, given as a distribution of hydrodynamic oil film pressure, corresponding to different values of radial internal clearance. The impact of gas force on the minimum oil film thickness in the main and crank bearings versus crankshaft rotational speed was also studied. Our model calculates the oil film parameters, the oil film pressure distribution, the oil temperature change and the dimensions of the bearings, as well as the oil temperature distribution on the surfaces of the bearing seats. Accordingly, it was possible to select, for example, a correct clearance for each of the node bearings. The research was performed for several values of engine crankshaft speed ranging from 800 RPM to 4000 RPM. Bearing oil pressure was varied with engine speed between 1 bar and 5 bar, at an oil temperature of 90°C. The main bearing clearances initially assumed for the calculations were 0.015 mm, 0.025 mm, 0.035 mm, 0.05 mm and 0.1 mm. The oil used in the research corresponded to the SAE 5W-40 classification. The paper presents selected research results referring to certain specific operating points and bearing radial internal clearances. Acknowledgement: This work has been realized in cooperation with The Construction Office of WSK ‘PZL-KALISZ’ S.A. and is part of Grant Agreement No. POIR.01.02.00-00-0002/15 financed by the Polish National Centre for Research and Development.
Keywords: crank bearings, diesel engine, oil film, two-stroke engine
Procedia PDF Downloads 214
2056 Efficient Reconstruction of DNA Distance Matrices Using an Inverse Problem Approach
Authors: Boris Melnikov, Ye Zhang, Dmitrii Chaikovskii
Abstract:
We continue to consider one of the cybernetic methods in computational biology related to the study of DNA chains, namely the problem of reconstructing a distance matrix of DNA chains that is not fully filled. When applied in a programming context, it turns out that, with a modern computer of average capability, creating even a small distance matrix for mitochondrial DNA sequences is quite time-consuming with standard algorithms. As the size of the matrix grows, the required computational effort increases significantly, potentially spanning several weeks to months of non-stop computer processing. Hence, calculating the distance matrix on conventional computers is hardly feasible, and supercomputers are usually not available. Therefore, we began by publishing our variants of algorithms for calculating the distance between two DNA chains; we then published algorithms for restoring partially filled matrices, i.e., for the inverse problem of matrix processing. In this paper, we propose an algorithm for restoring the distance matrix for DNA chains, and the primary focus is on enhancing the algorithms that shape the greedy function within the branch and bound method framework.
Keywords: DNA chains, distance matrix, optimization problem, restoring algorithm, greedy algorithm, heuristics
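As a very simplified illustration of the kind of bound a greedy completion step can exploit, the sketch below brackets one missing entry of a distance matrix with the triangle inequality, assuming the distances behave approximately like a metric; this is not the authors' branch-and-bound algorithm, and the toy matrix values are assumptions.

```python
import math

def triangle_bounds(D, i, j):
    """Bracket the unknown distance D[i][j] via every intermediate chain k:
    |D[i][k] - D[k][j]| <= D[i][j] <= D[i][k] + D[k][j]."""
    lower, upper = 0.0, math.inf
    for k in range(len(D)):
        if k in (i, j) or D[i][k] is None or D[k][j] is None:
            continue
        lower = max(lower, abs(D[i][k] - D[k][j]))
        upper = min(upper, D[i][k] + D[k][j])
    return lower, upper

# Toy 4x4 matrix with one unknown entry (None); values are illustrative only.
D = [[0.0, 2.0, 3.0, None],
     [2.0, 0.0, 2.5, 4.0],
     [3.0, 2.5, 0.0, 2.0],
     [None, 4.0, 2.0, 0.0]]
print(triangle_bounds(D, 0, 3))   # a greedy step might then pick a value inside this bracket
```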
Procedia PDF Downloads 118
2055 Adjustment and Compensation Techniques for the Rotary Axes of Five-axis CNC Machine Tools
Authors: Tung-Hui Hsu, Wen-Yuh Jywe
Abstract:
Five-axis computer numerical control (CNC) machine tools (three linear and two rotary axes) are ideally suited to the fabrication of complex workpieces, such as dies, turbo blades, and cams. The locations of the axis average line and the centerline of the rotary axes strongly influence the performance of these machines; however, techniques to compensate for eccentric error in the rotary axes remain weak. This paper proposes optical (Non-Bar) techniques capable of calibrating five-axis CNC machine tools and compensating for eccentric error in the rotary axes. The approach employs the measurement path in ISO/CD 10791-6 to determine the eccentric error in the two rotary axes, for which compensatory measures can then be implemented. Experimental results demonstrate that the proposed techniques can improve the performance of various five-axis CNC machine tools by more than 90%. Finally, the result of a cutting test using a B-type five-axis CNC machine tool confirmed the usefulness of the proposed compensation technique.
Keywords: calibration, compensation, rotary axis, five-axis computer numerical control (CNC) machine tools, eccentric error, optical calibration system, ISO/CD 10791-6
Procedia PDF Downloads 383
2054 Trainees' Perception of Virtual Learning Skills in Setting up the Simulator Welding Technology
Authors: Mohd Afif Md Nasir, Mohd Faizal Amin Nur, Jamaluddin Hasim, Abd Samad Hasan Basari, Mohd Halim Sahelan
Abstract:
This study investigates the suitability of Computer-Based Training (CBT) as one of the approaches to skills competency development at the Centre of Instructor and Advanced Skills Training (CIAST) Shah Alam, Selangor and the National Youth Skills Institute (NYSI) Pagoh, Muar, Johor. The study also examines trainees' perceptions of the Virtual Learning Environment (VLE) with respect to the development of skills in welding technology. The significance of the study is to create a computer-based skills development approach in welding technology among new trainees in CIAST and IKBN, as well as to cultivate general skills among them. The study is also important for increasing the number of knowledge workers (K-Workers) in the manufacturing industry, in order to achieve the national vision of becoming an industrial nation by the year 2020. The design is a survey using questionnaires as the instruments, conducted with 136 trainees from CIAST and IKBN. Data from the questionnaires were processed in the Statistical Package for the Social Sciences (SPSS) to obtain frequencies, means and chi-square tests. The findings of the study show that welding technology skills developed in the trainees as a result of the application of the Virtual Reality simulator at a high level (mean = 3.90), and the respondents agreed that the skills could be embedded through the application of the Virtual Reality simulator (78.01%). The study also found a significant difference between trainee skill characteristics through the application of the Virtual Reality simulator (p < 0.05). The Virtual Reality simulator is therefore suitable for use in the development of welding skills among trainees in skills training institutes.
Keywords: computer-based training, virtual learning environment, welding technology, virtual reality simulator, virtual learning environment
Procedia PDF Downloads 426
2053 Initial Dip: An Early Indicator of Neural Activity in Functional Near Infrared Spectroscopy Waveform
Authors: Mannan Malik Muhammad Naeem, Jeong Myung Yung
Abstract:
Functional near infrared spectroscopy (fNIRS) holds a favorable position among non-invasive brain imaging techniques. The concentration changes of oxygenated and de-oxygenated hemoglobin during a particular cognitive activity are the basis of this neuroimaging modality. Two wavelengths of near-infrared light can be used with the modified Beer-Lambert law to infer, indirectly, the status of neuronal activity inside the brain. The temporal resolution of fNIRS is very good for real-time brain-computer interface applications, and its portability, low cost and acceptable temporal resolution give it an advantageous position among neuroimaging modalities. In this study, an optimization model for the impulse response function has been used to estimate and predict the initial dip using fNIRS data. In addition, the activity strength parameter related to a motor-based cognitive task has been analyzed. We found an initial dip that lasts around 200-300 milliseconds and better localizes neural activity.
Keywords: fNIRS, brain-computer interface, optimization algorithm, adaptive signal processing
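A minimal sketch of the modified Beer-Lambert law mentioned above is given below: optical-density changes at two wavelengths are converted into concentration changes of oxygenated and de-oxygenated hemoglobin by solving a 2x2 linear system. The extinction coefficients, differential pathlength factors, and optical-density values are placeholders, not calibrated values from the study.

```python
import numpy as np

def hb_concentration_changes(d_od, ext_coeffs, source_detector_cm, dpf):
    """Modified Beer-Lambert law: dOD(lambda) = (e_HbO*dHbO + e_HbR*dHbR) * L * DPF.
    Solves the two-wavelength system for the two chromophores."""
    E = np.array(ext_coeffs)                   # rows: wavelengths, cols: [HbO, HbR]
    path = source_detector_cm * np.array(dpf)  # effective path length per wavelength
    return np.linalg.solve(E * path[:, None], np.array(d_od))

# Placeholder values (illustrative only): extinction coefficients at ~760 and ~850 nm.
d_hbo, d_hbr = hb_concentration_changes(
    d_od=[0.012, 0.018],
    ext_coeffs=[[1.4, 3.8],     # 760 nm: [HbO, HbR]  (assumed units: 1/(mM*cm))
                [2.5, 1.8]],    # 850 nm: [HbO, HbR]
    source_detector_cm=3.0,
    dpf=[6.0, 6.0])
print(d_hbo, d_hbr)
```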
Procedia PDF Downloads 226
2052 Assessment of E-Learning Facilities in Open and Distance Learning and Information Need by Students
Authors: Sabo Elizabeth
Abstract:
Electronic learning is an increasingly popular learning approach in higher educational institutions due to the vast growth of internet technology, and it is important in human capital development. An investigation of open and distance learning and e-learning facilities and the information needs of open and distance learning (ODL) students was carried out in Jalingo, Nigeria. Structured questionnaires were administered to 70 registered ODL students of the NOUN. Information sourced from the respondents covered demographic, economic and institutional variables. Data collected for demographic variables were computed as frequency counts and percentages. Assessment of the effectiveness of ODL facilities and the information needs of open and distance learning students was computed on three- or four-point Likert rating scales. Findings indicated that there are more men than women, that a large proportion of the respondents are married, and that there are more mature students in ODL than youth. A high proportion of the ODL students obtained qualifications higher than the secondary school certificate. The proportion of computer-literate ODL students was high, yet a large number of the students do not own a laptop computer. Inadequate e-books and reference materials, inadequate internet gadgets, and inadequate books (hard copies) and reference materials are factors that limit the utilization of e-learning facilities in the study areas. Inadequate computer facilities and power backup caused inconvenience and delay in administering and using e-learning facilities. To a high extent, open and distance learning students needed information on the university timetable and schedule of activities, and on the availability of and access to books (hard copies and e-books) and reference materials. The respondents emphasized that contact with course coordinators via the internet would provide better learning and academic performance.
Keywords: open and distance learning, information required, electronic books, internet gadgets, Likert scale test
Procedia PDF Downloads 325
2051 Blended Learning in a Mathematics Classroom: A Focus in Khan Academy
Authors: Sibawu Witness Siyepu
Abstract:
This study explores the effects of an instructional design using blended learning on the learning of radian measure among Engineering students. Blended learning is an education programme that combines online digital media with traditional classroom methods. It requires the physical presence of both lecturer and student in a mathematics computer laboratory, while providing an element of student control over time, place, path or pace. The focus was on the use of Khan Academy to supplement traditional classroom interactions. Khan Academy is a non-profit educational organisation created by educator Salman Khan with the goal of creating an accessible place for students to learn by watching videos in a computer-assisted environment. The researcher, who is also a lecturer on a mathematics support programme, collected data by instructing students to watch Khan Academy videos on radian measure and by supplying students with traditional classroom activities. The classroom activities entailed radian measure activities extracted from the Internet. Students were given an opportunity to engage in class discussions, social interactions and collaborations. These activities required students to write formative assessment tests. The purpose of the formative assessment tests was to find out about the students' understanding of radian measure, including the errors and misconceptions they displayed in their calculations. Identification of errors and misconceptions serves as a pointer to students' weaknesses and strengths in their learning of radian measure. At the end of data collection, semi-structured interviews were administered to a purposefully sampled group to explore their perceptions of, and feedback regarding, the use of the blended learning approach in the teaching and learning of radian measure. The study employed the Algebraic Insight Framework to analyse the data collected. Algebraic insight is a subset of symbol sense which allows a student to enter expressions into a computer-assisted system correctly and efficiently. This study offered students opportunities to enter topics and subtopics on radian measure into a computer through the lens of Khan Academy, which demonstrates the procedures followed to reach solutions to mathematical problems. The researcher performed the task of explaining mathematical concepts and facilitated the process of reinvention of rules and formulae in the learning of radian measure. Lastly, activities that reinforce students' understanding of radian measure were distributed. Results showed that this approach enthused the students in their learning of radian measure. Learning through videos prompted the students to ask questions, which brought clarity and sense-making to the classroom discussions. The data revealed that sense-making through reinvention of rules and formulae assisted the students in enhancing their learning of radian measure. This study recommends that the use of Khan Academy in blended learning be introduced as a socialisation programme for all first-year students. This will prepare students who are computer illiterate to become conversant with the use of Khan Academy as a powerful tool in the learning of mathematics. Khan Academy is a key technological tool that is pivotal to the development of students' autonomy in the learning of mathematics and that promotes collaboration with lecturers and peers.
Keywords: algebraic insight framework, blended learning, Khan Academy, radian measures
Procedia PDF Downloads 309
2050 The importance of Clinical Pharmacy and Computer Aided Drug Design
Authors: Peter Edwar Mortada Nasif
Abstract:
The use of CAD (Computer Aided Design) technology is ubiquitous in the architecture, engineering and construction (AEC) industry. This has led to its inclusion in the curricula of architecture schools in Nigeria as an important part of the training module. This article examines the ethical issues involved in implementing CAD content in the architectural education curriculum. Using existing literature, the study begins with the benefits of integrating CAD into architectural education and the responsibilities of the different stakeholders in the implementation process. It also examines issues related to the negative use of information technology and the perceived negative impact of CAD use on design creativity. Using a survey method, data were collected from the architecture department of Chukwuemeka Odumegwu Ojukwu University, Uli, to serve as a case study of how the issues raised are being addressed. The article draws conclusions on what ensures successful ethical implementation. Millions of people around the world suffer from hepatitis C, one of the world's deadliest diseases. Interferon (IFN) is among the treatment options for patients with hepatitis C, but these treatments have their side effects. Our research focused on developing an oral small-molecule drug that targets hepatitis C virus (HCV) proteins and has fewer side effects. Our current study aims to develop a small-molecule antiviral drug specific to the hepatitis C virus (HCV). Drug development using laboratory experiments is not only expensive but also time-consuming. Instead, in this in silico study, we used computational techniques to propose a specific antiviral drug for the protein domains found in the hepatitis C virus. The study used homology modeling and ab initio modeling to generate the 3D structures of the proteins and then identify pockets in the proteins. Acceptable ligands for the pockets were developed using the de novo drug design method, with pocket geometry taken into account when designing the ligands. Among the various ligands generated, a new ligand specific to each of the HCV protein domains has been proposed.
Keywords: drug design, anti-viral drug, in-silicon drug design, hepatitis C virus, computer aided design, CAD education, education improvement, small-size contractor automatic pharmacy, PLC, control system, management system, communication
Procedia PDF Downloads 22
2049 Manifestations of Tuberculosis in Otorhinolaryngology Practice: A Retrospective Study Conducted in a Coastal City of South India
Authors: Rithika Sriram, Kiran M. Bhojwani
Abstract:
Introduction: Tuberculosis of the head and neck has proved to be a diagnostic challenge for otorhinolaryngologists around the world, and these lesions are often misdiagnosed as cancer. In order to contribute to a better understanding of these lesions, we conducted this study among patients affected by TB in the head and neck region, with the objective of assessing the various manifestations, presentations, diagnostic techniques, risk factors such as smoking and alcohol consumption, coexisting illnesses, and treatment modalities. Materials and Methods: This was a retrospective study conducted over a three-year period (2012-2014) in two hospitals affiliated with Kasturba Medical College in Mangalore, South India. A semi-structured proforma was used to capture information from the medical records pertaining to the various objectives of the study, such as clinical features and history of smoking. Data were analysed using SPSS version 16.0, and the results obtained were expressed as percentages. The chi-square test was used to find associations between the variables, and p < 0.05 was considered statistically significant. Results: 104 patients were found to have TB of the head and neck, and among them the most common manifestation was tubercular lymphadenitis (86.53%), followed by laryngeal TB (4.8%), submandibular gland TB (3.8%), deep neck space abscess (3.8%) and adenotonsillar TB. FNAC was found to be the gold standard for the diagnosis of TB of the lymph node. 26% of the patients had coexisting HIV infection, and 16.3% of the patients had associated pulmonary TB. More than 20% of the patients were smokers. Most patients were treated with ATT. Conclusion: Tuberculosis affecting regions of the head and neck is no longer uncommon. Sufficient knowledge and appropriate diagnostic means are required when dealing with these lesions, and TB must be included in the differential diagnosis of pathological lesions of the head and neck.
Keywords: FNAC, Mangalore, smoking, tuberculosis
Procedia PDF Downloads 278
2048 O-LEACH: The Problem of Orphan Nodes in the LEACH of Routing Protocol for Wireless Sensor Networks
Authors: Wassim Jerbi, Abderrahmen Guermazi, Hafedh Trabelsi
Abstract:
The optimum use of coverage in wireless sensor networks (WSNs) is very important. The LEACH protocol, Low Energy Adaptive Clustering Hierarchy, presents a hierarchical clustering algorithm for wireless sensor networks that allows the formation of distributed clusters. In each cluster, LEACH randomly selects some sensor nodes called cluster heads (CHs); the selection of CHs is made with a probabilistic calculation. It is assumed that each non-CH node joins a cluster and becomes a cluster member. Nevertheless, the CHs can become concentrated in a specific part of the network, so that several sensor nodes cannot reach any CH. To solve this problem, we created the O-LEACH (Orphan nodes LEACH) protocol, whose role is to reduce the number of sensor nodes that do not belong to any cluster. A cluster member called a gateway receives messages from neighboring orphan nodes and informs its CH that these neighboring nodes do not belong to any group. The gateway (CH') then attaches the orphaned nodes to the cluster and collects their data. O-LEACH enables the formation of a new kind of cluster and leads to a long network lifetime and minimal energy consumption. Orphan nodes possess enough energy and seek to be covered by the network. The principal novel contribution of the proposed work is the O-LEACH protocol, which provides coverage of the whole network with a minimum number of orphaned nodes and has a very high connectivity rate. As a result, the WSN application receives data from the entire network, including the orphan nodes. The proper functioning of the application therefore requires intelligent management of the resources present within each network sensor. The simulation results show that O-LEACH performs better than LEACH in terms of coverage, connectivity rate, energy and scalability.
Keywords: WSNs; routing; LEACH; O-LEACH; orphan nodes; sub-cluster; gateway; CH'
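The probabilistic cluster-head election that both LEACH and O-LEACH rely on is the standard threshold rule from Heinzelman et al., sketched below; the orphan-node and gateway handling that O-LEACH adds on top of it is not reproduced here, and the desired CH percentage and node count are assumed values.

```python
import random

def leach_threshold(p, round_number, was_ch_recently):
    """LEACH threshold T(n): nodes that have not served as CH in the last 1/p
    rounds elect themselves with this probability; others get probability 0."""
    if was_ch_recently:
        return 0.0
    return p / (1 - p * (round_number % int(round(1 / p))))

def elect_cluster_heads(node_ids, p=0.05, round_number=0, recent_chs=frozenset()):
    """Return the nodes that elect themselves cluster head in this round."""
    return [n for n in node_ids
            if random.random() < leach_threshold(p, round_number, n in recent_chs)]

random.seed(1)
print(elect_cluster_heads(range(100)))   # roughly p*N nodes become CHs per round
```

Because this election is purely random, the elected CHs can cluster geographically, which is exactly the orphan-node situation that O-LEACH's gateway mechanism is designed to repair.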
Procedia PDF Downloads 371
2047 “Presently”: A Personal Trainer App to Self-Train and Improve Presentation Skills
Authors: Shyam Mehraaj, Samanthi E. R. Siriwardana, Shehara A. K. G. H., Wanigasinghe N. T., Wandana R. A. K., Wedage C. V.
Abstract:
A presentation is a critical tool for conveying not just spoken information but also a wide spectrum of human emotions, and the single most effective way to make a presentation successful is to practice it beforehand. Preparing for a presentation has been shown to be essential for improving emotional control, intonation and prosody, pronunciation, and vocabulary, as well as the quality of the presentation slides. As a result, practicing has become one of the most critical parts of giving a good presentation. In this research, the main focus is on analyzing the audio, video, and slides of presentations uploaded by presenters. The proposed solution is based on natural language processing and computer vision techniques and caters to the presenter's need to rehearse a presentation beforehand using a mobile-responsive web application. The proposed system assists in practicing the presentation beforehand by identifying the presenter's emotions, body language, tonality, prosody, pronunciation and vocabulary, and the quality of the presentation slides. Overall, the system gives the presenter a rating and feedback about the performance so that presenters can improve their presentation skills.
Keywords: presentation, self-evaluation, natural language processing, computer vision
Procedia PDF Downloads 118
2046 High Level Synthesis of Canny Edge Detection Algorithm on Zynq Platform
Authors: Hanaa M. Abdelgawad, Mona Safar, Ayman M. Wahba
Abstract:
Real-time image and video processing is in demand in many computer vision applications, e.g. video surveillance, traffic management and medical imaging, and the processing in these video applications requires high computational power. Therefore, the optimal solution is collaboration between the CPU and hardware accelerators. In this paper, a Canny edge detection hardware accelerator is proposed. Canny edge detection is one of the common blocks in the pre-processing phase of image and video processing pipelines. The presented approach targets offloading the Canny edge detection algorithm from the processing system (PS) to the programmable logic (PL), taking advantage of the High Level Synthesis (HLS) tool flow to accelerate the implementation on the Zynq platform. The resulting implementation enables up to a 100x performance improvement through hardware acceleration. CPU utilization drops, and the frame rate reaches 60 fps on a 1080p full HD input video stream.
Keywords: high level synthesis, canny edge detection, hardware accelerators, computer vision
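For reference, the algorithm being offloaded can be exercised in software with OpenCV as sketched below; the hysteresis thresholds, blur kernel, and random placeholder frame are assumptions, and the HLS C/C++ accelerator itself is not shown.

```python
import cv2
import numpy as np

# Placeholder 1080p frame; in practice this would come from the input video stream.
frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 1.4)               # noise-reduction stage
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)   # hysteresis thresholds (assumed)
print(edges.shape, edges.dtype)                             # same resolution, binary edge map
```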
Procedia PDF Downloads 478
2045 Improving Digital Data Security Awareness among Teacher Candidates with Digital Storytelling Technique
Authors: Veysel Çelik, Aynur Aker, Ebru Güç
Abstract:
Developments in information and communication technologies have increased both the speed of producing information and the speed of accessing new information, and the daily lives of individuals have changed accordingly. New concepts such as e-mail, e-government, e-school and e-signature have emerged. For this reason, prospective teachers, who will be future teachers or school administrators, are expected to have a high awareness of digital data security. The aim of this study is to reveal the effect of the digital storytelling technique on the data security awareness of pre-service teachers in computer and instructional technology education departments. Participants were selected, on the principle of volunteering, from among third-year students studying in the Department of Computer and Instructional Technologies of the Faculty of Education at Siirt University. The research used the pretest/posttest semi-experimental research model, one of the experimental research models. Within this framework, a 6-week lesson plan on digital data security awareness was prepared in accordance with the digital storytelling technique. Students in the experimental group formed groups of 3-6 people among themselves, and the groups were asked to prepare short videos or animations on digital data security awareness. The completed videos were watched and evaluated together with the prospective teachers during the evaluation process, which lasted approximately 2 hours. Both quantitative and qualitative data collection tools were used: a digital data security awareness scale and a semi-structured interview form consisting of open-ended questions developed by the researchers. According to the data obtained, the digital storytelling technique was effective in creating data security awareness and in producing permanent behavior change among computer and instructional technology students.
Keywords: digital storytelling, self-regulation, digital data security, teacher candidates, self-efficacy
Procedia PDF Downloads 126
2044 Challenges of Online Education and Emerging E-Learning Technologies in Nigerian Tertiary Institutions Using Adeyemi College of Education as a Case Study
Authors: Oluwatofunmi Otobo
Abstract:
This paper presents a review of the challenges of e-learning and e-learning technologies in tertiary institutions. The review is based on the researcher's observations of the challenges of making use of ICT for learning in Nigeria, using Adeyemi College of Education as a case study, in comparison with tertiary institutions in the UK, US and other more developed countries. In Nigeria, and probably Africa as a whole, power is the major challenge; its inconsistency and fluctuations pose the greatest obstacle to making use of online education inside and outside the classroom. The internet and its supporting infrastructure in many places in Nigeria are slow and unreliable, which in turn frustrates any attempt to make use of online education and e-learning technologies. Lack of basic knowledge of computers, their technologies and facilities can also prove to be a challenge, as many young people are still not computer literate. Personal interest, on the part of both lecturers and students, is also a challenge: many people are not interested in learning how to make use of technologies, which makes them resistant to changing from the old ways of doing things. These and other challenges are reviewed in this paper, and suggestions and recommendations are proffered.
Keywords: education, e-learning, Nigeria, tertiary institutions
Procedia PDF Downloads 198
2043 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria
Authors: Isaac Kayode Ogunlade
Abstract:
Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller communicating with a personal computer (PC) through USB (Universal Serial Bus). The research applied knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device that uses an LM35 sensor to measure weather parameters, together with Artificial Intelligence (an Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) in order to evaluate its performance. Both devices (standard and designed) were subjected to 180 days under the same atmospheric conditions for data collection (temperature, relative humidity, and pressure). The acquired data were trained in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the coefficient of determination (R2), and Mean Percentage Error (MPE) were used as standard evaluation metrics to assess the performance of the models in predicting precipitation. The results from the operation of the developed device show that the device has an efficiency of 96% and is also compatible with personal computers (PCs) and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) revealed a disparity error of 1.59%, while ARIMA's is 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device
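The four evaluation metrics named above can be computed as sketched below; the definitions follow the usual conventions (the paper's exact formulas are not given), and the rainfall values are placeholders, not data from the study.

```python
import numpy as np

def evaluate(observed, predicted):
    """RMSE, MAE, coefficient of determination (R2) and mean percentage error."""
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    error = observed - predicted
    rmse = np.sqrt(np.mean(error ** 2))
    mae = np.mean(np.abs(error))
    ss_res = np.sum(error ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    mpe = 100 * np.mean(error / observed)       # assumes no zero observations
    return {"RMSE": rmse, "MAE": mae, "R2": r2, "MPE (%)": mpe}

# Placeholder daily rainfall values (mm) for a short validation window:
print(evaluate(observed=[12.0, 0.5, 7.3, 3.1, 9.8],
               predicted=[10.5, 0.8, 8.0, 2.6, 9.1]))
```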
Procedia PDF Downloads 91