Search results for: computer node
2612 Evaluation of the Self-Efficacy and Learning Experiences of Final year Students of Computer Science of Southwest Nigerian Universities
Authors: Olabamiji J. Onifade, Peter O. Ajayi, Paul O. Jegede
Abstract:
This study aimed at investigating the preparedness of undergraduate final year students of Computer Science as the next entrants into the workplace. It assessed their self-efficacy in computational tasks and examined the relationship between their self-efficacy and their learning experiences in Southwest Nigerian universities. The study employed a descriptive survey research design. The population of the study comprised all the final year students of Computer Science. A purposive sampling technique was adopted in selecting a representative sample of interest from the final year students of Computer Science. The Students' Computational Task Self-Efficacy Questionnaire (SCTSEQ) was used to collect data. Mean, standard deviation, frequency, percentages, and linear regression were used for data analysis. The results revealed that the final year students of Computer Science were moderately confident in performing computational tasks and that there is a significant relationship between the students' learning experiences and their self-efficacy. The study recommends that the curriculum be improved to accommodate industry experts as lecturers in some of the courses, provide more practical sessions, and treat the students' learning experiences as an important component of the undergraduate Computer Science curriculum development process.
Keywords: computer science, learning experiences, self-efficacy, students
Procedia PDF Downloads 143
2611 A Cloud-Based Federated Identity Management in Europe
Authors: Jesus Carretero, Mario Vasile, Guillermo Izquierdo, Javier Garcia-Blas
Abstract:
Currently, there is a so-called 'identity crisis' in cybersecurity caused by the substantial security, privacy, and usability shortcomings of existing identity management systems. Federated Identity Management (FIM) could be a solution to this crisis, as it facilitates the management of identity processes and policies among collaborating entities without enforcing a global consistency, which is difficult to achieve when ID legacy systems exist. To cope with this problem, the Connecting Europe Facility (CEF) initiative proposed a federated solution in 2014 in anticipation of the adoption of Regulation (EU) N°910/2014, the so-called eIDAS Regulation. At present, a network of eIDAS Nodes is being deployed at the European level to allow every citizen recognized by a member state to be recognized within the trust network at the European level, enabling the consumption of services in other member states that, until now, were not available or whose concession was tedious. This is a very ambitious approach, since it enables cross-border authentication of member state citizens without the need to unify the authentication method (eID Scheme) of the member state in question. However, this federation is currently managed by member states, and it is initially applied only to citizens and public organizations. The goal of this paper is to present the results of a European project, named eID@Cloud, that focuses on the integration of eID in 5 cloud platforms belonging to authentication service providers of different EU member states acting as Service Providers (SP) for private entities. We propose an initiative based on a private eID Scheme for both natural and legal persons. The methodology followed in the eID@Cloud project is that each Identity Provider (IdP) is subscribed to an eIDAS Node Connector, requesting authentication, which in turn is subscribed to an eIDAS Node Proxy Service, issuing authentication assertions.
To cope with high loads, load balancing is supported in the eIDAS Node. The eID@Cloud project is still ongoing, but we already have some important outcomes. First, we have deployed the federated identity nodes and tested them from the security and performance points of view. The pilot prototype has shown the feasibility of deploying this kind of system, ensuring good performance due to the replication of the eIDAS nodes and the load balancing mechanism. Second, our solution avoids the propagation of identity data out of the native domain of the user or entity being identified, which avoids well-known cybersecurity problems due to network interception, man-in-the-middle attacks, etc. Last, but not least, this system allows any country or collectivity to be connected easily, providing incremental development of the network and avoiding difficult political negotiations to agree on a single authentication format (which would be a major stopper).
Keywords: cybersecurity, identity federation, trust, user authentication
Procedia PDF Downloads 166
2610 Scalable Performance Testing: Facilitating the Assessment of Application Performance under Substantial Loads and Mitigating the Risk of System Failures
Authors: Solanki Ravirajsinh
Abstract:
In the software testing life cycle, failing to conduct thorough performance testing can result in significant losses for an organization due to application crashes and improper behavior under high user loads in production. Simulating large volumes of requests, such as 5 million within 5-10 minutes, is challenging without a scalable performance testing framework. Leveraging cloud services to implement a performance testing framework makes it feasible to handle 5-10 million requests in just 5-10 minutes, helping organizations ensure their applications perform reliably under peak conditions. Implementing a scalable performance testing framework using cloud services and tools like JMeter, EC2 instances (virtual machines), cloud logs (for monitoring errors and logs), EFS (file storage), and security groups offers several key benefits for organizations. A performance test framework built with this approach optimizes resource utilization, enables effective benchmarking, increases reliability, and saves costs by resolving performance issues before the application is released. In performance testing, a master-slave framework facilitates distributed testing across multiple EC2 instances to emulate many concurrent users and efficiently handle high loads. The master node orchestrates the test execution by coordinating with multiple slave nodes to distribute the workload. Slave nodes execute the test scripts provided by the master node, with each node handling a portion of the overall user load and generating requests to the target application or service. By leveraging JMeter's master-slave framework in conjunction with cloud services like EC2 instances, EFS, CloudWatch logs, security groups, and command-line tools, organizations can achieve superior scalability and flexibility in their performance testing efforts. In this master-slave framework, JMeter must be installed on both the master and each slave EC2 instance.
The master EC2 instance functions as the "brain," while the slave instances operate as the "body parts." The master directs each slave to execute a specified number of requests. Upon completion of the execution, the slave instances transmit their results back to the master. The master then consolidates these results into a comprehensive report detailing metrics such as the number of requests sent, encountered errors, network latency, response times, server capacity, throughput, and bandwidth. Leveraging cloud services, the framework benefits from automatic scaling based on the volume of requests. Notably, integrating cloud services allows organizations to handle more than 5-10 million requests within 5 minutes, depending on the server capacity of the hosted website or application.
Keywords: identifying application crashes under heavy load, JMeter with cloud services, scalable performance testing, JMeter master and slave using cloud services
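As an illustrative aside (not part of the framework above), the master's two duties described here — sharding a total request count across slave nodes and consolidating their results — can be sketched in Python; the function and field names are hypothetical:

```python
# Hypothetical sketch of the master's role: shard a total request count
# as evenly as possible across worker ("slave") nodes, then merge the
# per-node results into one summary report.

def shard_requests(total_requests, num_workers):
    """Split total_requests among num_workers, differing by at most 1."""
    base, remainder = divmod(total_requests, num_workers)
    # The first `remainder` workers take one extra request each.
    return [base + (1 if i < remainder else 0) for i in range(num_workers)]

def consolidate(per_worker_results):
    """Merge per-worker result dicts into a single summary report."""
    report = {"requests": 0, "errors": 0}
    for result in per_worker_results:
        report["requests"] += result["requests"]
        report["errors"] += result["errors"]
    return report

shares = shard_requests(5_000_000, 8)  # e.g. 8 slave EC2 instances
```

In the real framework, JMeter handles this distribution itself; the sketch only shows the arithmetic of the workload split.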
Procedia PDF Downloads 27
2609 Stimulating the Social Interaction Development of Children through Computer Play Activities: The Role of Teachers
Authors: Mahani Razali, Abd Halim Masnan, Nordin Mamat, Seah Siok Peh
Abstract:
This research is based on three main objectives: to identify children's social interaction behaviour during computer play activities, to examine the teacher's role, and to explore teachers' beliefs, views, and knowledge about computer use in four Malaysian pre-schools. This qualitative study was carried out among 25 pre-school children and three teachers as the research sample. The data collection procedures involved structured observation, which was used to identify social interaction behaviour among pre-school children during computer play activities, and semi-structured interviews, which were conducted to study the teachers' perceptions of the children's acquisition of social interaction behaviour. A variety of patterns can be seen within the peer interactions, indicating that children exhibit a vast range of social interactions at the computer, and these varied each day. The findings of this study lead to certain conclusions, which have implications for understanding how computers were used and how they relate to the children's social interactions in the four Malaysian preschools. This study provides evidence that the children's social interactions with peers and adults were mediated by the children's engagement in the computer environments.
Keywords: computer, play, preschool, social interaction
Procedia PDF Downloads 299
2608 Addressing Scheme for IoT Network Using IPv6
Authors: H. Zormati, J. Chebil, J. Bel Hadj Taher
Abstract:
The goal of this paper is to present an addressing scheme that allows a unique IPv6 address to be assigned to each node in the Internet of Things (IoT) network. This scheme guarantees uniqueness by extracting the clock skew of each communication device and converting it into an IPv6 address. Simulation analysis confirms that the presented scheme provides reductions in energy consumption, communication overhead, and response time compared to four existing addressing schemes: Strong DAD, LEADS, SIPA, and CLOSA.
Keywords: addressing, IoT, IPv6, network, nodes
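The abstract does not detail the skew-to-address mapping, so the following is a purely hypothetical Python sketch of the idea: a device's measured clock skew (here taken in parts per billion) is embedded into the 64-bit interface identifier under a /64 prefix. The encoding is an assumption, not the paper's scheme.

```python
# Hypothetical sketch only: encode a device's measured clock skew into
# the 64-bit interface identifier of an IPv6 address. The paper's actual
# encoding is not specified in the abstract.
import ipaddress

def skew_to_ipv6(prefix, skew_ppb):
    """Map a clock skew (parts per billion, possibly negative) to an
    address under the given /64 prefix via a 64-bit two's complement."""
    interface_id = skew_ppb & 0xFFFFFFFFFFFFFFFF  # low 64 bits
    network = ipaddress.IPv6Network(prefix)
    return ipaddress.IPv6Address(int(network.network_address) | interface_id)

addr = skew_to_ipv6("2001:db8::/64", -1500)
```

Two devices with distinct skews then receive distinct addresses under the same prefix, which is the uniqueness property the scheme relies on.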
Procedia PDF Downloads 293
2607 Approach-Avoidance and Intrinsic-Extrinsic Motivation of Adolescent Computer Games Players
Authors: Monika Paleczna, Barbara Szmigielska
Abstract:
The period of adolescence is a time when young people become increasingly active and conscious users of the digital world. One of the activities they undertake most frequently is playing computer games. Young players can choose from a wide range of games, including action, adventure, strategy, and logic games. The main aim of this study is to answer the question of what motivates teenage players: what motivates them to play computer games in general, and what motivates them to play a particular game. Fifty adolescents aged 15-17 participated in the study. They completed a questionnaire in which they indicated what motivates them to play, how often they play computer games, and what type of computer games they play most often. It was found that entertainment and learning English are among the most important motives. The most important game-specific factors are familiarity with a game's previous installments and the ability to play for free. The motives chosen by the players are described in relation to the concepts of intrinsic and extrinsic as well as approach and avoidance motivation. An additional purpose of this study is to present data concerning preferences regarding the type of games and the amount of time spent playing.
Keywords: computer games, motivation, game preferences, adolescence
Procedia PDF Downloads 184
2606 Students' Competencies in the Use of Computer Assistive Technology at Akropong School for the Blind in the Eastern Region of Ghana
Authors: Joseph Ampratwum, Yaw Nyadu Offei, Afua Ntoaduro, Frank Twum
Abstract:
The use of computer assistive technology has captured the attention of individuals with visual impairment. Children with visual impairments who are tactual learners have one unique need that is quite different from all other disability groups: they depend on computer assistive technology for reading, writing, and receiving and sending information. The objective of the study was to assess students' competencies in the use of computer assistive technology at Akropong School for the Blind in Ghana. This became necessary because little research has been conducted to document the competencies and challenges in the use of computers among students with visual impairments in Africa. A case study design with a mixed research strategy was adopted for the study. A purposive sampling technique was used to sample 35 students from Akropong School for the Blind in the eastern region of Ghana. The researcher gathered both quantitative and qualitative data to measure students' competencies in keyboarding skills and Job Access With Speech (JAWS), as well as the related challenges. The findings indicated that students' competency in keyboarding skills was higher than in the use of the JAWS application; students had thus reached higher stages in the conscious competence matrix in the former than in the latter. It was generally noted that the challenges limiting effective use of computer assistive technology in the school were more personal than external, because most of them were due to individual responses to the training and familiarity gained in developing competencies with computer assistive technology. Based on this, it was recommended that efforts be made to stock the laboratory with additional computers. Directly in line with the first recommendation, it was further suggested that more practice time be created for the students to maximize computer use.
In addition, a licensed copy of JAWS should be acquired by the school to advance students' competence in using computer assistive technology.
Keywords: computer assistive technology, Job Access With Speech, keyboard, visual impairment
Procedia PDF Downloads 339
2605 A Numerical Model for Simulation of Blood Flow in Vascular Networks
Authors: Houman Tamaddon, Mehrdad Behnia, Masud Behnia
Abstract:
An accurate study of blood flow requires an accurate vascular pattern and the geometrical properties of the organ of interest. Due to the complexity of vascular networks and poor accessibility in vivo, it is challenging to reconstruct the entire vasculature of any organ experimentally. The objective of this study is to introduce an innovative approach for the reconstruction of a full vascular tree from available morphometric data. Our method consists of applying morphometric data to those parts of the vascular tree that are smaller than the resolution of medical imaging methods. This technique reconstructs the entire arterial tree down to the capillaries. Vessels greater than 2 mm are obtained from direct volume and surface analysis using contrast-enhanced computed tomography (CT). Vessels smaller than 2 mm are reconstructed from available morphometric and distensibility data and rearranged by applying Murray's law. Implementing morphometric data to reconstruct the branching pattern and applying Murray's law at every vessel bifurcation simultaneously lead to an accurate vascular tree reconstruction. The reconstruction algorithm generates the full arterial tree topography down to the first capillary bifurcation. The geometry of each order of the vascular tree is generated separately to minimize the construction and simulation time. The node-to-node connectivity, along with the diameter and length of every vessel segment, is established, and order numbers, according to the diameter-defined Strahler system, are assigned. During the simulation, we used the averaged flow rate for each order to predict the pressure drop, and once the pressure drop was predicted, the flow rate was corrected to match the computed pressure drop for each vessel. The final results for 3 cardiac cycles are presented and compared to the clinical data.
Keywords: blood flow, morphometric data, vascular tree, Strahler ordering system
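Murray's law, as applied at each bifurcation above, states that the cube of the parent vessel's radius equals the sum of the cubes of the daughter radii (r_parent³ = r₁³ + r₂³). A minimal sketch for a symmetric branching, with example values only:

```python
# Illustrative use of Murray's law (r_parent^3 = sum of daughter r^3)
# to size the daughters of a symmetric n-way branching, as done at each
# vessel bifurcation during tree reconstruction. Values are examples.

def daughter_radius(parent_radius, n_daughters=2):
    """Radius of each daughter in a symmetric n-way branching that
    conserves r^3 per Murray's law."""
    return parent_radius / n_daughters ** (1.0 / 3.0)

r = daughter_radius(1.0)   # parent radius 1.0 mm -> each daughter ~0.794 mm
check = 2 * r ** 3         # recovers the parent's r^3 = 1.0
```

In the asymmetric case the same cube-conservation constraint is solved together with the morphometric branching ratios rather than with an even split.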
Procedia PDF Downloads 272
2604 Assessing Basic Computer Applications' Skills of College-Level Students in Saudi Arabia
Authors: Mohammed A. Gharawi, Majed M. Khoja
Abstract:
This paper is a report on the findings of a study conducted at the Institute of Public Administration (IPA) in Saudi Arabia. The study applied both qualitative and quantitative research methods to assess the levels of basic computer applications' skills among students enrolled in the institution's preparatory programs. Qualitative data were collected through semi-structured interviews with instructors who had previously been assigned to teach introductory information technology courses. Quantitative data were collected by administering a self-report questionnaire and a written statistical test. 380 enrolled students responded to the questionnaire and 142 completed the statistical test. The results indicate a lack of the skills necessary to deal with computer applications among most of the students enrolled in the IPA's preparatory programs.
Keywords: assessment, computer applications, computer literacy, Institute of Public Administration, Saudi Arabia
Procedia PDF Downloads 315
2603 B4A Is One of the Best Programming Software for Surveyor Engineers
Authors: Ali Mohammadi
Abstract:
Many engineers use programs installed on a computer, but with the arrival of the mobile phone and the possibility of designing apps, many Android programs can be designed that are similar to programs installed on a computer; beyond telephone communication and photography, the mobile phone thus offers more practical uses. Engineers are one of the groups that can use specialized apps to reduce the need to go to the office and the computer, and B4A can be considered one of the simplest tools for designing such apps. This article introduces a number of surveying apps designed using B4A and the impact that using these apps has on productivity in this field of engineering.
Keywords: app, tunnel, total station, map
Procedia PDF Downloads 48
2602 Computation and Validation of the Stress Distribution around a Circular Hole in a Slab Undergoing Plastic Deformation
Authors: Sherif D. El Wakil, John Rice
Abstract:
The aim of the current work was to employ the finite element method to model a slab, with a small hole across its width, undergoing plastic plane strain deformation. The computational model had, however, to be validated by comparing its results with those obtained experimentally. Since they were in good agreement, the finite element method can therefore be considered a reliable tool that can help gain better understanding of the mechanism of ductile failure in structural members having stress raisers. The finite element software used was ANSYS, and the PLANE183 element was utilized. It is a higher order 2-D, 8-node or 6-node element with quadratic displacement behavior. A bilinear stress-strain relationship was used to define the material properties, with constants similar to those of the material used in the experimental study. The model was run for several tensile loads in order to observe the progression of the plastic deformation region, and the stress concentration factor was determined in each case. The experimental study involved employing the visioplasticity technique, where a circular mesh (each circle was 0.5 mm in diameter, with 0.05 mm line thickness) was initially printed on the side of an aluminum slab having a small hole across its width. Tensile loading was then applied to produce a small increment of plastic deformation. Circles in the plastic region became ellipses, where the directions of the principal strains and stresses coincided with the major and minor axes of the ellipses. Next, we were able to determine the directions of the maximum and minimum shear stresses at the center of each ellipse, and the slip-line field was then constructed. We were then able to determine the stress at any point in the plastic deformation zone, and hence the stress concentration factor. 
The experimental results were found to be in good agreement with the analytical ones.
Keywords: finite element method to model a slab, slab undergoing plastic deformation, stress distribution around a circular hole, visioplasticity
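For reference, the classical elastic counterpart of the measured quantity is the Kirsch (1898) solution, which gives a stress concentration factor of 3 at the edge of a circular hole in an infinite plate under remote uniaxial tension; the plastic-deformation values obtained in the study differ from this elastic baseline. A quick numerical check of the elastic formula:

```python
# Kirsch hoop stress around a circular hole of radius a in an infinite
# ELASTIC plate under remote uniaxial tension sigma (not the plastic
# case studied above; included as the classical reference solution).
import math

def kirsch_hoop_stress(sigma, a, r, theta):
    """sigma_theta(r, theta) with theta measured from the loading axis."""
    k = a / r
    return sigma / 2 * ((1 + k ** 2) - (1 + 3 * k ** 4) * math.cos(2 * theta))

scf = kirsch_hoop_stress(1.0, 1.0, 1.0, math.pi / 2)  # hole edge, 90 deg
```

At the hole edge (r = a, θ = 90°) this evaluates to 3σ, the elastic stress concentration factor; far from the hole the remote field σ is recovered.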
Procedia PDF Downloads 319
2601 Heart Rate Variability Analysis for Early Stage Prediction of Sudden Cardiac Death
Authors: Reeta Devi, Hitender Kumar Tyagi, Dinesh Kumar
Abstract:
In the present scenario, cardiovascular problems are a growing challenge for researchers and physiologists. As heart diseases have no geographic, gender, or socioeconomic specific causes, detecting cardiac irregularities at an early stage followed by quick and correct treatment is very important. The electrocardiogram is the finest tool for continuous monitoring of heart activity. Heart rate variability (HRV) is used to measure naturally occurring oscillations between consecutive cardiac cycles. Analysis of this variability is carried out using time domain, frequency domain, and non-linear parameters. This paper presents HRV analysis of an online dataset for normal sinus rhythm (taken as the healthy subjects) and sudden cardiac death (SCD subjects) using all three methods, computing values for parameters such as the standard deviation of normal-to-normal intervals (SDNN), the root mean square of successive differences between adjacent RR intervals (RMSSD), and the mean of R to R intervals (mean RR) in the time domain; very low frequency (VLF), low frequency (LF), high frequency (HF), and the ratio of low to high frequency (LF/HF ratio) in the frequency domain; and the Poincaré plot for non-linear analysis. To differentiate the HRV of healthy subjects from subjects who died of SCD, a k-nearest neighbor (k-NN) classifier was used because of its high accuracy. Results show highly reduced values for all stated parameters for SCD subjects as compared to healthy ones. As the dataset used for SCD patients consists of recordings of their ECG signal one hour prior to death, it is thereby verified, with an accuracy of 95%, that the proposed algorithm can identify a patient's mortality risk one hour before death. The identification of a patient's mortality risk at such an early stage may prevent sudden death if timely and correct treatment is given by the doctor.
Keywords: early stage prediction, heart rate variability, linear and non-linear analysis, sudden cardiac death
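The time-domain parameters named in the abstract can be computed directly from a list of RR intervals; a minimal Python sketch with made-up RR intervals (in ms), using the population forms of SDNN and RMSSD:

```python
# Time-domain HRV parameters (mean RR, SDNN, RMSSD) from RR intervals.
# The RR values below are illustrative, not the study's data.
import math

def hrv_time_domain(rr_ms):
    mean_rr = sum(rr_ms) / len(rr_ms)
    # SDNN: standard deviation of the normal-to-normal intervals
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / len(rr_ms))
    # RMSSD: root mean square of successive differences
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {"mean_rr": mean_rr, "sdnn": sdnn, "rmssd": rmssd}

metrics = hrv_time_domain([800, 810, 790, 805, 795])
```

For SCD subjects the study reports markedly smaller SDNN and RMSSD than for healthy subjects, which is what the k-NN classifier exploits.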
Procedia PDF Downloads 339
2600 A Highly Efficient Broadcast Algorithm for Computer Networks
Authors: Ganesh Nandakumaran, Mehmet Karaata
Abstract:
A wave is a distributed execution, often made up of a broadcast phase followed by a feedback phase, requiring the participation of all the system processes before a particular event, called the decision, is taken. Wave algorithms with one initiator, such as the 1-wave algorithm, have been shown to be very efficient for broadcasting messages in tree networks. Extensions of this algorithm broadcasting a sequence of waves using a single initiator have been implemented in algorithms such as the m-wave algorithm. However, as the network size increases, having a single initiator adversely affects the message delivery times to nodes further away from the initiator. As a remedy, broadcast waves can be initiated by multiple initiator nodes distributed across the network to reduce the completion time of broadcasts. These waves, initiated by one or more initiator processes, form a collection of waves covering the entire network. Solutions to global snapshots, distributed broadcast, and various synchronization problems can be obtained efficiently using waves with multiple concurrent initiators. In this paper, we propose the first stabilizing multi-wave sequence algorithm implementing waves started by multiple initiator processes such that every process in the network receives at least one sequence of broadcasts. Being stabilizing, the proposed algorithm can withstand transient faults and does not require initialization. We view a fault as transient if it perturbs the configuration of the system but not its program.
Keywords: distributed computing, multi-node broadcast, propagation of information with feedback and cleaning (PFC), stabilization, wave algorithms
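A single broadcast wave with feedback on a tree, the building block described above, can be sketched in a few lines; this is an illustrative centralized simulation, not the paper's stabilizing multi-initiator algorithm:

```python
# Minimal sketch of one wave on a tree: the broadcast phase propagates
# the message from the initiator toward the leaves, and the feedback
# phase returns acknowledgements before the decision event is taken.

def wave(tree, node, message, received):
    """Propagate `message` from `node` down `tree` (child-adjacency
    dict); return True once every node in the subtree has acknowledged."""
    received.add(node)  # node delivers the message
    # Feedback: the node acknowledges only after all its children have.
    return all(wave(tree, child, message, received)
               for child in tree.get(node, []))

tree = {"root": ["a", "b"], "a": ["c", "d"], "b": []}
acks = set()
decision = wave(tree, "root", "hello", acks)  # True once all nodes ack
```

With multiple initiators, several such waves run concurrently over different regions, which is what shortens delivery times to distant nodes.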
Procedia PDF Downloads 504
2599 Implementation of Computer-Based Technologies into Foreign Language Teaching Process
Authors: Golovchun Aleftina, Dabyltayeva Raikhan
Abstract:
Nowadays, in a world of widely developing cross-cultural interactions and rapidly changing demands of the global labor market, foreign language teaching and learning has taken on a special role, not only in school education but also in everyday life. The Cognitive Lingua-Cultural Methodology of Foreign Language Teaching, originated in Kazakhstan, brings a communicative approach to the forefront of foreign language teaching, which gives rise to a variety of techniques for making language learning real communication. One of these techniques is Computer Assisted Language Learning. In our article, we aim to: demonstrate what learning benefits students are likely to get when teachers implement computer-based technologies into the foreign language teaching process; prove that the technology-based classroom serves as the best tool for interactive and efficient language learning; and give examples of efficient classroom organization with computer-based activities.
Keywords: computer assisted language learning, learning benefits, foreign language teaching process, implementation, communicative approach
Procedia PDF Downloads 473
2598 Comparison of the Indocyanine Green Dye Method versus the Combined Method of Indigo Carmine Blue Dye with Indocyanine Green Fluorescence Imaging for Sentinel Lymph Node Biopsy in Breast Conservative Therapy for Early Breast Cancer
Authors: Nobuyuki Takemoto, Ai Koyanagi, Masanori Yasuda, Hiroshi Yamamoto
Abstract:
Background: Fluorescence imaging (FI) is one of the methods to identify sentinel lymph nodes (SLNs). However, the procedure is technically complicated and requires procedural skills, as SLN biopsy must be conducted in dim light conditions. As an improved version of this method, we introduced a combined method (Combined mixed dye and fluorescence; CMF) consisting of indigo carmine blue dye and FI. The direct visualization of SLNs under shadowless surgical light conditions is facilitated by the addition of the blue dye. We compared the SLN detection rates of CMF with that of the indocyanine green (ICG) dye method (ICG-D). Methods: A total of 202 patients with stage ≤ IIA breast cancer who underwent breast conservative therapy with separate incision from January 2004 to February 2017 were reviewed. Details of the two methods are as follows: (1) ICG-D: 2ml of ICG (10mg) was used and the green-stained SLNs were resected via a 3-4cm axillary incision; (2) CMF: A combination of 1ml of ICG (5mg) and 1-3ml of indigo carmine (4-12mg) was used. Using Photodynamic Eye (PDE), a 1.5-2 cm incision was made near the point of disappearance of the fluorescence and SLNs with intermediate color of blue and green were resected. Results: There were 92 ICG-D and 110 CMF cases. CMF resulted in a significantly higher detection rate than ICG-D (96.4% vs. 83.7%; p=0.003). This difference was particularly notable in those aged ≥ 60 years (98.3% vs. 74.3%) and individuals with BMI ≥ 25kg/m2 (90.3% vs. 58.3%). Conclusion: CMF is an effective method to identify SLNs which is safe, efficient, and cost-effective. Furthermore, radiation exposure can be avoided, and it can be performed in institutes without nuclear medicine facilities. CMF achieves a high SLN identification rate, and most of this procedure is feasible under shadowless surgical light conditions. 
CMF can reliably perform SLN biopsy even in those aged ≥ 60 years and individuals with BMI ≥ 25 kg/m2.
Keywords: sentinel lymph node biopsy, identification rate, indocyanine green (ICG), indigo carmine, fluorescence
Procedia PDF Downloads 171
2597 Computer Software for Calculating Electron Mobility of Semiconductor Compounds: Case Study for n-GaN
Authors: Emad A. Ahmed
Abstract:
Computer software to calculate electron mobility with respect to different scattering mechanisms has been developed. The software fully adopts a Graphical User Interface (GUI), designed in Microsoft Visual Basic 6.0. As a case study, the electron mobility of n-GaN was computed using this software. The behaviour of the mobility of n-GaN due to elastic scattering processes and its relation to temperature and doping concentration are discussed. The results agree with other available theoretical and experimental data.
Keywords: electron mobility, relaxation time, GaN, scattering, computer software, computational physics
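The abstract does not specify how the software combines the individual scattering mechanisms; a standard assumption for independent elastic mechanisms is Matthiessen's rule, 1/μ_total = Σᵢ 1/μᵢ, sketched here with made-up mobility values:

```python
# Matthiessen's rule for combining mobilities limited by independent
# scattering mechanisms: 1/mu_total = sum_i 1/mu_i. Whether the software
# above uses exactly this rule is an assumption; values are examples.

def total_mobility(mobilities_cm2_per_vs):
    """Combine per-mechanism mobilities (cm^2/V·s) via Matthiessen's rule."""
    return 1.0 / sum(1.0 / mu for mu in mobilities_cm2_per_vs)

# e.g. ionized-impurity-limited and acoustic-phonon-limited mobilities
mu = total_mobility([1200.0, 800.0])   # -> 480 cm^2/V·s
```

The rule shows why the total mobility is always dominated by the strongest (lowest-mobility) scattering mechanism at a given temperature and doping level.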
Procedia PDF Downloads 670
2596 A Cooperative Signaling Scheme for Global Navigation Satellite Systems
Authors: Keunhong Chae, Seokho Yoon
Abstract:
Recently, global navigation satellite systems (GNSS) such as Galileo and GPS have been employing more satellites to provide a higher degree of accuracy for the location service, thus calling for a more efficient signaling scheme among the satellites used in the overall GNSS network. Spatial diversity is one efficient signaling scheme in that it improves the network throughput; however, it requires multiple antennas, which could significantly increase the complexity of the GNSS. Thus, a diversity scheme called cooperative signaling was proposed, where virtual multiple-input multiple-output (MIMO) signaling is realized using only a single antenna in the transmit satellite of interest and modeling the neighboring satellites as relay nodes. The main drawback of cooperative signaling is that the relay nodes receive the transmitted signal at different time instants, i.e., they operate in an asynchronous way, and thus the overall performance of the GNSS network can degrade severely. To tackle the problem, several modified cooperative signaling schemes were proposed; however, all of them are difficult to implement due to the signal decoding required at the relay nodes. Although the implementation at the relay nodes could be made somewhat simpler by employing time-reversal and conjugation operations instead of signal decoding, it would be more efficient to implement the operations of the relay nodes at the source node, which has more resources than the relay nodes. So, in this paper, we propose a novel cooperative signaling scheme, where the data signals are combined in a unique way at the source node, obviating the need for complex operations such as signal decoding, time-reversal, and conjugation at the relay nodes.
The numerical results confirm that the proposed scheme provides the same cooperative diversity and bit error rate (BER) performance as the conventional scheme, while reducing the complexity at the relay nodes significantly. Acknowledgment: This work was supported by the National GNSS Research Center program of the Defense Acquisition Program Administration and the Agency for Defense Development.
Keywords: global navigation satellite network, cooperative signaling, data combining, nodes
Procedia PDF Downloads 280
2595 The Effect of Computer-Mediated vs. Face-to-Face Instruction on L2 Pragmatics: A Meta-Analysis
Authors: Marziyeh Yousefi, Hossein Nassaji
Abstract:
This paper reports the results of a meta-analysis of studies on the effects of instruction mode on learning second language pragmatics during the last decade (from 2006 to 2016). After establishing the relevant inclusion/exclusion criteria, 39 published studies were retrieved and included in the present meta-analysis. Studies were then coded for face-to-face and computer-assisted modes of instruction. Statistical procedures were applied to obtain effect sizes. It was found that computer-assisted language learning studies generated larger effects than face-to-face instruction.
Keywords: meta-analysis, effect size, L2 pragmatics, comprehensive meta-analysis, face-to-face, computer-assisted language learning
Procedia PDF Downloads 221
2594 A Design System for Complex Profiles of Machine Members Using a Synthetic Curve
Authors: N. Sateesh, C. S. P. Rao, K. Satyanarayana, C. Rajashekar
Abstract:
This paper proposes a CAD/CAM system for complex profiles of various machine members using a synthetic curve, i.e., the B-spline. Conventional methods for designing and manufacturing complex profiles are tedious and time-consuming, and even programming them on a computer numerical control (CNC) machine can be difficult because of the complexity of the profiles. The system developed provides a graphical and numerical B-spline representation of the profile for any given input. In this paper, the system is applied to represent a cam profile with a B-spline, and an attempt is made to improve the follower motion.
Keywords: plate-cams, cam profile, b-spline, computer numerical control (CNC), computer aided design and computer aided manufacturing (CAD/CAM), R-D-R-D (rise-dwell-return-dwell)
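A B-spline curve can be evaluated without any CAD library via de Boor's algorithm. The sketch below is a standard, generic implementation in plain Python (scalar control points for brevity; 2-D profile points work the same way) and is not the authors' system.

```python
def de_boor(k, x, t, c, p):
    """Evaluate a B-spline of degree p at parameter x.
    k: index of the knot interval with t[k] <= x < t[k+1]
    t: knot vector, c: control points (scalars here)."""
    d = [c[j + k - p] for j in range(p + 1)]
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            alpha = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[p]
```

For example, a degree-1 (linear) spline simply interpolates between control points, while a uniform quadratic spline blends three neighboring control points at each parameter value; a cam profile would use 2-D control points and dense parameter sampling to generate CNC tool-path coordinates.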
Procedia PDF Downloads 611
2593 Fog Computing - Network Based Computing
Authors: Navaneeth Krishnan, Chandan N. Bhagwat, Aparajit P. Utpat
Abstract:
Cloud computing provides us a means to upload data and use applications over the internet. As the number of devices connecting to the cloud grows, there is undue pressure on the cloud infrastructure. Fog computing, also called network-based computing or edge computing, allows part of the processing done in the cloud to be moved to the network devices that lie along the path from the node to the cloud. Therefore, the nodes connected to the cloud have a better response time. This paper proposes a method of moving computation from the cloud to the network by introducing an Android-like app store on the networking devices.
Keywords: cloud computing, fog computing, network devices, appstore
Procedia PDF Downloads 387
2592 Analyzing the Street Pattern Characteristics on Young People’s Choice to Walk or Not: A Study Based on Accelerometer and Global Positioning Systems Data
Authors: Ebru Cubukcu, Gozde Eksioglu Cetintahra, Burcin Hepguzel Hatip, Mert Cubukcu
Abstract:
Obesity and overweight cause serious health problems, and public and private organizations aim to encourage walking in various ways in order to cope with them. This study aims to understand how the spatial characteristics of the urban street pattern, its connectivity and complexity, influence young people’s choice to walk or not. 185 public university students in Izmir, the third largest city in Turkey, participated in the study. Each participant wore an accelerometer and a global positioning system (GPS) device for a week. The accelerometer records the intensity of the participant’s activity at a specified time interval, and the GPS device the locations of those activities. Combining the two datasets, activity maps are derived. These maps are then used to differentiate the participants’ walk trips from their motor vehicle trips. Given that, the frequencies of walk and motor vehicle trips are calculated at the street segment level, and the street segments are then categorized into two groups: ‘preferred by pedestrians’ and ‘preferred by motor vehicles’. Graph Theory-based accessibility indices are calculated to quantify the spatial characteristics of the streets in the sample. Six different indices are used: (I) edge density, (II) edge sinuosity, (III) eta index, (IV) node density, (V) order of a node, and (VI) beta index. T-tests show that the index values for the ‘preferred by pedestrians’ and ‘preferred by motor vehicles’ segments are significantly different. The findings indicate that the spatial characteristics of the street network have a measurable effect on young people’s choice to walk or not. Policy implications are discussed. This study is funded by the Scientific and Technological Research Council of Turkey, Project No: 116K358.
Keywords: graph theory, walkability, accessibility, street network
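Several of these graph-theoretic indices are simple ratios over the street graph. The Python sketch below computes four of the six on a toy network (the example coordinates and study area are made-up values, not from the paper).

```python
import math

def street_indices(nodes, edges, area_km2):
    """nodes: {id: (x, y)} coordinates in km; edges: [(u, v), ...] street segments."""
    lengths = [math.dist(nodes[u], nodes[v]) for u, v in edges]
    total_len = sum(lengths)
    return {
        "beta": len(edges) / len(nodes),       # beta index: edges per node (connectivity)
        "eta": total_len / len(edges),         # eta index: mean edge length
        "node_density": len(nodes) / area_km2, # intersections per unit area
        "edge_density": total_len / area_km2,  # street length per unit area
    }
```

A higher beta index indicates a more interconnected (grid-like) street pattern, which is the kind of measurable difference the t-tests compare between pedestrian-preferred and vehicle-preferred segments.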
Procedia PDF Downloads 225
2591 Pathologies in the Left Atrium Reproduced Using a Low-Order Synergistic Numerical Model of the Cardiovascular System
Authors: Nicholas Pearce, Eun-jin Kim
Abstract:
Pathologies of the cardiovascular (CV) system remain a serious and deadly health problem for human society. Computational modelling provides a relatively accessible tool for diagnosis, treatment, and research into CV disorders. However, numerical models of the CV system have largely focused on the function of the ventricles, frequently overlooking the behaviour of the atria. Furthermore, in the study of the pressure-volume relationship of the heart, which is key to diagnosing cardiovascular pathologies, previous works often invoke the popular yet questionable time-varying elastance (TVE) method, which imposes the pressure-volume relationship instead of calculating it consistently. Despite the convenience of the TVE method, there have been various indications of its limitations and of the need to check its validity in different scenarios. A model of the combined left ventricle (LV) and left atrium (LA) is presented, which consistently considers various feedback mechanisms in the heart without having to use the TVE method. Specifically, a synergistic model of the left ventricle is extended and modified to include the function of the LA. The synergy of the original model is preserved by modelling the electro-mechanical and chemical functions of the micro-scale myofiber for the LA and integrating it with the micro-scale and macro-organ-scale heart dynamics of the left ventricle and CV circulation. The atrioventricular node function is included and forms the conduction pathway for electrical signals between the atria and ventricle. The model reproduces the essential features of LA behaviour, such as the two-phase pressure-volume relationship and the classic figure-of-eight pressure-volume loops. Using this model, disorders in the internal cardiac electrical signalling are investigated by recreating the mechano-electric feedback (MEF), which is impossible where the time-varying elastance method is used.
The effects of AV node block and slow conduction are then investigated in the presence of an atrial arrhythmia. It is found that electrical disorders and arrhythmia in the LA degrade the CV system by reducing the cardiac output, power, and heart rate.
Keywords: cardiovascular system, left atrium, numerical model, MEF
Procedia PDF Downloads 114
2590 Teaching Computer Programming to Diverse Students: A Comparative, Mixed-Methods, Classroom Research Study
Authors: Almudena Konrad, Tomás Galguera
Abstract:
Lack of motivation and interest is a serious obstacle to students’ learning computing skills, and a knowledge base on effective pedagogy and curricula for teaching computer programming is needed. This paper presents results from research evaluating a six-year project designed to teach complex concepts in computer programming collaboratively, while supporting students to continue developing their computational thinking and related coding skills individually. Utilizing a quasi-experimental, mixed-methods design, the pedagogical approaches and methods were assessed in two contrasting groups of students with different socioeconomic status, gender, and age composition. Analyses of quantitative data from Likert-scale surveys and an evaluation rubric, combined with qualitative data from reflective writing exercises and semi-structured interviews, yielded convincing evidence of the project’s success at both teaching and inspiring students.
Keywords: computational thinking, computing education, computer programming curriculum, logic, teaching methods
Procedia PDF Downloads 316
2589 A Review: Detection and Classification Defects on Banana and Apples by Computer Vision
Authors: Zahow Muoftah
Abstract:
Traditional manual visual grading of fruits has been one of the agricultural industry’s major challenges due to its laborious nature as well as inconsistency in the inspection and classification process. The main requirements for computer vision and visual processing are effective techniques for identifying defects and estimating defect areas. Automated defect detection using computer vision and machine learning has emerged as a promising area of research with a high and direct impact on the visual inspection domain. Grading, sorting, and disease detection are important factors in determining the quality of fruits after harvest. Many studies have used computer vision to evaluate the quality level of fruits during post-harvest, and many have been conducted to identify diseases and pests that affect the fruits of agricultural crops. However, most previous studies concentrated solely on the diagnosis of a lesion or disease. This article instead offers a comprehensive review of computer vision methods for detecting and classifying defects, pests, and diseases on banana and apple fruits. Finally, various pattern recognition techniques for detecting apple and banana defects are discussed.
Keywords: computer vision, banana, apple, detection, classification
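As a minimal illustration of the defect-area estimation this review discusses, the sketch below thresholds a grayscale image (represented as a plain 2-D list) and grades the fruit by the fraction of defective pixels. The threshold and tolerance values are made-up; real systems use color spaces, segmentation, and learned classifiers rather than a single global threshold.

```python
def defect_ratio(gray, threshold=80):
    """gray: 2-D list of 0-255 intensities; pixels darker than `threshold`
    are treated as defect (bruise/lesion) area."""
    defect = sum(1 for row in gray for px in row if px < threshold)
    total = sum(len(row) for row in gray)
    return defect / total

def grade(gray, max_defect=0.05):
    """Accept the fruit only if the defective area stays under the tolerance."""
    return "reject" if defect_ratio(gray) > max_defect else "accept"
```

The defect ratio is exactly the "defect area estimate" mentioned above, normalized by fruit area; classification then reduces to comparing it against a grade-dependent tolerance.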
Procedia PDF Downloads 106
2588 The Influence of E-Learning on Teachers and Students Educational Interactions in Tehran City
Authors: Hadi Manjiri, Mahdyeh Bakhshi, Ali Jafari, Maryam Salati
Abstract:
This study investigates the influence of e-learning on teacher-student instructional interactions through the mediating role of computer literacy among elementary school teachers in Tehran. The research method is a survey conducted among elementary school teachers in Tehran. A sample size of 338 was determined based on Morgan's table, and a stratified random sampling method was used to select 228 women and 110 men for the study. Bagherpour et al.'s computer literacy questionnaire, Elahi et al.'s e-learning questionnaire, and Lourdusamy and Khine's questionnaire on teacher-student instructional interactions were used to measure the variables. The data were analyzed using SPSS and LISREL software. It was found that e-learning affects teacher-student instructional interactions, mediated by teachers' computer literacy. In addition, the results suggest that e-learning predicts a 0.66 change in teacher-student instructional interactions, while computer literacy predicts a 0.56 change in instructional interactions between teachers and students.
Keywords: e-learning, instructional interactions, computer literacy, students
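The mediated effect reported here (e-learning → computer literacy → interactions) is commonly estimated as a product of regression path coefficients. The Python sketch below illustrates that product-of-coefficients idea on made-up data, using simple bivariate slopes rather than the structural equation model fitted in LISREL.

```python
def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)

def indirect_effect(x, mediator, y):
    """Product-of-coefficients mediation estimate: (x -> mediator) * (mediator -> y).
    A full mediation analysis would also control for x in the second regression."""
    return slope(x, mediator) * slope(mediator, y)
```

With standardized variables, the two factors of this product correspond to path coefficients like the 0.66 and 0.56 values quoted in the abstract.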
Procedia PDF Downloads 118
2587 Discerning Divergent Nodes in Social Networks
Authors: Mehran Asadi, Afrand Agah
Abstract:
In data mining, partitioning is used as a fundamental tool for classification: by recursively partitioning a data set, we probe its structure, which allows us to envision decision rules that can be applied in classification trees. In this research, we used an online social network dataset and all of its attributes (e.g., node features, labels, etc.) to determine what constitutes an above-average chance of a node being divergent. We used the R statistical computing language to conduct the analyses in this report; the data were obtained from the UC Irvine Machine Learning Repository. This research introduces the basic concepts of classification in online social networks, where the main objective is to categorize items and assign them to groups based on their properties and similarities. We discuss overfitting and describe different approaches for evaluating and comparing the performance of classification methods. Estimating densities is hard, especially in high dimensions with limited data; we do not know the densities, but we can estimate them using classical techniques. First, we calculated the correlation matrix of the dataset to see if any predictors are highly correlated with one another. By calculating the correlation coefficients for the predictor variables, we see that density is strongly correlated with transitivity. We then initialized a data frame to easily compare the quality of the resulting classifiers and fitted decision trees, using k-fold cross-validation to prune the tree.
A decision tree is a non-parametric classification method that uses a set of rules to predict that each observation belongs to the most commonly occurring class label of the training data. Our method aggregates many decision trees to create an optimized model that is not susceptible to overfitting. When using a decision tree, however, it is important to use cross-validation to prune the tree in order to narrow it down to the most important variables.
Keywords: online social networks, data mining, social cloud computing, interaction and collaboration
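The cross-validated tree-fitting workflow described above can be illustrated with a toy Python version: a one-feature decision stump (the simplest possible decision tree) evaluated by k-fold cross-validation. This stands in for the R workflow the authors used and is not their code; the feature values and fold count are made-up.

```python
import random

def stump_fit(X, y):
    """One-feature threshold classifier: pick the split minimising training error."""
    best = None
    for t in sorted(set(X)):
        for label in (0, 1):
            pred = [label if x <= t else 1 - label for x in X]
            err = sum(p != yi for p, yi in zip(pred, y))
            if best is None or err < best[0]:
                best = (err, t, label)
    _, t, label = best
    return lambda x: label if x <= t else 1 - label

def k_fold_accuracy(X, y, k=5, seed=0):
    """Mean held-out accuracy over k folds."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    accs = []
    for fold in folds:
        train = [i for i in idx if i not in fold]
        model = stump_fit([X[i] for i in train], [y[i] for i in train])
        accs.append(sum(model(X[i]) == y[i] for i in fold) / len(fold))
    return sum(accs) / k
```

Comparing this held-out accuracy across candidate tree sizes is exactly how cross-validation guides pruning: deeper trees that only fit training noise score worse on the held-out folds.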
Procedia PDF Downloads 157
2586 Patterns of Occurrence of Bovine Haemoparasitic Diseases and Its Co-Incidence with Viral Epidemics of Foot and Mouth Disease and Lumpy Skin Disease
Authors: Amir Hamed Abd-Elrahman, Mohamed Bessat
Abstract:
450 fattening cattle and buffaloes aged from 6 to 30 months were examined clinically to determine the patterns of occurrence of haemoparasitic diseases and the efficacy of different anti-theilerial drugs. A further 420 animals were examined clinically to determine the relation between the FMD and LSD outbreaks in Egypt in 2012-2013 and haemoprotozoal diseases. The clinical pictures of haemoprotozoal diseases are variable, from severe to mild, depending on the endemic situation, which is governed by frequent previous exposure and tick infestation. B. bigemina is the most common haemoprotozoal disease in the area of study, and the infection rates, in descending order, for B. bigemina, A. marginale and T. annulata were 20%, 9.7% and 6.6%, respectively. B. bigemina and T. annulata showed a higher incidence in cattle than in buffaloes, while A. marginale showed little difference between cattle and buffaloes, at 10% and 9.2%, respectively. B. bigemina and T. annulata showed a higher incidence in crossbred cattle than in native baladi cattle, while A. marginale showed a higher incidence in native baladi cattle than in crossbred cattle. The maximal infection rates were recorded during the summer months. The infection rates of B. bigemina and A. marginale were higher among young animals over 6 months and declined above 2 years of age, while for T. annulata the infection rates were lower among young animals and increased above 2 years of age. The case fatality of T. annulata was higher than that of A. marginale and B. bigemina. The efficacy of different anti-theilerial drugs was studied: the cure rates of the chloroquine and Butalex groups were 60%, with disappearance of schizonts in lymph node smears after 9 and 5 days, respectively, while the cure rate of the Oxytetracycline Dihydrate (Alamycine) group was 20%, with disappearance of schizonts in lymph node smears after 14 days.
FMD and LSD infection enhanced the occurrence of bovine haemoprotozoal diseases.
Keywords: Babesia bigemina, Anaplasma marginale, Theileria annulata, FMD, LSD, ephemeral fever
Procedia PDF Downloads 328
2585 Isolated and Combined Effects of Multimedia Computer Assisted Coaching and Traditional Coaching on Motor Ability Component and Physiological Variables among Sports School Basketball Players
Authors: Biju Lukose
Abstract:
The objective of the study was to identify the isolated and combined effects of multimedia computer-assisted coaching and traditional coaching on selected motor ability components and physiological variables among sports school basketball players. Forty male basketball players aged between 14 and 18 years were selected randomly and divided into four groups: three experimental and one control. Isolated multimedia computer-assisted coaching, isolated traditional coaching, and combined coaching (multimedia computer-assisted coaching and traditional coaching) were the three experimental groups. All three experimental groups were given coaching for 24 weeks, and the control group was not allowed to participate in any coaching programme. The subjects were tested on dependent variables such as speed and cardiovascular endurance at the beginning (pre-test), in the middle at 12 weeks (mid-test), and after 24 weeks of coaching (post-test). The data were collected two days before and after the coaching schedule, and the mid-test data after 12 weeks of the coaching schedule. The data were analysed by applying ANCOVA and Scheffé's post hoc test. The results showed significant changes in the dependent variables of speed and cardiovascular endurance, that combined coaching (multimedia computer-assisted coaching and traditional coaching) is superior to the traditional coaching and multimedia computer-assisted coaching groups, and that there was no significant change in speed in the case of the isolated multimedia computer-assisted coaching group.
Keywords: computer, computer-assisted coaching, multimedia coaching, traditional coaching
Procedia PDF Downloads 458
2584 Mondoc: Informal Lightweight Ontology for Faceted Semantic Classification of Hypernymy
Authors: M. Regina Carreira-Lopez
Abstract:
Lightweight ontologies seek to make concrete the union relationships between a parent node and a secondary node, also called a 'child node'. This logical relation (L) can be formally defined as a triple ontological relation (LO), LO = ⟨LN, LE, LC⟩, where LN represents a finite set of nodes (N); LE is a set of entities (E), each of which represents a relationship between nodes, forming a rooted tree ⟨LN, LE⟩; and LC is a finite set of concepts (C), encoded in a formal language (FL). Mondoc enables more refined searches on semantic and classified facets for retrieving specialized knowledge about Atlantic migrations, from the Declaration of Independence of the United States of America (1776) to the end of the Spanish Civil War (1939). The model aims to increase documentary relevance by applying an inverse frequency of co-occurrent hypernymy phenomena to a concrete dataset of textual corpora, with the RMySQL package. Mondoc profiles archival utilities implementing SQL programming code and allows data export to XML schemas, enabling semantic and faceted analysis of speech by analyzing keywords in context (KWIC). The methodology applies random and unrestricted sampling techniques with RMySQL to verify the resonance phenomena of inverse documentary relevance between the number of co-occurrences of the same term (t) in more than two documents of a set of texts (D). Secondly, the research also evidences that the co-associations between (t) and its corresponding synonyms and antonyms (synsets) are also inverse. The results from grouping facets or polysemic words with synsets in more than two textual corpora within their syntagmatic context (nouns, verbs, adjectives, etc.) state how to proceed with semantic indexing of hypernymy phenomena for subject-heading lists and for authority lists for documentary and archival purposes.
Mondoc contributes to the development of web directories and seems to achieve a proper and more selective search of e-documents (classification ontology). It can also foster the production of online catalogs for semantic authorities, or concepts, through XML schemas, because its applications could be used for implementing data models, by a prior adaptation of the base ontology to structured meta-languages such as OWL or RDF (descriptive ontology). Mondoc serves the classification of concepts and applies a semantic indexing approach based on facets. It enables information retrieval, as well as quantitative and qualitative data interpretation. The model reproduces a tuple ⟨LN, LE, LT, LCFL, BKF⟩, where LN is a set of entities that connect with other nodes to form a rooted tree ⟨LN, LE⟩, LT specifies a set of terms, and LCFL acts as a finite set of concepts, encoded in a formal language, L. Mondoc only resolves partial problems of linguistic ambiguity (in the case of synonymy and antonymy); neither the pragmatic dimension of natural language nor the cognitive perspective is addressed. To achieve this goal, forthcoming programming developments should target meta-languages with structured documents in XML.
Keywords: hypernymy, information retrieval, lightweight ontology, resonance
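The keyword-in-context (KWIC) analysis mentioned above can be sketched in a few lines of Python. The window width and whitespace tokenization are simplifying assumptions; this is not Mondoc's SQL/XML pipeline.

```python
import re

def kwic(text, keyword, width=3):
    """Return (left context, keyword, right context) for each occurrence
    of `keyword`, with up to `width` tokens of context on each side."""
    tokens = re.findall(r"\w+", text.lower())
    hits = []
    for i, tok in enumerate(tokens):
        if tok == keyword.lower():
            hits.append((" ".join(tokens[max(0, i - width):i]),
                         tok,
                         " ".join(tokens[i + 1:i + 1 + width])))
    return hits
```

Grouping such concordance lines by the facet (noun, verb, adjective, synset) of their context words is one simple way to realize the faceted semantic indexing the abstract describes.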
Procedia PDF Downloads 125
2583 Design and Simulation Interface Circuit for Piezoresistive Accelerometers with Offset Cancellation Ability
Authors: Mohsen Bagheri, Ahmad Afifi
Abstract:
This paper presents a new method for reading out piezoresistive accelerometer sensors. The circuit is based on an instrumentation amplifier and is useful for reducing offset in the Wheatstone bridge. The obtained gain is 645, with 1 μV/°C equivalent drift and 1.58 mW power consumption. A Schmitt trigger and a multiplexer circuit control the output node, and a high-speed counter is designed in this work. The proposed circuit is designed and simulated in 0.18 μm CMOS technology with a 1.8 V power supply.
Keywords: piezoresistive accelerometer, zero offset, Schmitt trigger, bidirectional reversible counter
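The signal chain can be sanity-checked numerically: a quarter-bridge Wheatstone output fed into an instrumentation amplifier with the gain of 645 quoted in the abstract. The resistor values, excitation voltage, and offset model in this Python sketch are illustrative assumptions, not values from the paper.

```python
def bridge_output(v_ex, r, delta_r):
    """Quarter-bridge Wheatstone output: one piezoresistor changes by delta_r
    while the reference arm stays balanced at v_ex / 2."""
    v_plus = v_ex * (r + delta_r) / (2 * r + delta_r)  # divider with sensing resistor
    v_minus = v_ex / 2                                  # fixed reference divider
    return v_plus - v_minus

def in_amp_out(v_diff, gain=645.0, v_offset=0.0):
    """Instrumentation amplifier; v_offset models the bridge offset being cancelled."""
    return gain * (v_diff - v_offset)
```

A balanced bridge (delta_r = 0) produces zero differential voltage, so any residual output is offset, which is exactly what the offset-cancellation path removes before the gain of 645 is applied.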
Procedia PDF Downloads 310