Search results for: Cloud computing
751 Patient-Friendly Hand Gesture Recognition Using AI
Authors: K. Prabhu, K. Dinesh, M. Ranjani, M. Suhitha
Abstract:
During the difficult times of COVID-19, hospitalized patients often found it difficult to convey what they wanted or needed to an attendee, and sometimes no attendee was present. In such cases, patients can use simple hand gestures to control electrical appliances (e.g., switching a zero-watt bulb) and three further gestures for voice-note intimation. In this AI-based hand gesture recognition project, a NodeMCU is used for the control action of the relay; it is connected to Firebase for storing values in the cloud and is interfaced with the Python code via a Raspberry Pi. For three hand gestures, a voice clip is added for intimation to the attendee, using Google's text-to-speech and the built-in audio playback of the Raspberry Pi 4. All five gestures are detected when shown by hand to a webcam placed for gesture detection. A personal computer is used for displaying the gestures and for running the code through the Raspberry Pi Imager.
Keywords: NodeMCU, AI technology, gesture, patient
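A minimal sketch of the control flow described above, mapping the five recognized gestures to relay commands or voice-note playback. The gesture labels, clip names, and the stub functions standing in for the Firebase write and audio playback are assumptions for illustration, not the authors' actual code.

```python
# Hypothetical gesture-to-action dispatch, assuming a classifier that
# returns one of five gesture labels per webcam frame.

GESTURE_ACTIONS = {
    "bulb_on":  ("relay", 1),             # appliance control via NodeMCU relay
    "bulb_off": ("relay", 0),
    "water":    ("voice", "water.mp3"),   # voice-note intimation clips
    "food":     ("voice", "food.mp3"),
    "help":     ("voice", "help.mp3"),
}

def push_relay_state(state: int) -> None:
    print(f"[cloud] relay state -> {state}")   # stand-in for a Firebase write

def play_voice_note(clip: str) -> None:
    print(f"[audio] playing {clip}")           # stand-in for TTS/audio playback

def handle_gesture(label: str) -> None:
    kind, payload = GESTURE_ACTIONS[label]
    if kind == "relay":
        push_relay_state(payload)
    else:
        play_voice_note(payload)

handle_gesture("water")
```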
Procedia PDF Downloads 166
750 Local Homology Modules
Authors: Fatemeh Mohammadi Aghjeh Mashhad
Abstract:
In this paper, we give several ways of computing generalized local homology modules by using Gorenstein flat resolutions. We also find some bounds for the vanishing of generalized local homology modules.
Keywords: a-adic completion functor, generalized local homology modules, Gorenstein flat modules
Procedia PDF Downloads 419
749 Heat Transfer and Diffusion Modelling
Authors: R. Whalley
Abstract:
The heat transfer modelling for a diffusion process will be considered. Difficulties in computing the time-distance dynamics of the representation will be addressed, and incomplete and irrational Laplace functions will be identified as the computational issue. Alternative approaches to the response evaluation process will be provided, and an illustrative application problem will be presented, with graphical results confirming the theoretical procedures employed.
Keywords: heat, transfer, diffusion, modelling, computation
Procedia PDF Downloads 553
748 Innovations and Challenges: Multimodal Learning in Cybersecurity
Authors: Tarek Saadawi, Rosario Gennaro, Jonathan Akeley
Abstract:
There is rapidly growing demand for professionals to fill positions in Cybersecurity, recognized as a national priority by both government agencies and the private sector. Cybersecurity is a very wide technical area encompassing all measures that can be taken in an electronic system to prevent criminal or unauthorized use of data and resources; it requires defending computers, servers, networks, and their users from any kind of malicious attack. The need to address this challenge has been recognized globally but is particularly acute in the New York metropolitan area, home to some of the largest financial institutions in the world, which are prime targets of cyberattacks. In New York State alone, there are currently around 57,000 jobs in the Cybersecurity industry, with more than 23,000 unfilled positions. The Cybersecurity Program at City College is a collaboration between the Departments of Computer Science and Electrical Engineering. In Fall 2020, The City College of New York matriculated its first students in the Cybersecurity Master of Science program. The program was designed to fill gaps in the previous offerings and evolved out of an established partnership with Facebook on Cybersecurity education. City College has designed a program where courses, curricula, syllabi, materials, labs, etc., are developed in cooperation and coordination with industry whenever possible, ensuring that students graduating from the program have the necessary background to segue seamlessly into industry jobs. The program has created multiple pathways for prospective students to obtain the necessary prerequisites to apply, in order to build a more diverse student population; it can also be pursued on a part-time basis, which makes it available to working professionals. Since City College's Cybersecurity M.S. program was established to equip students with the advanced technical skills needed to thrive in a high-demand, rapidly evolving field, it incorporates a range of pedagogical formats. From its outset, the program has sought to provide the theoretical foundations necessary for meaningful work in the field along with labs and applied learning projects aligned with the skillsets required by industry. These efforts have involved collaboration with outside organizations and with visiting professors designing new courses on topics such as Adversarial AI, Data Privacy, Secure Cloud Computing, and blockchain. Although the program was initially designed with a single asynchronous course in the curriculum, with the rest of the classes offered in person, the advent of the COVID-19 pandemic necessitated a move to fully online learning. The shift to online learning has provided lessons for future development, offering examples of some inherent advantages of the medium in addition to its drawbacks. This talk will address the structure of the newly implemented Cybersecurity Master's Program and discuss the innovations, challenges, and possible future directions.
Keywords: cybersecurity, New York, City College, graduate degree, master of science
Procedia PDF Downloads 147
747 Video Analytics on Pedagogy Using Big Data
Authors: Jamuna Loganath
Abstract:
Education is the key to the development of any individual's personality; today's students will be tomorrow's citizens of the global society, and a student's education is the edifice on which his or her future is built. Schools should therefore provide all-round development of students so as to foster a healthy society. The behavior and attitude of students in school play an essential role in the success of the education process. Frequent reports of misbehavior such as clowning, harassing classmates, and verbal insults are becoming common in schools today; if this issue is left unattended, it may foster negative attitudes and increase delinquent behavior. The need of the hour is therefore to monitor students' behavior in school, give necessary feedback, and mentor them to develop a positive attitude and become successful adults. Nevertheless, measuring students' behavior and attitude is extremely challenging: no present technology has proven effective in this measurement process, because the actions, reactions, interactions, and responses of students are rarely usable as data due to their complexity. The purpose of this proposal is to recommend an effective supervising system, after carrying out a feasibility study, by measuring student behavior. This can be achieved by equipping schools with CCTV cameras that capture the facial expressions and interactions of students inside and outside the classroom. The real-time raw video captured by the CCTV cameras is uploaded to the cloud over a network, and the video feeds are distributed across nodes on the same rack or on different racks of a Hadoop HDFS cluster. The video feeds are converted into small frames and analyzed using pattern recognition algorithms and the MapReduce paradigm; the frames are then compared against a benchmarking database of good behavior. When misbehavior is detected, an alert message is sent to the counseling department, which helps staff mentor the students and improves the effectiveness of the education process. As video feeds come from multiple geographical areas (schools in different parts of the world), big data supports real-time analysis, revealing patterns, trends, and associations, especially those relating to human behavior and interactions; it also handles data that cannot be analyzed by traditional software such as RDBMS and OODBMS, and it has proven successful in handling human reactions with ease. Big data could therefore play a vital role in handling this issue, and the effectiveness of the education process can be enhanced with the help of video analytics using the latest big data technology.
Keywords: big data, cloud, CCTV, education process
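A minimal sketch, in plain Python, of the map/reduce flow described above: map each frame to a (school, flag) pair by comparing the frame's feature vector against the good-behavior benchmark, then reduce per school. The feature vectors, benchmark, and threshold are invented placeholders, not the proposal's actual pattern recognition pipeline.

```python
# Illustrative map/reduce over video-frame feature vectors (not real Hadoop code):
# map: frame -> (school_id, 1 if frame deviates from the "good behavior" benchmark)
# reduce: school_id -> count of flagged frames
from collections import defaultdict
import math

BENCHMARK = [0.2, 0.7, 0.1]          # placeholder feature vector for "good behavior"
THRESHOLD = 0.5                      # placeholder deviation threshold

def map_frame(school_id, features):
    deviation = math.dist(features, BENCHMARK)   # Euclidean distance to benchmark
    return (school_id, 1 if deviation > THRESHOLD else 0)

def reduce_flags(mapped):
    counts = defaultdict(int)
    for school_id, flag in mapped:
        counts[school_id] += flag
    return dict(counts)

frames = [("school_A", [0.2, 0.7, 0.1]), ("school_A", [0.9, 0.1, 0.9]),
          ("school_B", [0.3, 0.6, 0.2])]
print(reduce_flags(map_frame(s, f) for s, f in frames))  # {'school_A': 1, 'school_B': 0}
```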
Procedia PDF Downloads 240
746 Study on the Geometric Similarity in Computational Fluid Dynamics Calculation and the Requirement of Surface Mesh Quality
Authors: Qian Yi Ooi
Abstract:
At present, airfoil parameters are still designed and optimized according to the scale of conventional aircraft, and there are still some slight deviations in terms of scale differences. However, insufficient parameters or poor surface mesh quality are likely to occur if these small deviations are carried into a future civil aircraft of a size quite different from conventional aircraft, such as a blended-wing-body (BWB) aircraft with future potential, resulting in large deviations in geometric similarity in computational fluid dynamics (CFD) simulations. To avoid this situation, a study of the CFD calculation of the geometric similarity of airfoil parameters and the quality of the surface mesh is conducted to establish how different parameterization methods behave at different airfoil scales. The research objects are three airfoil scales, including the wing root and wingtip of a conventional civil aircraft and the wing root of the giant hybrid wing, parameterized by three methods in order to compare the calculation differences between different sizes of airfoils. In this study, the constants include the NACA 0012 profile, a Reynolds number of 10 million, an angle of attack of zero, a C-grid for meshing, and the k-epsilon (k-ε) turbulence model. The experimental variables are three airfoil parameterization methods: the point cloud method, the B-spline curve method, and the class function/shape function transformation (CST) method. The airfoil dimensions are set to 3.98 meters, 17.67 meters, and 48 meters, respectively. In addition, this study also uses different numbers of edge mesh divisions and the same bias factor in the CFD simulation. The results show that as airfoil scale changes, different parameterization methods, numbers of control points, and numbers of mesh divisions should be used to preserve the accuracy of the wing's aerodynamic performance. When the airfoil scale increases, the most basic point cloud parameterization method requires more and larger data to support the accuracy of the airfoil's aerodynamic performance, facing the severe test of insufficient computer capacity. When using the B-spline curve method, the number of control points and mesh divisions must be set appropriately to obtain higher accuracy; this quantitative balance cannot be defined directly but must be found iteratively by adding and subtracting. Lastly, when using the CST method, it is found that a limited number of control points is enough to accurately parameterize the larger-sized wing, and a higher degree of accuracy and stability can be obtained even on a lower-performance computer.
Keywords: airfoil, computational fluid dynamics, geometric similarity, surface mesh quality
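For reference, a minimal sketch of the CST parameterization mentioned above, using the standard class function ψ^0.5·(1 − ψ) for a round-nose, sharp-trailing-edge airfoil and a Bernstein-polynomial shape function. The coefficient values below are placeholders, not fitted NACA 0012 values.

```python
# Class/Shape function Transformation (CST): y(psi) = C(psi) * S(psi), where
# C(psi) = psi^N1 * (1 - psi)^N2 (N1=0.5, N2=1.0 for airfoils) and S(psi) is a
# Bernstein-polynomial weighted sum of the control coefficients.
from math import comb

def cst_surface(psi, coeffs, n1=0.5, n2=1.0):
    n = len(coeffs) - 1
    class_fn = psi**n1 * (1.0 - psi)**n2
    shape_fn = sum(a * comb(n, i) * psi**i * (1.0 - psi)**(n - i)
                   for i, a in enumerate(coeffs))
    return class_fn * shape_fn

upper = [0.17, 0.16, 0.15, 0.14]   # placeholder CST coefficients (not fitted values)
points = [(x / 50.0, cst_surface(x / 50.0, upper)) for x in range(51)]
print(points[10])  # (x/c, y/c) sample on the upper surface
```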
Procedia PDF Downloads 222
745 Factors Affecting Air Surface Temperature Variations in the Philippines
Authors: John Christian Lequiron, Gerry Bagtasa, Olivia Cabrera, Leoncio Amadore, Tolentino Moya
Abstract:
Changes in air surface temperature play an important role in the Philippines' economy, industry, health, and food production. While the increase in global mean temperature over recent decades has prompted a number of climate change and variability studies in the Philippines, most studies still focus on rainfall and tropical cyclones. This study aims to investigate the trend and variability of observed air surface temperature and determine its major influencing factor(s) in the Philippines. A non-parametric Mann-Kendall trend test was applied to the monthly mean temperatures of 17 synoptic stations covering the 56 years from 1960 to 2015, and a mean change of 0.58 °C, or a positive trend of 0.0105 °C/year (p < 0.05), was found. In addition, wavelet decomposition used to determine the frequency of temperature variability shows 12-month, 30-80-month, and longer than 120-month cycles, indicating strong annual variations, interannual variations that coincide with ENSO events, and interdecadal variations attributed to the PDO and CO2 concentrations. Air surface temperature was also correlated with the smoothed sunspot number and galactic cosmic rays; the results show little to no effect. The influence of the ENSO teleconnection on temperature, wind pattern, cloud cover, and outgoing longwave radiation in different ENSO phases had significant effects on regional temperature variability. In particular, an anomalous anticyclonic (cyclonic) flow east of the Philippines during the peak and decay phases of El Niño (La Niña) events leads to the advection of a warm southeasterly (cold northeasterly) air mass over the country. Furthermore, an apparent increasing trend in cloud cover is observed over the West Philippine Sea, including portions of the Philippines, and this is believed to lessen the effect of the increasing air surface temperature. However, relative humidity was also found to be increasing, especially over the central part of the country, which results in a strongly positive trend in the heat index, exacerbating human discomfort. Finally, an assessment of gridded temperature datasets was done to examine the viability of using three high-resolution datasets in future climate analysis and model calibration and verification. Several error statistics (i.e., Pearson correlation, bias, MAE, and RMSE) were used for this validation. Results show that the gridded temperature datasets generally follow the observed surface temperature changes and anomalies, and that they are more representative of regional temperature rather than a substitute for station-observed air temperature.
Keywords: air surface temperature, carbon dioxide, ENSO, galactic cosmic rays, smoothed sunspot number
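A minimal sketch of the Mann-Kendall test used above, in pure Python without the tie correction: the S statistic counts concordant minus discordant pairs, and significance comes from the normal approximation. The temperature series below is synthetic (seeded around the paper's 0.0105 °C/year trend), not the station records.

```python
# Minimal Mann-Kendall trend test (no tie correction), on synthetic data.
import math, random

def mann_kendall(series):
    n = len(series)
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - math.copysign(1, s)) / math.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided p-value
    return s, z, p

random.seed(1)
temps = [26.0 + 0.0105 * yr + random.gauss(0, 0.3) for yr in range(56)]  # synthetic
print(mann_kendall(temps))  # positive S and small p indicate an upward trend
```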
Procedia PDF Downloads 323
744 Design and Development of Automatic Onion Harvester
Authors: P. Revathi, T. Mrunalini, K. Padma Priya, P. Ramya, R. Saranya
Abstract:
Keywords: onion harvesting, automatic plucking, camera, raspberry pi
Procedia PDF Downloads 198
743 Economized Sensor Data Processing with Vehicle Platooning
Authors: Henry Hexmoor, Kailash Yelasani
Abstract:
We present vehicular platooning as a special case of a crowd-sensing framework, in which sensory information shared among a crowd is used for its collective benefit. After offering an abstract policy that governs processes involving a vehicular platoon, we review several common scenarios and components surrounding vehicular platooning. We then present a simulated prototype that illustrates the efficiency of road usage and vehicle travel time derived from platooning. We argue that one of the paramount benefits of platooning, overlooked elsewhere, is the substantial computational savings (i.e., economizing benefits) in the acquisition and processing of sensory data among vehicles sharing the road: the most capable vehicle can share data gathered from its sensors with nearby vehicles grouped into a platoon.
Keywords: cloud network, collaboration, internet of things, social network
Procedia PDF Downloads 194
742 Detailed Quantum Circuit Design and Evaluation of Grover’s Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language
Authors: Wenjun Hou, Marek Perkowski
Abstract:
The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of least total weight in a given graph with N nodes. All variations on this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, to the best of our knowledge no quantum circuit implementation of these algorithms has been created. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measure based on the number of oracle iterations, but to evaluate the real circuit and time costs on a quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover's algorithm to this problem, a quantum oracle was designed that evaluates the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating the Grover algorithm with an oracle that finds a successively lower cost each time transforms the decision problem into an optimization problem, finding the minimum cost of Hamiltonian cycles. N log₂ K qubits are put into an equal superposition by applying the Hadamard gate to each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge-weight adder, a node index calculator, a uniqueness checker, and a comparator, all created using only quantum Toffoli gates, including their special forms, the Feynman (CNOT) and Pauli X gates. The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle uses the calculated nodes from the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the oracle an optimal number of times, a correct answer can be generated with very high probability. The oracle of the Grover algorithm is modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified on several datasets to generate correct outputs.
Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover's algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language
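To make the register size and iteration count concrete, a small worked computation under the paper's N log₂ K encoding: the standard Grover iteration count for M marked states in a space of 2ⁿ is roughly ⌊(π/4)·√(2ⁿ/M)⌋. The values of N, K, and M below are illustrative assumptions, not figures from the paper.

```python
# Worked sizing example for Grover search over the N*log2(K) qubit encoding.
import math

N, K = 6, 4                       # illustrative: 6 nodes, degree bound 4
n_qubits = N * int(math.log2(K))  # 6 * 2 = 12 qubits
search_space = 2 ** n_qubits      # 4096 candidate edge selections
M = 8                             # illustrative count of marked (valid, cheaper) states

iterations = math.floor((math.pi / 4) * math.sqrt(search_space / M))
print(n_qubits, search_space, iterations)  # 12 4096 17
```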
Procedia PDF Downloads 190
741 Analyzing the Construction of Collective Memories by History Movies/TV Programs: Case Study of Masters in the Forbidden City
Authors: Lulu Wang, Yongjun Xu, Xiaoyang Qiao
Abstract:
The Forbidden City is well known for being full of Chinese cultural and historical relics. However, Masters in the Forbidden City, a documentary film, does not just dwell on stories of the past; instead, it focuses on ordinary people, the restorers of the relics and antiquities, which has caught the attention of Chinese audiences. This popular documentary suggests a new way of presenting relics, antiquities, and paintings with a modern humanistic character through films and TV programs. Of course, this cannot be a simple explanation like that of a museum tour guide; it should be a perfect combination of scenes, heritage objects, stories, storytellers, and background music. The aim is to dig up the humanity behind the heritage and then create a virtual scene in which the audience experiences emotional resonance with that humanity. Two questions arise: first, why, compared with entertainment shows, do people prefer to watch the seemingly tedious restoration work; second, what is the interaction between such history documentary films, the heritage objects, the audiences, and collective memory? This paper mainly used the methods of text analysis and data analysis. Audience comment texts were collected from popular video sites, and a word cloud chart was produced showing which words people prefer to use when commenting on the film. The usage rate of all comment words was then calculated, and a radar chart was produced to show the ranked results. Eventually, each comment was given an emotional value classification according to its tone and content. Based on these analysis results, an interaction model among the audience, history films/TV programs, and collective memory is summarized. According to the word cloud chart, viewers prefer words such as moving, history, love, family, celebrity, and tone. From these emotional words, we can see that Chinese audiences felt proud and shared a sense of collective identity, leaving comments such as 'To our great motherland!' and 'Chinese traditional culture is really profound!'. It is found that, in the construction of collective memory symbology, the films formed an imaginary system by organizing a 'personalized audience': the audience is not just a recipient of information but a participant in the documentary films and a co-constructor of collective memory. At the same time, the traditional background music, the spectacular scenes, and the tone of the storytellers/hosts are also important, so it is suggested that museums cooperate with producers of movies and TV programs to create vivid scenes for people. This may be a more artistic way for heritage to be opened to the world.
Keywords: audience, heritages, history movies, TV programs
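A minimal sketch of the comment-word usage-rate computation described above, using Python's Counter on a toy comment list. The comments are invented stand-ins, not the collected data.

```python
# Toy version of the comment-text analysis: tokenize comments, count word usage,
# and report each word's usage rate (its share of all tokens).
from collections import Counter

comments = [
    "so moving, the history behind every relic",   # invented examples
    "moving story of family and love",
    "history and love of traditional culture",
]
tokens = [w.strip(",.!?").lower() for c in comments for w in c.split()]
counts = Counter(tokens)
total = sum(counts.values())
for word, n in counts.most_common(5):
    print(f"{word}: {n} uses, rate {n / total:.2%}")
```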
Procedia PDF Downloads 161
740 Geoinformation Technology of Agricultural Monitoring Using Multi-Temporal Satellite Imagery
Authors: Olena Kavats, Dmitry Khramov, Kateryna Sergieieva, Vladimir Vasyliev, Iurii Kavats
Abstract:
Geoinformation technologies of space agromonitoring are a means of supporting operative decision making in the tasks of managing the agricultural sector of the economy. Existing technologies use satellite images in the optical range of the electromagnetic spectrum, but time series of optical images often contain gaps due to the presence of clouds and haze. A geoinformation technology has been created that allows gaps in time series of optical images (Sentinel-2, Landsat-8, PROBA-V, MODIS) to be filled with radar survey data (Sentinel-1) and uses information about the agrometeorological conditions of the growing season for individual monitoring years. The technology performs crop classification and mapping for the spring-summer (winter and spring crops) and autumn-winter (winter crops) periods of vegetation, monitors the dynamics of seasonal changes in crop state, and forecasts crop yield. Crop classification is based on supervised classification algorithms and takes into account the peculiarities of crop growth at different vegetation stages (dates of sowing, emergence, active vegetation, and harvesting) and agricultural land state characteristics (row spacing, seedling density, etc.). A catalog of samples of the main agricultural crops (Ukraine) was created, and crop spectral signatures were calculated with the preliminary removal of row spacing, cloud cover, and cloud shadows in order to construct time series of crop growth characteristics. The obtained data are used to track grain crop growth and to detect in good time deviations of growth trends from reference samples of a given crop for a selected date. Statistical models of crop yield forecasting were created in the form of linear and nonlinear relationships between crop yield indicators and crop state characteristics (temperature, precipitation, vegetation indices, etc.). Predicted values of grain crop yield are estimated with an accuracy of up to 95%. The developed technology was used for monitoring agricultural areas in a number of regions of Great Britain and Ukraine using the EOS Crop Monitoring Platform (https://crop-monitoring.eos.com). The obtained results allow us to conclude that the joint use of Sentinel-1 and Sentinel-2 images improves the separation of winter crops (rapeseed, wheat, barley) in the early stages of vegetation (October-December) and successfully separates the soybean, corn, and sunflower sowing areas, which are quite similar in their spectral characteristics.
Keywords: geoinformation technology, crop classification, crop yield prediction, agricultural monitoring, EOS Crop Monitoring Platform
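A minimal sketch of one way such optical-gap filling could work: fit a linear mapping from radar backscatter to a vegetation index on cloud-free dates, then predict the index on cloudy dates. The numbers are synthetic and the simple linear model is an assumption for illustration, not the authors' published method.

```python
# Synthetic illustration of filling cloud gaps in an NDVI series using radar
# backscatter: least-squares fit NDVI ~ a + b*sigma0 on clear dates, then predict.
clear = [(-14.0, 0.35), (-12.5, 0.48), (-11.0, 0.62), (-10.2, 0.70)]  # (sigma0 dB, NDVI)
cloudy_sigma0 = [-13.2, -10.8]                                        # NDVI missing here

n = len(clear)
sx = sum(x for x, _ in clear); sy = sum(y for _, y in clear)
sxx = sum(x * x for x, _ in clear); sxy = sum(x * y for x, y in clear)
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # least-squares slope
a = (sy - b * sx) / n                           # least-squares intercept

for s in cloudy_sigma0:
    print(f"sigma0={s} dB -> predicted NDVI {a + b * s:.2f}")
```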
Procedia PDF Downloads 456
739 Methods for Solving Identification Problems
Authors: Fadi Awawdeh
Abstract:
In this work, we highlight the key concepts in using semigroup theory as a methodology for constructing efficient formulas to solve inverse problems. The proposed method depends on some results concerning integral equations. The experimental results show the potential and limitations of the method and imply directions for future work.
Keywords: identification problems, semigroup theory, methods for inverse problems, scientific computing
Procedia PDF Downloads 481
738 Effect of Viscosity in Void Structure with Interacting Variable Charge Dust Grains
Authors: Nebbat El Amine
Abstract:
A void is a dust-free region inside the dust cloud in a plasma. It is found that dust grain charge variation leads to the extension of the void. Moreover, for bigger dust grains, it is seen that the wave-like structure recedes when charge variation is dealt with. Furthermore, as the grain-grain distance is inversely proportional to density, the grain-grain interaction becomes more important for a denser dust population and is to be included in the momentum equation. In the results indicated above, the plasma is considered non-viscous, but in fact this is not always true: some authors have measured the viscosity of this background experimentally and found that the viscosity of a dusty plasma increases with background gas pressure. In this paper, we take into account the viscosity of the fluid and compare the result with that found in recent work.
Keywords: voids, dusty plasmas, variable charge, viscosity
Procedia PDF Downloads 89
737 Estimation of Global and Diffuse Solar Radiation Over Two Cities of Sindh, Pakistan
Authors: M. A. Ahmed, Sidra A. Shaikh, M. W. Akhtar
Abstract:
Global and diffuse solar radiation on a horizontal surface over two cities of Sindh, namely Jacobabad and Rohri, were estimated using sunshine hour data of the area to assess the feasibility of solar energy utilization in Sindh province. The results obtained show a high variation in the direct and diffuse components of solar radiation between summer and winter months (80% direct and 20% diffuse). The contribution of diffuse solar radiation is low even in the monsoon months, i.e., July and August, and the appearance of cloud is rare even then. The estimated values indicate that this part of Sindh has high solar potential and solar panels can be used for power generation. Solar energy can be utilized throughout the year in this part of Sindh, Pakistan.
Keywords: solar potential over Sindh, global and diffuse solar radiation, radiation over two cities of Sindh, environmental engineering
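A common way to estimate global radiation from sunshine hours, consistent with the approach described, is the Angström-Prescott relation G = G₀(a + b·S/S₀). The sketch below assumes this model; the coefficients a = 0.25, b = 0.50 and the station values are textbook-style placeholders, not the paper's fitted numbers.

```python
# Angstrom-Prescott estimate of monthly-mean global radiation from sunshine hours:
# G = G0 * (a + b * S/S0), where G0 is extraterrestrial radiation, S the measured
# sunshine hours, and S0 the maximum possible sunshine hours.
def global_radiation(g0_mj_m2, s_hours, s0_hours, a=0.25, b=0.50):
    return g0_mj_m2 * (a + b * s_hours / s0_hours)

# Placeholder July values for a station at roughly Jacobabad's latitude:
g0, s, s0 = 40.0, 9.5, 13.5       # MJ/m^2/day, observed and maximum sunshine hours
g = global_radiation(g0, s, s0)
print(f"Estimated global radiation: {g:.1f} MJ/m^2/day")
```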
Procedia PDF Downloads 446
736 Bypassing Docker Transport Layer Security Using Remote Code Execution
Authors: Michael J. Hahn
Abstract:
Docker is a powerful tool used by many companies such as PayPal, MetLife, Expedia, and Visa. Docker works by bundling multiple applications, binaries, and libraries together on top of an operating system image called a container; the container runs on a Docker engine that in turn runs on top of a standard operating system, and this centralization saves a lot of system resources. In this paper, we demonstrate how to bypass Transport Layer Security and execute remote code within Docker containers built on a base image of Alpine Linux version 3.7.0 through the use of .apk files, owing to flaws in the Alpine Linux package management program. This exploit renders any application built using Docker with a base image of Alpine Linux vulnerable to unwanted outside forces.
Keywords: cloud, cryptography, Docker, Linux, security
Procedia PDF Downloads 198
735 Temporally Coherent 3D Animation Reconstruction from RGB-D Video Data
Authors: Salam Khalifa, Naveed Ahmed
Abstract:
We present a new method to reconstruct a temporally coherent 3D animation from single- or multi-view RGB-D video data using unbiased feature point sampling. Given RGB-D video data in the form of a 3D point cloud sequence, our method first extracts feature points using both color and depth information. In the subsequent steps, these feature points are used to match two 3D point clouds in consecutive frames independent of their resolution. Our new motion-vector-based dynamic alignment method then fully reconstructs a spatio-temporally coherent 3D animation. We perform extensive quantitative validation using novel error functions to analyze the results and show that, despite the limiting factors of temporal and spatial noise associated with RGB-D data, it is possible to extract temporal coherence and faithfully reconstruct a temporally coherent 3D animation from RGB-D video data.
Keywords: 3D video, 3D animation, RGB-D video, temporally coherent 3D animation
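A minimal sketch of the kind of frame-to-frame feature matching such a pipeline relies on: nearest-neighbor pairing of feature points between consecutive point clouds, yielding per-point motion vectors. The joint 6-D position-plus-color descriptor and the toy points are assumptions for illustration, not the paper's sampling scheme.

```python
# Match feature points across two consecutive RGB-D frames by nearest neighbor
# in a joint (x, y, z, r, g, b) descriptor space, producing motion vectors.
import math

def nearest_neighbor_matches(frame_a, frame_b):
    matches = []
    for p in frame_a:
        q = min(frame_b, key=lambda c: math.dist(p, c))           # closest descriptor
        motion = tuple(qi - pi for pi, qi in zip(p[:3], q[:3]))   # xyz displacement
        matches.append((p, q, motion))
    return matches

# Toy feature points: (x, y, z, r, g, b), colors normalized to [0, 1].
frame_t  = [(0.0, 0.0, 1.0, 0.9, 0.1, 0.1), (0.5, 0.2, 1.1, 0.1, 0.8, 0.1)]
frame_t1 = [(0.02, 0.01, 1.0, 0.9, 0.1, 0.1), (0.55, 0.2, 1.1, 0.1, 0.8, 0.1)]
for p, q, v in nearest_neighbor_matches(frame_t, frame_t1):
    print("motion vector:", v)
```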
Procedia PDF Downloads 373
734 ABET Accreditation Process for Engineering and Technology Programs: Detailed Process Flow from Criteria 1 to Criteria 8
Authors: Amit Kumar, Rajdeep Chakrabarty, Ganesh Gupta
Abstract:
This paper illustrates the detailed accreditation process of the Accreditation Board for Engineering and Technology (ABET) for accrediting engineering and technology programs. ABET is a non-governmental agency that accredits programs in engineering and technology, applied and natural sciences, and computing sciences. ABET was founded on 10 May 1932 by the Institute of Electrical and Electronics Engineers. International industries accept ABET-accredited institutes as having the highest standards in their academic programs. The accreditation comprises eight criteria: criterion 1 describes student outcome evaluations; criterion 2 measures the program's educational objectives; criterion 3 covers the student outcomes calculated from the marks obtained by students; criterion 4 establishes continuous improvement; criterion 5 focuses on the curriculum of the institute; criterion 6 concerns the faculty of the institute; criterion 7 measures the facilities provided by the institute; and criterion 8 focuses on institutional support for the staff of the institute. In this paper, we focus on the calculative part of each criterion with equations and suitable examples, the files and documentation required for each criterion, and the total workflow of the process. The references and the values used to illustrate the calculations are all taken from the samples provided on ABET's official website. In the final section, we also discuss the criterion-wise score weightage, followed by evaluation with timeframes and deadlines.
Keywords: Engineering Accreditation Committee, Computing Accreditation Committee, performance indicator, Program Educational Objective, ABET Criterion 1 to 7, IEEE, National Board of Accreditation, MOOCS, Board of Studies, stakeholders, course objective, program outcome, articulation, attainment, CO-PO mapping, CO-PO-SO mapping, PDCA cycle, degree certificates, course files, course catalogue
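As an illustration of the kind of outcome-attainment arithmetic such a process walks through, a small sketch computing course-outcome (CO) attainment levels from student pass fractions and rolling them up to program outcomes (PO) via a CO-PO mapping matrix. The thresholds, levels, and matrix weights are invented for the example, not ABET-prescribed numbers.

```python
# Toy CO/PO attainment: a CO attains level 3/2/1 when at least 70%/60%/50% of
# students score above the target; PO attainment is the CO-PO-weighted average.
def co_level(pass_fraction):
    if pass_fraction >= 0.7:
        return 3
    if pass_fraction >= 0.6:
        return 2
    if pass_fraction >= 0.5:
        return 1
    return 0

co_pass = {"CO1": 0.82, "CO2": 0.63, "CO3": 0.48}   # fraction of students above target
co_po = {"CO1": {"PO1": 3, "PO2": 1},               # invented mapping weights (1..3)
         "CO2": {"PO1": 2, "PO2": 2},
         "CO3": {"PO1": 1, "PO2": 3}}

levels = {co: co_level(f) for co, f in co_pass.items()}
for po in ("PO1", "PO2"):
    num = sum(co_po[co][po] * levels[co] for co in levels)
    den = sum(co_po[co][po] for co in levels)
    print(po, "attainment:", round(num / den, 2))   # PO1 ~2.17, PO2 ~1.17
```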
Procedia PDF Downloads 59
733 Video Games Technologies Approach for Their Use in the Classroom
Authors: Daniel Vargas-Herrera, Ivette Caldelas, Fernando Brambila-Paz, Rodrigo Montufar-Chaveznava
Abstract:
In this paper, we present the advances corresponding to the implementation of a set of educational materials based on video game technologies. Essentially, these materials correspond to projects developed, or under development, as bachelor theses of Computer Engineering students of the Engineering School. All materials are based on the Unity SDK, integrating devices such as the Kinect, Leap Motion, Oculus Rift, data gloves, and Google Cardboard. In detail, we present a virtual reality application for neuroscience students (suitable for neural rehabilitation) and virtual scenes for Google Cardboard, which will be used by psychology students for phobia treatment. The objective is for these materials to be located on a server and made available to all students, in the classroom or in the cloud, considering that smartphone use has become widespread among students.
Keywords: virtual reality, interactive technologies, video games, educational materials
Procedia PDF Downloads 657
732 Monitoring Land Productivity Dynamics of Gombe State, Nigeria
Authors: Ishiyaku Abdulkadir, Satish Kumar J
Abstract:
Land productivity is a measure of the greenness, health, and potential gain of above-ground biomass and is not related to agricultural productivity. Monitoring land productivity dynamics is essential to identify when and where the trend is characterized as degraded, so that mitigation measures can be taken. This research aims to monitor the land productivity trend of Gombe State between 2001 and 2015. QGIS was used to compute NDVI from AVHRR/MODIS datasets in a cloud-based method. The results show that land with improving productivity accounts for 773 sq. km (4.31%), stable productivity for 4,195.6 sq. km (23.40%), stable but stressed productivity for 18.7 sq. km (0.10%), early signs of declining productivity for 5,203.1 sq. km (29%), and declining productivity for 7,019.7 sq. km (39.2%), while water bodies occupy 718.7 sq. km (4%) of the state's area.
Keywords: above-ground biomass, dynamics, land productivity, man-environment relationship
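For reference, a minimal sketch of the NDVI computation behind such productivity maps, NDVI = (NIR − Red)/(NIR + Red), applied per pixel; the reflectance values are synthetic, not AVHRR/MODIS data.

```python
# Per-pixel NDVI from red and near-infrared reflectance bands (synthetic values):
# NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]; higher means greener biomass.
red = [[0.10, 0.12], [0.30, 0.05]]
nir = [[0.50, 0.45], [0.35, 0.40]]

ndvi = [[(n - r) / (n + r) for r, n in zip(row_r, row_n)]
        for row_r, row_n in zip(red, nir)]
for row in ndvi:
    print([round(v, 2) for v in row])  # e.g., 0.67 for a vigorously vegetated pixel
```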
Procedia PDF Downloads 145
731 Research and Implementation of Cross-domain Data Sharing System in Net-centric Environment
Authors: Xiaoqing Wang, Jianjian Zong, Li Li, Yanxing Zheng, Jinrong Tong, Mao Zhan
Abstract:
With the rapid development of network and communication technology, a great deal of data has been generated in different domains of a network. These data show a trend of increasing scale and more complex structure, so an effective and flexible cross-domain data-sharing system is needed. The Cross-domain Data Sharing System (CDSS) in a net-centric environment is composed of three sub-systems. The data distribution sub-system provides a data exchange service through publish-subscribe technology that supports asynchronous and multi-to-multi communication, which suits the needs of a dynamic, large-scale distributed computing environment. The access control sub-system adopts Attribute-Based Access Control (ABAC) technology to uniformly model data attributes such as subject, object, permission, and environment; it effectively monitors the activities of users accessing resources and ensures that legitimate users get effective access control rights within a legal time. The cross-domain access security negotiation sub-system automatically determines the access rights between different security domains during the interactive disclosure of digital certificates and access control policies, through trust policy management and negotiation algorithms, providing an effective means for establishing cross-domain trust relationships and access control in a distributed environment. The asynchronous, multi-to-multi, and loosely coupled communication features of the CDSS adapt well to data exchange and sharing in dynamic, distributed, and large-scale network environments. Next, we will give the CDSS new features to support the mobile computing environment.
Keywords: data sharing, cross-domain, data exchange, publish-subscribe
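A minimal in-process sketch of the publish-subscribe pattern the data distribution sub-system relies on, showing multi-to-multi fan-out; a real deployment would use a message broker, and the topic names here are invented.

```python
# Minimal publish-subscribe broker: many publishers and many subscribers per
# topic, decoupled from each other (multi-to-multi, asynchronous in spirit).
from collections import defaultdict
from typing import Callable

class Broker:
    def __init__(self):
        self._subs: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subs[topic]:   # fan-out to every subscriber
            handler(message)

broker = Broker()
broker.subscribe("domain-a/sensor", lambda m: print("analytics got", m))
broker.subscribe("domain-a/sensor", lambda m: print("archiver  got", m))
broker.publish("domain-a/sensor", {"id": 7, "value": 3.2})
```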
Procedia PDF Downloads 124
730 Enhancing Scalability in Ethereum Network Analysis: Methods and Techniques
Authors: Stefan K. Behfar
Abstract:
The rapid growth of the Ethereum network has brought forth the urgent need for scalable analysis methods to handle the increasing volume of blockchain data. In this research, we propose efficient methodologies for making Ethereum network analysis scalable. Our approach leverages a combination of graph-based data representation, probabilistic sampling, and parallel processing techniques to achieve unprecedented scalability while preserving critical network insights. Data Representation: We develop a graph-based data representation that captures the underlying structure of the Ethereum network. Each block transaction is represented as a node in the graph, while the edges signify temporal relationships. This representation ensures efficient querying and traversal of the blockchain data. Probabilistic Sampling: To cope with the vastness of the Ethereum blockchain, we introduce a probabilistic sampling technique. This method strategically selects a representative subset of transactions and blocks, allowing for concise yet statistically significant analysis. The sampling approach maintains the integrity of the network properties while significantly reducing the computational burden. Graph Convolutional Networks (GCNs): We incorporate GCNs to process the graph-based data representation efficiently. The GCN architecture enables the extraction of complex spatial and temporal patterns from the sampled data. This combination of graph representation and GCNs facilitates parallel processing and scalable analysis. Distributed Computing: To further enhance scalability, we adopt distributed computing frameworks such as Apache Hadoop and Apache Spark. By distributing computation across multiple nodes, we achieve a significant reduction in processing time and enhanced memory utilization. Our methodology harnesses the power of parallelism, making it well-suited for large-scale Ethereum network analysis. Evaluation and Results: We extensively evaluate our methodology on real-world Ethereum datasets covering diverse time periods and transaction volumes. The results demonstrate its superior scalability, outperforming traditional analysis methods. Our approach successfully handles the ever-growing Ethereum data, empowering researchers and developers with actionable insights from the blockchain. Case Studies: We apply our methodology to real-world Ethereum use cases, including detecting transaction patterns, analyzing smart contract interactions, and predicting network congestion. The results showcase the accuracy and efficiency of our approach, emphasizing its practical applicability in real-world scenarios. Security and Robustness: To ensure the reliability of our methodology, we conduct thorough security and robustness evaluations. Our approach demonstrates high resilience against adversarial attacks and perturbations, reaffirming its suitability for security-critical blockchain applications. Conclusion: By integrating graph-based data representation, GCNs, probabilistic sampling, and distributed computing, we achieve network scalability without compromising analytical precision. This approach addresses the pressing challenges posed by the expanding Ethereum network, opening new avenues for research and enabling real-time insights into decentralized ecosystems. 
Our work contributes to the development of scalable blockchain analytics, laying the foundation for sustainable growth and advancement in the domain of blockchain research and application.
Keywords: Ethereum, scalable network, GCN, probabilistic sampling, distributed computing
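A minimal sketch of the probabilistic sampling idea described above: reservoir sampling keeps a fixed-size, uniformly representative subset of a transaction stream too large to hold in memory. The transaction records are synthetic, and the paper's actual sampling strategy may differ.

```python
# Reservoir sampling (Algorithm R): keep k items uniformly at random from a
# stream of unknown length, so analysis runs on a bounded, representative subset.
import random

def reservoir_sample(stream, k, seed=42):
    rng = random.Random(seed)
    sample = []
    for i, tx in enumerate(stream):
        if i < k:
            sample.append(tx)
        else:
            j = rng.randint(0, i)      # item i survives with probability k/(i+1)
            if j < k:
                sample[j] = tx
    return sample

transactions = ({"block": b, "value": b * 0.01} for b in range(100_000))  # synthetic
print(reservoir_sample(transactions, k=5))
```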
Procedia PDF Downloads 76
729 An Integrated Lightweight Naïve Bayes Based Webpage Classification Service for Smartphone Browsers
Authors: Mayank Gupta, Siba Prasad Samal, Vasu Kakkirala
Abstract:
The internet world and its priorities have changed considerably in the last decade. Browsing on smartphones has increased manifold and is set to grow much more. Users spend considerable time browsing different websites, which gives a great deal of insight into their preferences. Instead of presenting plain information, classifying different aspects of browsing, such as Bookmarks, History, and the Download Manager, into useful categories would improve and enhance the user's experience. Most classification solutions are server-side, which involves maintaining servers and other heavy resources, has security constraints, and may miss contextual data during classification. On-device classification solves many such problems, but the challenge is to achieve classification accuracy under resource constraints. On-device classification can be much more useful for personalization, reducing dependency on cloud connectivity, and improving privacy and security. This approach provides more relevant results than current standalone solutions because it uses the content rendered by the browser, which is customized by the content provider based on the user's profile. This paper proposes a Naive Bayes based lightweight classification engine targeted at resource-constrained devices. Our solution integrates with the web browser, which in turn triggers the classification algorithm. Whenever a user browses a webpage, the solution extracts DOM tree data from the browser's rendering engine; this DOM data is dynamic, contextual, and secure data that cannot be replicated. The proposal extracts different features of the webpage and runs an algorithm to classify it into multiple categories. A Naive Bayes based engine is chosen in this solution for its inherent advantages in using limited resources compared to other classification algorithms such as Support Vector Machines and neural networks: Naive Bayes classification requires a small memory footprint and little computation, making it suitable for the smartphone environment. The solution can partition the model into multiple chunks, which in turn reduces memory usage compared with loading a complete model. Classification of webpages through the integrated engine is faster, more relevant, and more energy efficient than other standalone on-device solutions. The classification engine has been tested on Samsung Z3 Tizen hardware, integrated into the Tizen Browser, which uses the Chromium rendering engine. For this solution, an extensive dataset was sourced from dmoztools.net and cleaned; the cleaned dataset has 227.5K webpages divided into 8 generic categories ('education', 'games', 'health', 'entertainment', 'news', 'shopping', 'sports', 'travel'). Our browser-integrated solution has resulted in 15% less memory usage (due to the partition method) and 24% less power consumption in comparison with a standalone solution. The solution used 70% of the dataset for training the model and the remaining 30% for testing, achieving an average accuracy of ~96.3% across the 8 categories mentioned above. This engine can be further extended to suggest dynamic tags and to use the classification in differential use cases to enhance the browsing experience.
Keywords: chromium, lightweight engine, mobile computing, Naive Bayes, Tizen, web browser, webpage classification
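A minimal multinomial Naive Bayes sketch over bag-of-words features with Laplace smoothing, the family of classifier the engine describes; the tiny training set and categories are invented for illustration, not the dmoztools.net data.

```python
# Multinomial Naive Bayes with Laplace smoothing on bag-of-words features.
import math
from collections import Counter, defaultdict

train = [("goal match league score", "sports"),
         ("flight hotel beach tour", "travel"),
         ("match tour final cup",    "sports")]

word_counts = defaultdict(Counter)          # class -> word frequencies
class_counts = Counter()
vocab = set()
for text, label in train:
    words = text.split()
    word_counts[label].update(words)
    class_counts[label] += 1
    vocab.update(words)

def predict(text):
    scores = {}
    for label in class_counts:
        total = sum(word_counts[label].values())
        score = math.log(class_counts[label] / len(train))      # log prior
        for w in text.split():                                  # log likelihoods
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(predict("cup final score"))   # -> 'sports'
```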
Procedia PDF Downloads 163
728 Understanding Student Engagement through Sentiment Analytics of Response Times to Electronically Shared Feedback
Authors: Yaxin Bi, Peter Nicholl
Abstract:
The rapid advancement of information and communication technologies (ICT) is influencing every aspect of higher education, transforming traditional teaching, learning, assessment, and feedback into a new era of digital education. This also introduces many challenges in capturing and understanding student engagement with their studies in higher education. The School of Computing at Ulster University has developed a Feedback And Notification (FAN) online tool that has been used to send students links to personalized feedback on their submitted assessments and to record how frequently and how quickly students review the shared feedback. The student initially receives the feedback via a personal email containing a URL link that maps to the feedback created by the academic marker; this feedback is typically a Word or PDF report including comments and the final mark for the work submitted approximately three weeks before. When the student clicks on the link, the personal feedback is viewable in the browser. The FAN tool provides the academic marker with a report on when and how often a student viewed the feedback via the link. This paper presents an investigation into student engagement through analyzing the interaction timestamps and review frequency. We propose an approach to modeling interaction timestamps and use sentiment classification techniques to analyze the data collected over the last five years for a set of modules. The data studied spans a number of final-year and second-year modules in the School of Computing. The paper presents the details of the quantitative analysis methods and further describes student interactions with feedback over time on each module studied. We project students into different engagement groups based on the sentiment analysis results and then suggest early targeted interventions for the students seen to be under-performing via our proposed model.
Keywords: feedback, engagement, interaction modelling, sentiment analysis
Procedia PDF Downloads 103
727 Increasing Employee Productivity and Work Well-Being by Employing Affective Decision Support and a Knowledge-Based System
Authors: Loreta Kaklauskiene, Arturas Kaklauskas
Abstract:
This system for employee productivity and work well-being aims to maximise the work performance of personnel and boost well-being in offices. Affective computing, decision support, and knowledge-based systems were used in our research. The basis of this system is our European patent application (No. EP 4 020 134 A1) and two Lithuanian patents (LT 6841, LT 6866). Our study examines ways to support efficient employee productivity and well-being by employing a mass-customised, personalised office environment. Efficient employee performance and well-being are managed by changing mass-customised office environment factors such as air pollution levels, humidity, temperature, data, information, knowledge, activities, lighting colours and intensity, scents, media, games, videos, music, and vibrations. These aspects of management generate a customised, adaptive environment for users, taking into account their measured emotional, affective, and physiological (MAP) states, which are fed into the system. This research aims to develop an innovative method and system that analyses, customises, and manages a personalised office environment according to a specific user's MAP states in a cohesive manner. Various values of work spaces (e.g., employee utilitarian, hedonic, and perceived values) are also established throughout this process, based on the measurements that describe MAP states and other aspects related to the office environment. The main contribution of our research is the development of a real-time, mass-customised office environment to boost employee performance and well-being. Acknowledgment: This work was supported by Project No. 2020-1-LT01-KA203-078100 'Minimizing the influence of coronavirus in a built environment' (MICROBE) from the European Union's Erasmus+ program.
Keywords: effective decision support and a knowledge-based system, human resource management, employee productivity and work well-being, affective computing
Procedia PDF Downloads 109
726 Extending Smart City Infrastructure to Cover Natural Disasters
Authors: Nina Dasari, Satvik Dasari
Abstract:
Smart city solutions are being developed across the globe to transform urban areas. However, the infrastructure enablement for alerting natural disasters such as floods and wildfires is deficient. This paper discusses an innovative device that could be used as part of the smart city initiative to detect and provide alerts in case of floods at road crossings and wildfires. An Internet of Things (IoT) smart city node was designed, tested, and deployed in collaboration with the City of Austin. The end-to-end solution includes a 3G-enabled IoT device, flood and fire sensors, the cloud, a mobile app, and IoT analytics. Real-time data was collected and analyzed using IoT analytics to refine the solution over the past year. The results demonstrate that the proposed solution is reliable and provides accurate results. This low-cost solution is viable and can replace the current solution, which costs tens of thousands of dollars.
Keywords: analytics, internet of things, natural disasters, smart city
Procedia PDF Downloads 224
725 Bilateral Thalamic Hypodense Lesions in Computing Tomography
Authors: Angelis P. Barlampas
Abstract:
Purpose of Learning Objective: This case depicts the need for cooperation between the emergency department and the radiologist to achieve the best diagnostic result for the patient. The clinical picture must correlate well with the radiology report, and when it does not, this is not necessarily someone's fault. Careful interpretation and good knowledge of the limitations, advantages, and disadvantages of each imaging procedure are essential for the final diagnostic goal. Methods or Background: A patient was brought to the emergency department by his relatives. He was suddenly confused and his mental status was altered. He had no history of mental illness and was otherwise healthy. A computed tomography scan without contrast was done, but it was unremarkable. Because of high clinical suspicion of probable neurologic disease, he was admitted to the hospital. Results or Findings: Another CT was done after 48 hours. It showed a hypodense region in both thalamic areas. Taking into account that the first CT was normal but the initial clinical picture of the patient was alerting that something was wrong, the repeat CT exam is highly suggestive of a probable diagnosis of bilateral thalamic infarctions. Differential diagnosis: primary bilateral thalamic glioma, Wernicke encephalopathy, osmotic myelinolysis, Fabry disease, Wilson disease, Leigh disease, West Nile encephalitis, Creutzfeldt-Jakob disease, top of the basilar syndrome, deep venous thrombosis, mild to moderate cerebral hypotension, posterior reversible encephalopathy syndrome, neurofibromatosis type 1. Conclusion: As with any imaging procedure, CT has limitations. An acute ischemic attack may not be depicted on CT; a period of 24 to 48 hours has to elapse before any abnormality can be seen. So, despite the fact that there are no obvious findings of an ischemic episode, such as paresis or hemiparesis, one must be careful not to attribute the patient's clinical signs to other conditions, such as toxic effects, metabolic disorders, psychiatric symptoms, etc. Further investigation with MRI, or at least a repeated CT, must be done.
Keywords: CNS, CT, thalamus, emergency department
Procedia PDF Downloads 121
724 Using Inertial Measurement Unit to Evaluate the Balance Ability of Hikers
Authors: Po-Chen Chen, Tsung-Han Yang, Zhi-Wei Zheng, Shih-Tsang Tang
Abstract:
Falls are the most common accidents during mountain hiking, especially in high-altitude environments with unstable terrain or adverse weather. Balance ability is a crucial factor in hiking, effectively ensuring hiking safety and reducing the risk of injuries. If balance ability can be assessed simply and effectively, hikers can identify their weaknesses and conduct targeted training to improve their balance, thereby reducing injury risks. Given the widespread use of smartphones and their built-in inertial sensors, this project aims to develop a simple Inertial Measurement Unit (IMU) balance measurement technique based on smartphones. This will provide hikers with an easy-to-use, low-cost tool for assessing balance ability, monitoring training effects in real time, and continuously tracking balance ability through cloud data uploads, facilitating personal athletic performance tracking.
Keywords: balance, IMU, smartphone, wearable devices
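A minimal sketch of two common IMU-derived balance metrics, RMS sway acceleration and sway path length, computed from horizontal accelerometer samples. The sample stream is synthetic, and the project's actual metrics may differ.

```python
# Balance metrics from accelerometer samples (x, y) in m/s^2, 50 Hz synthetic data:
# RMS of the horizontal magnitude and total sway path length.
import math, random

random.seed(0)
samples = [(random.gauss(0, 0.15), random.gauss(0, 0.15)) for _ in range(250)]  # 5 s

magnitudes = [math.hypot(x, y) for x, y in samples]
rms = math.sqrt(sum(m * m for m in magnitudes) / len(magnitudes))
path = sum(math.dist(samples[i - 1], samples[i]) for i in range(1, len(samples)))

print(f"RMS sway acceleration: {rms:.3f} m/s^2, path length: {path:.1f}")
```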
Procedia PDF Downloads 37
723 Active Learning Based on Science Experiments to Improve Scientific Literacy
Authors: Kunihiro Kamataki
Abstract:
In this study, active learning based on simple science experiments was developed in a university class for freshmen in order to improve their scientific literacy. Through active learning based on a simple experiment generating a cloud in a plastic bottle, students increased their interest in global atmospheric problems and were able to discuss and find solutions to these problems positively from the various viewpoints of science and technology, politics, economy, diplomacy, and relations among nations. The results of their questionnaires and free descriptions of this class indicate that they improved their scientific literacy and their motivation to acquire knowledge in other classroom lectures. It is thus suggested that the science experiment is a strong tool for rapidly improving their intellectual curiosity, and that the connections linking the impression made by a science experiment with interest in a social problem are very important for enhancing the learning effect in this kind of education.
Keywords: active learning, scientific literacy, simple scientific experiment, university education
Procedia PDF Downloads 260
722 Estimating Tree Height and Forest Classification from Multi Temporal Risat-1 HH and HV Polarized Satellite Aperture Radar Interferometric Phase Data
Authors: Saurav Kumar Suman, P. Karthigayani
Abstract:
In this paper, tree height is estimated and forest types are classified from multi-temporal RISAT-1 Horizontal-Horizontal (HH) and Horizontal-Vertical (HV) polarized Synthetic Aperture Radar (SAR) data. The novelty of the proposed project is the combined use of the backscattering coefficients (sigma naught) and the coherence, within a Water Cloud Model (WCM). The approach uses three main steps: (a) extraction of the different forest parameter data from the Product.xml and BAND-META files and from the Grid-xxx.txt file that come with the HH and HV polarized data from ISRO (the Indian Space Research Organisation), which contain the parameters required during height estimation; (b) calculation of the vegetation and ground backscattering, coherence, and other forest parameters; and (c) classification of forest types using the ENVI 5.0 tool and ROI (Region of Interest) calculation.
Keywords: RISAT-1, classification, forest, SAR data
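For context, a minimal sketch of the Water Cloud Model mentioned above in its common form, σ⁰ = A·V·cosθ·(1 − τ²) + τ²·σ⁰_soil, with two-way canopy attenuation τ² = exp(−2·B·V/cosθ). The coefficients A and B and all input values are illustrative placeholders, not fitted RISAT-1 parameters.

```python
# Water Cloud Model forward simulation: total backscatter (linear units) as the
# vegetation contribution plus the soil contribution attenuated by the canopy.
import math

def wcm_sigma0(v, sigma0_soil, theta_deg, a=0.12, b=0.40):
    cos_t = math.cos(math.radians(theta_deg))
    tau2 = math.exp(-2.0 * b * v / cos_t)        # two-way canopy transmissivity
    return a * v * cos_t * (1.0 - tau2) + tau2 * sigma0_soil

# Placeholder inputs: canopy descriptor v, bare-soil backscatter, incidence angle.
for v in (0.5, 1.0, 2.0):
    s = wcm_sigma0(v, sigma0_soil=0.05, theta_deg=35.0)
    print(f"v={v}: sigma0 = {10 * math.log10(s):.1f} dB")
```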
Procedia PDF Downloads 406