Search results for: elliptic curve digital signature algorithm
1226 Arduino Pressure Sensor Cushion for Tracking and Improving Sitting Posture
Authors: Andrew Hwang
Abstract:
The average American worker sits for thirteen hours a day, often with poor posture and infrequent breaks, which can lead to health issues and back problems. The Smart Cushion was created to alert individuals to their poor posture and may potentially alleviate back problems and correct poor posture. The Smart Cushion is a portable, rectangular foam cushion with five strategically placed pressure sensors that utilizes an Arduino Uno circuit board and specifically designed software, allowing it to collect data from the five pressure sensors and store the data on an SD card. The data is then compiled into graphs and compared to controlled postures. Before volunteers sat on the cushion, their levels of back pain were recorded on a scale from 1 to 10. Data was recorded for an hour of sitting, and then a new, corrected posture was suggested. After using the suggested posture for an hour, the volunteers described their level of discomfort on a scale from 1 to 10. Different patterns of sitting postures were generated that were able to serve as early warnings of potential back problems. By using the Smart Cushion, the areas where different volunteers were applying the most pressure while sitting could be identified, and the sitting postures could be corrected. Further studies regarding the relationships between posture and specific regions of the body are necessary to better understand the origins of back pain; however, the Smart Cushion is sufficient for correcting sitting posture and preventing the development of additional back pain.
Keywords: Arduino Sketch Algorithm, biomedical technology, pressure sensors, Smart Cushion
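The cushion's classification step can be sketched in code. The sketch below is a hypothetical illustration of how five normalized pressure readings might be mapped to a posture label; the sensor layout, thresholds, and rules are assumptions for illustration, not the authors' actual Arduino Sketch Algorithm (which is not reproduced in the abstract).

```python
# Hypothetical sketch of the Smart Cushion's classification logic: five
# pressure readings (normalized 0-1) are compared against a "balanced"
# reference posture. Sensor positions, thresholds, and the rules are
# illustrative assumptions, not the authors' Arduino sketch.

SENSORS = ["left_thigh", "right_thigh", "left_hip", "right_hip", "tailbone"]

def classify_posture(readings, imbalance_threshold=0.25):
    """Flag a posture as 'leaning' if left/right pressure differs too much."""
    pressure = dict(zip(SENSORS, readings))
    left = pressure["left_thigh"] + pressure["left_hip"]
    right = pressure["right_thigh"] + pressure["right_hip"]
    if abs(left - right) > imbalance_threshold:
        return "leaning_left" if left > right else "leaning_right"
    if pressure["tailbone"] > max(left, right):
        return "slouching"
    return "balanced"

print(classify_posture([0.8, 0.4, 0.7, 0.3, 0.2]))  # strong left-side bias
```

On a real cushion this rule would run on each sample logged to the SD card, and the per-label time totals would feed the comparison graphs described above.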
Procedia PDF Downloads 134
1225 Parametric Influence and Optimization of Wire-EDM on Oil Hardened Non-Shrinking Steel
Authors: Nixon Kuruvila, H. V. Ravindra
Abstract:
Wire-cut Electro Discharge Machining (WEDM) is a special form of the conventional EDM process in which the electrode is a continuously moving conductive wire. The present study aims at determining the parametric influence and optimum process parameters of Wire-EDM using Taguchi's technique and a genetic algorithm. The variation of the performance parameters with machining parameters was mathematically modeled by the regression analysis method. The objective functions are Dimensional Accuracy (DA) and Material Removal Rate (MRR). Experiments were designed as per Taguchi's L16 Orthogonal Array (OA), wherein Pulse-on duration, Pulse-off duration, Current, Bed-speed, and Flushing rate were considered as the important input parameters. The matrix experiments were conducted for the material Oil Hardened Non-Shrinking Steel (OHNS) with a thickness of 40 mm. The results of the study reveal that, among the machining parameters, a lower pulse-off duration is preferable for achieving good overall performance. Regarding MRR, OHNS is best eroded with a medium pulse-off duration and a higher flush rate. Finally, a validation exercise was performed with the optimum levels of the process parameters. The results confirm the efficiency of the approach employed for optimization of process parameters in this study.
Keywords: dimensional accuracy (DA), regression analysis (RA), Taguchi method (TM), volumetric material removal rate (VMRR)
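The Taguchi analysis ranks parameter levels by their signal-to-noise (S/N) ratio; for a larger-the-better response such as MRR, S/N = -10·log10(mean(1/y²)). A minimal sketch, with illustrative MRR values rather than the paper's measurements:

```python
# Minimal sketch of the Taguchi signal-to-noise (S/N) analysis used to rank
# parameter levels. The "larger-the-better" formula applies to responses such
# as MRR; the sample values below are illustrative, not the paper's data.
import math

def sn_larger_the_better(values):
    """S/N = -10 * log10( mean(1 / y_i^2) ); higher is better."""
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in values) / len(values))

level_a = [2.1, 2.3, 2.2, 2.4]  # MRR at one pulse-off level (hypothetical)
level_b = [1.6, 1.5, 1.8, 1.7]  # MRR at another level (hypothetical)
print(sn_larger_the_better(level_a) > sn_larger_the_better(level_b))  # True
```

In a full L16 analysis, the mean S/N ratio is computed for each level of each factor, and the level with the highest mean S/N is selected as optimal.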
Procedia PDF Downloads 409
1224 Taxonomic Study and Environmental Ecology of Parrot (Rose Ringed) in City Mirpurkhas, Sindh, Pakistan
Authors: Aisha Liaquat Ali, Ghulam Sarwar Gachal, Muhammad Yusuf Sheikh
Abstract:
The rose-ringed parrot (Psittacula krameri), commonly known as Tota, belongs to the order Psittaciformes and the family Psittacidae. Its sub-species inhabiting Pakistan is Psittacula krameri borealis. The rose-ringed parrot has been categorized as a species of least concern. The core aim of the present study is to investigate the ecology and taxonomy of the rose-ringed parrot. Specimens for taxonomic identification were collected by a non-random method from various adjoining areas of City Mirpurkhas between February and June 2017. The parameters were measured with the help of a vernier caliper, a foot scale, and a digital weighing machine. The body parameters measured were body length, wing length, tail length, and mass in grams. During the present study, a total of 36 specimens were collected from different localities of City Mirpurkhas, of which 38.2% were male and 62.7% were female. The maximum population density of Psittacula krameri borealis (52.9%) was recorded at the Sindh Horticulture Research Station (fruit farm), Mirpurkhas, and the minimum (5.5%) in urban parks. It was observed that Psittacula krameri borealis was most densely populated during the months of May and June, when the temperature ranged between 20°C and 45°C. A female Psittacula krameri borealis was found to be the heaviest in body weight. The rose-ringed parrots captured during the study had green plumage and gray coverts, a red upper beak and a black lower beak, and a shorter tail in the female and a long tail in the male, which is consistent with Psittacula krameri borealis.
Keywords: Mirpurkhas Sindh Pakistan, environmental ecology, parrot, rose-ringed, taxonomy
Procedia PDF Downloads 175
1223 Decision Tree Based Scheduling for Flexible Job Shops with Multiple Process Plans
Authors: H.-H. Doh, J.-M. Yu, Y.-J. Kwon, J.-H. Shin, H.-W. Kim, S.-H. Nam, D.-H. Lee
Abstract:
This paper suggests a decision tree based approach for flexible job shop scheduling with multiple process plans, i.e., each job can be processed through alternative operations, each of which can be processed on alternative machines. The main decision variables are: (a) selecting the operation/machine pair; and (b) sequencing the jobs assigned to each machine. As an extension of the priority scheduling approach that selects the best priority rule combination after many simulation runs, this study suggests a decision tree based approach in which a decision tree is used to select a priority rule combination adequate for a specific system state, and hence the burdens required for developing simulation models and carrying out simulation runs can be eliminated. The decision tree based scheduling approach consists of construction and scheduling modules. In the construction module, a decision tree is constructed using a four-stage algorithm, and in the scheduling module, a priority rule combination is selected using the decision tree. To show the performance of the decision tree based approach suggested in this study, a case study was performed on a flexible job shop with reconfigurable manufacturing cells and a conventional job shop, and the results are reported by comparing it with individual priority rule combinations for the objectives of minimizing total flow time and total tardiness.
Keywords: flexible job shop scheduling, decision tree, priority rules, case study
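The two decisions named above, (a) rule selection from a system state and (b) sequencing by the chosen rule, can be sketched as follows. The hand-written two-level "tree", its thresholds, and the job data are illustrative stand-ins for the paper's four-stage tree-construction algorithm:

```python
# Hedged sketch of decision-tree-driven priority scheduling: branch on system
# state to pick a priority-rule combination, then sequence jobs on a machine
# with the chosen rule. Thresholds and rules are illustrative assumptions.

def select_rule(utilization, due_date_tightness):
    # A tiny hand-written "decision tree" over two state attributes.
    if due_date_tightness > 0.7:
        return "EDD"  # tight due dates -> earliest due date first
    return "SPT" if utilization > 0.8 else "FIFO"

def sequence(jobs, rule):
    # jobs: list of (job_id, processing_time, due_date, arrival_order)
    key = {"SPT": lambda j: j[1],   # shortest processing time
           "EDD": lambda j: j[2],   # earliest due date
           "FIFO": lambda j: j[3]}[rule]
    return [j[0] for j in sorted(jobs, key=key)]

jobs = [("J1", 5, 20, 0), ("J2", 2, 10, 1), ("J3", 8, 15, 2)]
rule = select_rule(utilization=0.9, due_date_tightness=0.3)  # -> "SPT"
print(rule, sequence(jobs, rule))
```

In the paper's approach, the tree itself is learned from simulated system states rather than hand-written, but the dispatch step has this shape.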
Procedia PDF Downloads 358
1222 Impact of E-Resources and Their Accessibility by Faculty and Research Scholars of Academic Libraries: A Case Study
Authors: M. Jaculine Mary
Abstract:
Today, electronic resources are considered an integral part of information sources for providing efficient services to people aspiring to acquire knowledge in different fields. E-resources are those resources which include documents in e-format that can be accessed via the Internet in a digital library environment. The present study focuses on the accessibility and use of e-resources by faculty and research scholars of academic libraries of Coimbatore, Tamil Nadu, India. The main objectives are to identify their purposes for using e-resources, the users' Information and Communication Technology (ICT) skills, their satisfaction with the availability of e-resources, their use of different e-resources, their overall satisfaction with e-resources, the impact of e-resources on their research, and the problems they face in accessing e-resources. The research methodology adopted to collect data for this study includes analysis of survey reports carried out by distributing questionnaires to the users. The findings of the research are based on the study of responses received from questionnaires distributed to a sample population of 200 users. Among the 200 respondents, 55 percent were research students and 45 percent were faculty members who used e-resources. It was found that a majority of the users agreed that relevant, updated information at a fast pace had influenced them to use e-resources. Most of the respondents were of the view that a greater number of computers in the library would facilitate quick learning. Academic libraries have to take steps to arrange various training and orientation programmes for research students and faculty members on the use of the available e-resources. This study helps librarians in the planning and development of e-resources to provide modern services to the users of their libraries.
The study recommends that measures should be taken to increase the accessibility of e-resource services among information seekers so that the best use is made of the electronic resources available in academic libraries.
Keywords: academic libraries, accessibility, electronic resources, satisfaction level, survey
Procedia PDF Downloads 142
1221 Integrated Gas Turbine Performance Diagnostics and Condition Monitoring Using Adaptive GPA
Authors: Yi-Guang Li, Suresh Sampath
Abstract:
Gas turbine performance degrades over time, and the degradation is greatly affected by environmental, ambient, and operating conditions. The engines may degrade slowly under favorable conditions and result in a waste of engine life if a scheduled maintenance scheme is followed. They may also degrade fast and fail before a scheduled overhaul if the conditions are unfavorable, resulting in serious secondary damage, loss of engine availability, and increased maintenance costs. To overcome these problems, gas turbine owners are gradually moving from scheduled maintenance to condition-based maintenance, where condition monitoring is one of the key supporting technologies. This paper presents an integrated adaptive GPA diagnostics and performance monitoring system developed at Cranfield University for gas turbine gas path condition monitoring. It has the capability to predict the performance degradation of major gas path components of gas turbine engines, such as compressors, combustors, and turbines, using gas path measurement data. It is also able to predict key engine performance parameters for condition monitoring, such as the turbine entry temperature, which cannot be directly measured. The developed technology has been implemented in the digital twin computer software Pythia to support the condition monitoring of gas turbine engines. The capabilities of the integrated GPA condition monitoring system are demonstrated in three test cases using a model gas turbine engine similar to the GE aero-derivative LM2500 engine widely used in power generation and marine propulsion. It shows that when the compressor of the model engine degrades, the adaptive GPA is able to predict the degradation and the changing engine performance accurately using gas path measurements.
The presented technology and software are generic, can be applied to different types of gas turbine engines, and provide crucial engine health and performance parameters to support condition monitoring and condition-based maintenance.
Keywords: gas turbine, adaptive GPA, performance, diagnostics, condition monitoring
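The linear core of gas path analysis can be illustrated in a few lines: measurement deviations z are related to component health-parameter deviations x through an influence coefficient matrix H (z = Hx), so degradation is estimated by inverting H. The 2x2 matrix and measurement deltas below are toy values, not an LM2500 model, and adaptive GPA iterates such a step on a nonlinear engine model:

```python
# Toy linear gas path analysis (GPA) step: estimate component health
# deviations x from measurement deviations z via z = H x. The influence
# matrix H and the deltas are illustrative assumptions only.

def solve_2x2(H, z):
    """Solve H x = z for a 2x2 matrix H by Cramer's rule."""
    (a, b), (c, d) = H
    det = a * d - b * c
    return [(d * z[0] - b * z[1]) / det, (a * z[1] - c * z[0]) / det]

H = [[-0.6, 0.1],   # d(flow deviation)/d(compressor eff., turbine eff.)
     [0.2, -0.8]]   # d(temperature deviation)/d(same health parameters)
z = [-0.012, 0.02]  # observed gas path measurement deltas (fractional)
x = solve_2x2(H, z)  # estimated health-parameter deviations
print([round(v, 4) for v in x])
```

Real GPA systems use many more measurements than health parameters and a nonlinear engine model, but the inversion idea is the same.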
Procedia PDF Downloads 88
1220 Training Undergraduate Engineering Students in Robotics and Automation through Model-Based Design Training: A Case Study at Assumption University of Thailand
Authors: Sajed A. Habib
Abstract:
Problem-based learning (PBL) is a student-centered pedagogy that originated in the medical field and has also been used extensively in other knowledge disciplines, with recognized advantages and limitations. PBL has been used in various undergraduate engineering programs with mixed outcomes. The current fourth industrial revolution (the digital era, or Industry 4.0) has made it essential for many science and engineering students to receive effective training in advanced courses such as industrial automation and robotics. This paper presents a case study at Assumption University of Thailand, where a PBL-like approach was used to teach some aspects of automation and robotics to selected groups of undergraduate engineering students. These students were given basic training in automation before participating in a subsequent training session in which they solved technical problems of increased complexity. The participating students' evaluation of the training sessions in terms of learning effectiveness, skills enhancement, and incremental knowledge following the problem-solving session was captured through a follow-up survey consisting of 14 questions and a 5-point scoring system. From the most recent training event, an overall 70% of the respondents indicated that their skill levels were enhanced to a much greater degree than before the training, whereas 60.4% of the respondents from the same event indicated that their incremental knowledge following the session was much greater than what they had prior to the training. The instructor-facilitator involved in the training events suggested that this method of learning was more suitable for senior/advanced-level students than for those at the freshman level, as certain skills needed to participate effectively in such problem-solving sessions are acquired over a period of time, not instantly.
Keywords: automation, industry 4.0, model-based design training, problem-based learning
Procedia PDF Downloads 134
1219 Blue Whale Body Condition from Photographs Taken over a 14-Year Period in the North East Pacific: Annual Variations and Connection to Measures of Ocean Productivity
Authors: Rachel Wachtendonk, John Calambokidis, Kiirsten Flynn
Abstract:
Large marine mammals can serve as an indicator of the overall state of the environment due to their long lifespan and apex position in marine food webs. Reductions in prey, driven by changes in environmental conditions, can have resounding impacts on the trophic system as a whole; this can manifest in reduced fat stores that are visible on large whales. Poor health can lead to reduced survivorship and fitness, both of which can be detrimental to a recovering population. A non-invasive technique was used to monitor blue whale health and to assess whether it changes with ocean conditions. Digital photographs of blue whales taken in the NE Pacific by Cascadia Research and collaborators from 2005-2018 (n=3,545) were scored for overall body condition based on visible vertebrae and body shape on a scale of 0-3, where a score of 0 indicated the best body condition and a score of 3 the poorest. The data were analyzed to determine whether there were patterns in the health of whales across years and whether overall poor health was related to oceanographic conditions and predictors of prey abundance on the California coast. Year was a highly significant factor in body condition (Chi-Square, p<0.001). The proportion of whales showing poor body condition (scores 2 and 3) was 33% overall but varied widely by year, from a low of 18% (2008) to a high of 55% (2015). The only two years in which >50% of animals were in poor body condition were 2015 and 2017 (no other year was above 45%). The 2015 maximum proportion of whales in poor body condition coincided with the marine heat wave that affected the NE Pacific in 2014-16 and impacted other whale populations. This indicates that the scoring method was an effective way to evaluate blue whale health and how whales respond to a changing ocean.
Keywords: blue whale, body condition, environmental variability, photo-identification
Procedia PDF Downloads 204
1218 Optimization of Proton Exchange Membrane Fuel Cell Parameters Based on Modified Particle Swarm Algorithms
Authors: M. Dezvarei, S. Morovati
Abstract:
In recent years, the increasing usage of electrical energy has provided a widespread field for investigating new methods to produce clean electricity with high reliability and cost management. Fuel cells are a new clean generation technology that produces electricity and thermal energy together with high performance and no environmental pollution. Given the expansion of fuel cell usage in different industrial networks, the identification and optimization of fuel cell parameters is particularly significant. This paper presents optimization of proton exchange membrane fuel cell (PEMFC) parameters based on modified particle swarm optimization with real-valued mutation (RVM) and clonal algorithms. Mathematical equations of this type of fuel cell are presented as the main model structure in the optimization process. Optimized parameters based on the clonal and RVM algorithms are compared with the desired values in the presence and absence of measurement noise. This paper shows that these methods can improve the performance of traditional optimization methods. Simulation results are employed to analyze and compare the performance of these methodologies in order to optimize the proton exchange membrane fuel cell parameters.
Keywords: clonal algorithm, proton exchange membrane fuel cell (PEMFC), particle swarm optimization (PSO), real-valued mutation (RVM)
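A bare-bones particle swarm optimizer of the kind underlying the paper's method is sketched below, minimizing a simple quadratic in place of the PEMFC voltage-error objective. The swarm size and the inertia/acceleration coefficients are common textbook choices, and the RVM and clonal modifications from the paper are omitted:

```python
# Standard PSO skeleton (no RVM or clonal refinements): each particle tracks
# its personal best, the swarm tracks a global best, and velocities blend
# inertia with attraction toward both. Objective and bounds are illustrative.
import random

def pso(objective, dim, bounds, n_particles=20, iters=100, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=objective)[:]
    for _ in range(iters):
        for i, p in enumerate(pos):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                       # inertia
                             + 1.5 * r1 * (pbest[i][d] - p[d])     # cognitive
                             + 1.5 * r2 * (gbest[d] - p[d]))       # social
                p[d] = min(hi, max(lo, p[d] + vel[i][d]))
            if objective(p) < objective(pbest[i]):
                pbest[i] = p[:]
                if objective(p) < objective(gbest):
                    gbest = p[:]
    return gbest

# Stand-in objective: the minimizer is (0.5, -1.2); PSO should recover it.
best = pso(lambda x: (x[0] - 0.5) ** 2 + (x[1] + 1.2) ** 2, dim=2, bounds=(-5, 5))
print([round(v, 2) for v in best])  # close to [0.5, -1.2]
```

For PEMFC fitting, the objective would be the squared error between the cell's polarization-curve model and measured voltages, with the model parameters as the particle coordinates.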
Procedia PDF Downloads 351
1217 Metagenomics-Based Molecular Epidemiology of Viral Diseases
Authors: Vyacheslav Furtak, Merja Roivainen, Olga Mirochnichenko, Majid Laassri, Bella Bidzhieva, Tatiana Zagorodnyaya, Vladimir Chizhikov, Konstantin Chumakov
Abstract:
Molecular epidemiology and environmental surveillance are parts of a rational strategy to control infectious diseases. They have been widely used in the worldwide campaign to eradicate poliomyelitis, which otherwise would be complicated by the inability to rapidly respond to outbreaks and determine sources of the infection. The conventional scheme involves isolation of viruses from patients and the environment, followed by their identification by nucleotide sequence analysis to determine phylogenetic relationships. This is a tedious and time-consuming process that yields definitive results when it may be too late to implement countermeasures. Because of the difficulty of high-throughput full-genome sequencing, most such studies are conducted by sequencing only capsid genes or their parts. Therefore, important information about the contribution of other parts of the genome, and of inter- and intra-species recombination, to viral evolution is not captured. Here we propose a new approach based on the rapid concentration of sewage samples with tangential flow filtration followed by deep sequencing and reconstruction of nucleotide sequences of viruses present in the samples. The entire nucleic acid content of each sample is sequenced, thus preserving in digital format the complete spectrum of viruses. A set of rapid algorithms was developed to separate deep sequence reads into discrete populations corresponding to each virus and assemble them into full-length consensus contigs, as well as to generate a complete profile of sequence heterogeneities in each of them. This provides an effective approach to study the molecular epidemiology and evolution of natural viral populations.
Keywords: poliovirus, eradication, environmental surveillance, laboratory diagnosis
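The contig-assembly step can be illustrated with a toy greedy overlap merger. Real pipelines must handle sequencing errors, reverse complements, and mixed viral populations; this sketch assumes exact overlaps and a single virus, and the reads are invented:

```python
# Toy greedy assembly: extend a contig by merging reads on their longest exact
# suffix/prefix overlap. Illustrative only; real assemblers tolerate errors,
# strand orientation, and mixtures of viral populations.

def merge(a, b, min_overlap=4):
    """Append b to a using the longest exact suffix/prefix overlap."""
    for k in range(min(len(a), len(b)), min_overlap - 1, -1):
        if a.endswith(b[:k]):
            return a + b[k:]
    return None  # no sufficient overlap

def assemble(reads):
    contig = reads[0]
    for read in reads[1:]:
        merged = merge(contig, read)
        if merged:
            contig = merged
    return contig

reads = ["ATGGCGTACG", "GTACGTTAGC", "TTAGCCGATA"]  # invented 10-mers
print(assemble(reads))
```

The per-virus read separation described above would happen before this step, binning reads by similarity so that each bin assembles into one consensus contig.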
Procedia PDF Downloads 281
1216 Fraud Detection in Credit Cards with Machine Learning
Authors: Anjali Chouksey, Riya Nimje, Jahanvi Saraf
Abstract:
Online transactions have increased dramatically in this new 'social-distancing' era. With online transactions, fraud in online payments has also increased significantly. Fraud is a significant problem in various industries such as insurance and banking. These frauds include leaking sensitive information related to the credit card, which can easily be misused. With the government also pushing online transactions, e-commerce is booming. But due to increasing fraud in online payments, e-commerce industries are suffering a great loss of trust from their customers. These companies are finding credit card fraud to be a big problem. People have started using online payment options and thus are becoming easy targets of credit card fraud. In this research paper, we discuss machine learning algorithms. We have used a decision tree, XGBOOST, k-nearest neighbour, logistic regression, random forest, and SVM on a dataset of online credit card transactions. We test all these algorithms for detecting fraud cases using the confusion matrix and F1 score, and calculate the accuracy score for each model to identify which algorithm can best be used in detecting frauds.
Keywords: machine learning, fraud detection, artificial intelligence, decision tree, k nearest neighbour, random forest, XGBOOST, logistic regression, support vector machine
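The evaluation metrics named above can be computed from scratch for a binary fraud/legitimate labeling; the labels below are illustrative (1 = fraud):

```python
# Confusion matrix, F1 score, and accuracy for binary fraud detection,
# computed from scratch. The true/predicted labels are illustrative.

def confusion_matrix(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def f1_score(y_true, y_pred):
    tp, fp, fn, _ = confusion_matrix(y_true, y_pred)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [0, 0, 1, 1, 0, 1, 0, 0]
y_pred = [0, 1, 1, 0, 0, 1, 0, 0]
print(confusion_matrix(y_true, y_pred), round(f1_score(y_true, y_pred), 3),
      accuracy(y_true, y_pred))
```

F1 matters more than raw accuracy here because fraud datasets are heavily imbalanced: a model that predicts "legitimate" for everything can score high accuracy while catching no fraud.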
Procedia PDF Downloads 148
1215 Exploring the Role of Media Activity Theory as a Conceptual Basis for Advancing Journalism Education: A Comprehensive Analysis of Its Impact on News Production and Consumption in the Digital Age
Authors: Shohnaza Uzokova Beknazarovna
Abstract:
This research study provides a comprehensive exploration of the Theory of Media Activity and its relevance as a conceptual framework for journalism education. The author offers a thorough review of existing literature on media activity theory, emphasizing its potential to enhance the understanding of the evolving media landscape and its implications for journalism practice. Through a combination of theoretical analysis and practical examples, the paper elucidates the ways in which the Theory of Media Activity can inform and enrich journalism education, particularly in relation to the interactive and participatory nature of contemporary media. The author presents a compelling argument for the integration of media activity theory into journalism curricula, emphasizing its capacity to equip students with a nuanced understanding of the reciprocal relationship between media producers and consumers. Furthermore, the paper discusses the implications of technological advancements on media production and consumption, highlighting the need for journalism educators to prepare students to navigate and contribute to the future of journalism in a rapidly changing media environment. Overall, this research paper offers valuable insights into the potential benefits of embracing the Theory of Media Activity as a foundational framework for journalism education. Its thorough analysis and practical implications make it a valuable resource for educators, researchers, and practitioners seeking to enhance journalism pedagogy in response to the dynamic nature of contemporary media.
Keywords: theory of media activity, journalism education, media landscape, media production, media consumption, interactive media, participatory media, technological advancements, media producers, media consumers, journalism practice, contemporary media environment, journalism pedagogy, media theory, media studies
Procedia PDF Downloads 47
1214 Traffic Forecasting for Open Radio Access Networks Virtualized Network Functions in 5G Networks
Authors: Khalid Ali, Manar Jammal
Abstract:
In order to meet the stringent latency and reliability requirements of the upcoming 5G networks, Open Radio Access Networks (O-RAN) have been proposed. The virtualization of O-RAN has allowed it to be treated as a Network Function Virtualization (NFV) architecture, while its components are considered Virtualized Network Functions (VNFs). Hence, intelligent Machine Learning (ML) based solutions can be utilized to apply different resource management and allocation techniques to O-RAN. However, intelligently allocating resources for O-RAN VNFs can prove challenging due to the dynamicity of traffic in mobile networks. Network providers need to dynamically scale the allocated resources in response to the incoming traffic. Elastically allocating resources can provide a higher level of flexibility in the network, in addition to reducing the OPerational EXpenditure (OPEX) and increasing resource utilization. Most of the existing elastic solutions are reactive in nature, despite the fact that proactive approaches are more agile, since they scale instances ahead of time by predicting the incoming traffic. In this work, we propose and evaluate traffic forecasting models based on ML algorithms. The algorithms aim at predicting future O-RAN traffic by using previous traffic data. A detailed analysis of the traffic data was carried out to validate the quality and applicability of the traffic dataset. Two ML models were then proposed and evaluated based on their prediction capabilities.
Keywords: O-RAN, traffic forecasting, NFV, ARIMA, LSTM, elasticity
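As a hedged illustration of the predict-from-history idea (far simpler than the ARIMA and LSTM models evaluated in the paper), an AR(1) baseline y_t = a·y_{t-1} + b can be fitted by least squares and used for a one-step forecast; the "traffic" samples below are synthetic:

```python
# Minimal autoregressive baseline for traffic forecasting: fit y_t = a*y_{t-1} + b
# by ordinary least squares, then predict one step ahead. The traffic numbers
# are synthetic (exact 10% growth), not O-RAN measurements.

def fit_ar1(series):
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    b = my - a * mx
    return a, b

traffic = [100, 110, 121, 133.1, 146.41]  # hypothetical load samples
a, b = fit_ar1(traffic)
forecast = a * traffic[-1] + b  # one-step-ahead prediction
print(round(a, 3), round(forecast, 2))
```

A proactive scaler would compare such a forecast against per-VNF capacity and spin instances up or down before the predicted load arrives, rather than reacting after a threshold is breached.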
Procedia PDF Downloads 226
1213 The Hospitals Residents Problem with Bounded Length Preference List under Social Stability
Authors: Ashish Shrivastava, C. Pandu Rangan
Abstract:
In this paper, we consider the Hospitals Residents problem with Social Stability (HRSS), where hospitals and residents can communicate only through the underlying social network. Residents and hospitals that do not have any social connection between them cannot communicate, and hence cannot form a social blocking pair with respect to a socially stable matching in an instance of the hospitals residents problem with social stability. In large-scale matching schemes such as the NRMP or the Scottish medical matching scheme, where the set of agents as well as the lengths of preference lists are very large, social stability is a useful notion in which the members of a blocking pair can block a matching if and only if they know of each other's existence. Thus the notion of social stability in the hospitals residents problem allows us to increase the cardinality of the matching without taking care of those blocking pairs that are not socially connected to each other. It is known that finding a maximum cardinality socially stable matching in an instance of HRSS is NP-hard. This motivates us to solve the problem with bounded length preference lists on one side. In this paper, we present a polynomial time algorithm to compute a maximum cardinality socially stable matching in an HRSS instance where residents can give preference lists of length at most two and hospitals can give preference lists of unbounded length. The preference lists of residents and hospitals are assumed to be strict.
Keywords: matching under preference, socially stable matching, the hospital residents problem, the stable marriage problem
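For orientation, the classical resident-proposing deferred acceptance algorithm for the hospitals/residents setting is sketched below, with residents restricted to preference lists of length at most two. The social-network constraint is modeled crudely by assuming unacquainted pairs have already been pruned from the lists; this is a baseline illustration of the setting, not the paper's maximum-cardinality algorithm for HRSS:

```python
# Resident-proposing deferred acceptance for hospitals/residents. Residents'
# lists here have length <= 2, matching the bounded-list setting; socially
# unconnected pairs are assumed pruned from the lists beforehand. This is the
# classical algorithm, not the paper's maximum-cardinality HRSS algorithm.

def hr_deferred_acceptance(res_prefs, hosp_prefs, capacity):
    free = list(res_prefs)                  # residents yet to be placed
    nxt = {r: 0 for r in res_prefs}         # next hospital to propose to
    assigned = {h: [] for h in hosp_prefs}
    while free:
        r = free.pop(0)
        if nxt[r] >= len(res_prefs[r]):
            continue                        # list exhausted: r stays unmatched
        h = res_prefs[r][nxt[r]]
        nxt[r] += 1
        assigned[h].append(r)
        if len(assigned[h]) > capacity[h]:
            worst = max(assigned[h], key=hosp_prefs[h].index)
            assigned[h].remove(worst)       # bump the least-preferred resident
            free.append(worst)
    return assigned

res_prefs = {"r1": ["h1", "h2"], "r2": ["h1"], "r3": ["h1", "h2"]}
hosp_prefs = {"h1": ["r2", "r1", "r3"], "h2": ["r1", "r3"]}
matching = hr_deferred_acceptance(res_prefs, hosp_prefs, {"h1": 1, "h2": 2})
print(matching)
```

Under social stability, a matching of larger cardinality than the stable one above may be admissible, since pairs with no social tie cannot block; exploiting that slack is exactly what the paper's algorithm does.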
Procedia PDF Downloads 277
1212 Citizen Journalist: A Case Study of Audience Participation in Mainstream TV News Production in India
Authors: Sindhu Manjesh
Abstract:
This paper examines citizen journalism in India, specifically the inclusion of user-generated content (UGC) by mainstream media, by focusing on the case study of the Citizen Journalist show on CNN-News18, a national television news broadcaster. It studies the processes of production involved in Citizen Journalist to find out how professional journalists and citizens interact to put together the show, in order to help readers understand the relationship between journalists and the public in the evolving media landscape of India, the world's largest democracy and a leader in the Global South. Using an in-depth case study approach involving newsroom ethnography, interviews, and an examination of Citizen Journalist content, it studies the implications of audience participation for traditional journalistic routines and values, specifically gatekeeping and objectivity. Citizen Journalist began with much fanfare and promise about including neglected citizen views and voices. Based on the evidence gathered, this study, however, argues that the claims made by CNN-News18 about democratizing news production through Citizen Journalist were overstated. It made some effort to do this and broadcast a lot of important stories. But overall, in terms of bringing in citizen voices, it did not live up to its initial promise, because the show was anchored in traditional journalistic norms and roles and the channel's economic imperatives. Professional journalists were, ironically, the producers of 'citizen journalism' in this case. Mainstream media's authority in defining journalistic work (who says what, where, when, why, and how) remains predominant in India. This has implications for democratic participation in India.
The example of Citizen Journalist (the model it followed, its partial success, and its many limitations) could well presage outcomes for other news outlets, in India and beyond, that copy its template.
Keywords: citizen journalism, digital journalism, participatory journalism, public sphere
Procedia PDF Downloads 119
1211 A Geometric Interpolation Scheme in Overset Meshes for the Piecewise Linear Interface Calculation Volume of Fluid Method in Multiphase Flows
Authors: Yanni Chang, Dezhi Dai, Albert Y. Tong
Abstract:
Piecewise linear interface calculation (PLIC) schemes are widely used in the volume-of-fluid (VOF) method to capture interfaces in numerical simulations of multiphase flows. Dynamic overset meshes can be especially useful in applications involving component motions and complex geometric shapes. In the present study, the VOF value of an acceptor cell is evaluated in a geometric way that transfers the fraction field between the meshes precisely, using interfaces reconstructed from the corresponding donor elements. In most overset interpolation schemes for continuous flow variables, the acceptor cell value is evaluated as a weighted average of its donors, with the weighting factors obtained by different algebraic methods. Unlike continuous flow variables, the VOF field is a step function near the interfaces, ranging rapidly from zero to unity. A geometric interpolation scheme of the VOF field in overset meshes for the PLIC-VOF method is proposed in this paper. It has been tested successfully in quadrilateral/hexahedral overset meshes by employing several VOF advection tests with imposed solenoidal velocity fields. The proposed algorithm has been shown to yield higher accuracy in mass conservation and interface reconstruction compared with three other algebraic ones.
Keywords: interpolation scheme, multiphase flows, overset meshes, PLIC-VOF method
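The geometric idea behind such a transfer, evaluating a VOF value as the exact area of a cell on the liquid side of a linear interface n·x = d, can be sketched by clipping the cell polygon against a half-plane (Sutherland-Hodgman) and applying the shoelace formula. The cell and interface values below are illustrative, and the full donor/acceptor bookkeeping of the paper's scheme is omitted:

```python
# Geometric VOF evaluation in 2D: clip a cell polygon against the half-plane
# n.x <= d (the liquid side of a PLIC interface) and take the clipped area.
# Illustrative sketch only; the paper's donor/acceptor transfer is not shown.

def clip_halfplane(poly, n, d):
    """Sutherland-Hodgman clip: keep the part of poly with n.x <= d."""
    out = []
    for i, p in enumerate(poly):
        q = poly[(i + 1) % len(poly)]
        fp = n[0] * p[0] + n[1] * p[1] - d
        fq = n[0] * q[0] + n[1] * q[1] - d
        if fp <= 0:
            out.append(p)
        if fp * fq < 0:  # edge crosses the interface: add intersection point
            t = fp / (fp - fq)
            out.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
    return out

def area(poly):
    """Shoelace formula for a simple polygon."""
    return abs(sum(poly[i][0] * poly[(i + 1) % len(poly)][1]
                   - poly[(i + 1) % len(poly)][0] * poly[i][1]
                   for i in range(len(poly)))) / 2.0

cell = [(0, 0), (1, 0), (1, 1), (0, 1)]                      # unit cell
vof = area(clip_halfplane(cell, n=(0.0, 1.0), d=0.3))        # liquid below y=0.3
print(round(vof, 6))
```

For an acceptor cell, one would clip the acceptor polygon against each donor's reconstructed interface and sum the liquid areas of the overlaps, which is what makes the transfer exact rather than an algebraic average.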
Procedia PDF Downloads 176
1210 Detecting Geographically Dispersed Overlay Communities Using Community Networks
Authors: Madhushi Bandara, Dharshana Kasthurirathna, Danaja Maldeniya, Mahendra Piraveenan
Abstract:
Community detection is an extremely useful technique in understanding the structure and function of a social network. The Louvain algorithm, which is based on the Newman-Girvan modularity optimization technique, is extensively used as a computationally efficient method to extract communities in social networks. It has been suggested that nodes in close geographical proximity have a higher tendency to form communities. Variants of the Newman-Girvan modularity measure, such as dist-modularity, try to normalize the effect of geographical proximity to extract geographically dispersed communities, at the expense of losing information about the geographically proximate communities. In this work, we propose a method to extract geographically dispersed communities while preserving the information about the geographically proximate communities, by analyzing the 'community network', where the centroids of communities are considered as network nodes. We suggest that the inter-community link strengths, normalized over the community sizes, may be used to identify and extract the 'overlay communities'. The overlay communities have relatively higher link strengths, despite being relatively far apart in their spatial distribution. We apply this method to the Gowalla online social network, which contains the geographical signatures of its users, and identify the overlay communities within it.
Keywords: social networks, community detection, modularity optimization, geographically dispersed communities
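The "community network" post-processing can be sketched as follows: inter-community edge weights are normalized by the community sizes, and pairs whose normalized strength exceeds a threshold are flagged as candidate overlay communities. The normalization w/(|A|·|B|) and the threshold are illustrative choices for this sketch, not necessarily the authors' exact definitions:

```python
# Sketch of the community-network step: given community sizes and
# inter-community edge counts (e.g. from a Louvain run), flag pairs whose
# size-normalized link strength is high. Normalization and threshold are
# illustrative assumptions.

def overlay_pairs(inter_edges, sizes, threshold=0.1):
    flagged = []
    for (a, b), w in inter_edges.items():
        strength = w / (sizes[a] * sizes[b])  # size-normalized link strength
        if strength > threshold:
            flagged.append((a, b, round(strength, 3)))
    return flagged

sizes = {"C1": 10, "C2": 8, "C3": 40}                      # community sizes
inter_edges = {("C1", "C2"): 12, ("C1", "C3"): 20, ("C2", "C3"): 5}
flagged = overlay_pairs(inter_edges, sizes)
print(flagged)
```

Note that C1-C3 has the most raw edges yet is not flagged: normalizing by size is what lets two small, strongly tied, possibly far-apart communities stand out as an overlay candidate.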
Procedia PDF Downloads 235
1209 Self-Tuning Dead-Beat PD Controller for Pitch Angle Control of a Bench-Top Helicopter
Authors: H. Mansor, S.B. Mohd-Noor, N. I. Othman, N. Tazali, R. I. Boby
Abstract:
This paper presents an improved robust Proportional Derivative (PD) controller for a 3-Degree-of-Freedom (3-DOF) bench-top helicopter using an adaptive methodology. A bench-top helicopter is a laboratory-scale helicopter used for experimental purposes, widely used in teaching laboratories and research. A Proportional Derivative controller has been developed for a 3-DOF bench-top helicopter by Quanser. Experiments showed that the transient response of the designed PD controller has a very large steady-state error, i.e., 50%, which is very serious. The objective of this research is to improve the performance of the existing pitch angle PD control on the bench-top helicopter by integrating the PD controller with an adaptive controller. Usually, a standard adaptive controller will produce zero steady-state error; however, the response time to reach the desired set point is large. Therefore, this paper proposes an adaptive deadbeat algorithm to overcome these limitations. An output response that is fast, robust, and updated online is expected. Performance comparisons have been performed between the proposed self-tuning deadbeat PD controller and the standard PD controller. The efficiency of the self-tuning deadbeat controller has been proven from the test results in terms of faster settling time, zero steady-state error, and the capability of the controller to be updated online.
Keywords: adaptive control, deadbeat control, bench-top helicopter, self-tuning control
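The deadbeat idea behind the controller above can be illustrated on the simplest possible case: for a known scalar plant, the control input is chosen so that the output reaches the set point exactly in one step, which is why deadbeat designs achieve zero steady-state error. This is a toy first-order sketch, not the paper's 3-DOF helicopter model, and the plant parameters below are illustrative.

```python
def deadbeat_step(y, r, a, b):
    """One-step deadbeat control law for the scalar plant
    y[k+1] = a*y[k] + b*u[k]: choose u so that the next output
    equals the set point r exactly."""
    return (r - a * y) / b

def simulate(y0, r, a, b, steps=5):
    """Run the closed loop and return the output trajectory."""
    y, history = y0, [y0]
    for _ in range(steps):
        u = deadbeat_step(y, r, a, b)
        y = a * y + b * u
        history.append(y)
    return history
```

In a self-tuning scheme, `a` and `b` would be re-estimated online from measurements before each control step, which is what allows the gains to be updated online as the abstract describes.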
Procedia PDF Downloads 325
1208 Comparison Study of Machine Learning Classifiers for Speech Emotion Recognition
Authors: Aishwarya Ravindra Fursule, Shruti Kshirsagar
Abstract:
In the intersection of artificial intelligence and human-centered computing, this paper delves into speech emotion recognition (SER). It presents a comparative analysis of machine learning models such as K-Nearest Neighbors (KNN), logistic regression, support vector machines (SVM), decision trees, ensemble classifiers, and random forests, applied to SER. The research employs four datasets: CREMA-D, SAVEE, TESS, and RAVDESS. It focuses on extracting salient audio signal features like Zero Crossing Rate (ZCR), Chroma_stft, Mel Frequency Cepstral Coefficients (MFCC), root mean square (RMS) value, and Mel spectrogram. These features are used to train and evaluate the models’ ability to recognize eight types of emotions from speech: happy, sad, neutral, angry, calm, disgust, fear, and surprise. Among the models, the Random Forest algorithm demonstrated superior performance, achieving approximately 79% accuracy. This suggests its suitability for SER within the parameters of this study. The research contributes to SER by showcasing the effectiveness of various machine learning algorithms and feature extraction techniques. The findings hold promise for the development of more precise emotion recognition systems in the future.
Keywords: comparison, ML classifiers, KNN, decision tree, SVM, random forest, logistic regression, ensemble classifiers
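Of the features listed above, the Zero Crossing Rate is the simplest to state precisely: the fraction of consecutive sample pairs in a frame whose signs differ. A minimal sketch (framing, windowing, and the other features such as MFCCs are omitted):

```python
def zero_crossing_rate(frame):
    """Zero Crossing Rate of one audio frame: the fraction of
    consecutive sample pairs whose signs differ. One of the features
    extracted alongside MFCC, chroma, RMS, and the Mel spectrogram."""
    crossings = sum(
        1 for s0, s1 in zip(frame, frame[1:])
        if (s0 >= 0) != (s1 >= 0)
    )
    return crossings / (len(frame) - 1)
```

Per-frame values like this one are typically aggregated (e.g. mean over frames) into the fixed-length feature vector fed to the classifiers.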
Procedia PDF Downloads 45
1207 Numerical Investigation of the Influence on Buckling Behaviour Due to Different Launching Bearings
Authors: Nadine Maier, Martin Mensinger, Enea Tallushi
Abstract:
Today, two types of launching bearings are generally used in the construction of large steel and steel-concrete composite bridges: sliding rockers and systems with hydraulic bearings. The advantages and disadvantages of the respective systems are under discussion. During incremental launching, the center of the webs of the superstructure is not perfectly in line with the center of the launching bearings due to unavoidable tolerances, which may have an influence on the buckling behavior of the web plates. These imperfections are not considered in the current design against plate buckling according to DIN EN 1993-1-5. It is therefore investigated whether the design rules have to take into account the eccentricities that occur during incremental launching, and whether this depends on the respective launching bearing. For this purpose, large-scale buckling tests were carried out at the Technical University of Munich on longitudinally stiffened plates under biaxial stresses with the two different types of launching bearings and eccentric load introduction. Based on the experimental results, a numerical model was validated. Currently, we are evaluating different parameters for both types of launching bearings, such as the load introduction length, load eccentricity, the distance between longitudinal stiffeners, the position of the rotation point of the spherical bearing used within the hydraulic bearings, web and flange thickness, and imperfections. The imperfection depends on the geometry of the buckling field and on whether local or global buckling occurs. This, as well as the size of the meshing, is taken into account in the numerical calculations of the parametric study. As a geometric imperfection, the scaled first buckling mode is applied. A bilinear material curve is used, so that a GMNIA analysis is performed to determine the load capacity.
Stresses and displacements are evaluated in different directions, and specific stress ratios are determined at the critical points of the plate at the last converging load step. To evaluate the introduction of the transverse load, the transverse stress concentration is plotted on a defined longitudinal section of the web. In the same way, the rotation of the flange is evaluated in order to show the influence of the different degrees of freedom of the launching bearings under eccentric load introduction and to allow an assessment of the case that is relevant in practice. The input and the output are automated and depend on the given parameters, so we are able to adapt our model to different geometric dimensions and load conditions. The programming is done with the help of APDL and a Python code. This allows us to evaluate and compare more parameters faster, while input and output errors are also avoided. It is therefore possible to evaluate a large spectrum of parameters in a short time, which allows a practical evaluation of different parameters for buckling behavior. This paper presents the results of the tests as well as the validation and parameterization of the numerical model and shows the first influences on the buckling behavior under eccentric and multiaxial load introduction.
Keywords: buckling behavior, eccentric load introduction, incremental launching, large-scale buckling tests, multiaxial stress states, parametric numerical modelling
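The automated parametric study described above relies on enumerating parameter combinations into named run cases before the APDL input is generated. A minimal sketch of that enumeration step (parameter names and levels are illustrative, not taken from the paper):

```python
from itertools import product

def build_cases(parameter_grid):
    """Enumerate the full factorial of study parameters into named run
    cases, mirroring the automated input generation for a parametric
    GMNIA study (each case would later be written to an APDL input)."""
    names = sorted(parameter_grid)
    cases = []
    for i, values in enumerate(product(*(parameter_grid[n] for n in names))):
        case = dict(zip(names, values))
        case['run_id'] = f'run_{i:03d}'
        cases.append(case)
    return cases
```

Generating the cases programmatically is what avoids the input errors the abstract mentions, since every run is derived from one declared grid.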
Procedia PDF Downloads 107
1206 A Novel Methodology for Browser Forensics to Retrieve Searched Keywords from Windows 10 Physical Memory Dump
Authors: Dija Sulekha
Abstract:
Nowadays, a good percentage of reported cybercrimes involve the usage of the Internet, directly or indirectly, for committing the crime. Usually, web browsers leave traces of browsing activities on the host computer’s hard disk, which can be used by investigators to identify Internet-based activities of the suspect. But criminals involved in organized crime disable the browser's file generation features to hide the evidence while carrying out illegal activities through the Internet. In such cases, even though browser files are not generated in the storage media of the system, traces of recent and ongoing activities are generated in the physical memory of the system. As a result, the analysis of the physical memory dump collected from the suspect's machine retrieves a lot of forensically crucial information related to the browsing history of the suspect. This information enables cyber forensic investigators to concentrate on a few highly relevant artefacts while doing the offline forensic analysis of the storage media. This paper addresses the reconstruction of web browsing activities by conducting live forensics to identify searched terms, downloaded files, visited sites, email headers, email ids, etc. from the physical memory dump collected from Windows 10 systems. Well-known entry points are available for retrieving all the above artefacts except searched terms. The paper describes a novel methodology to retrieve the searched terms from a Windows 10 physical memory dump. The searched terms retrieved in this way can be used for advanced file and keyword searches in the storage media files reconstructed from the file system recovery in offline forensics.
Keywords: browser forensics, digital forensics, live forensics, physical memory forensics
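At its core, locating a search term in a raw memory dump is a byte-pattern scan, complicated by the fact that browser process memory commonly holds strings in both ASCII and UTF-16LE. This is a generic carving sketch, not the paper's specific Windows 10 entry points.

```python
import re

def carve_strings(dump: bytes, term: bytes):
    """Locate an ASCII search term in a raw memory dump, checking both
    the ASCII encoding and the UTF-16LE encoding that Windows process
    memory commonly contains. Returns byte offsets per encoding."""
    utf16 = term.decode('ascii').encode('utf-16-le')
    return {
        'ascii': [m.start() for m in re.finditer(re.escape(term), dump)],
        'utf16le': [m.start() for m in re.finditer(re.escape(utf16), dump)],
    }
```

In practice the dump would be read in chunks and the offsets mapped back to the owning process pages, but the matching step is the same.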
Procedia PDF Downloads 116
1205 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution
Authors: Najrullah Khan, Athar Ali Khan
Abstract:
The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone generalized exponential (TPGE) distribution. A real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS codes have been provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data. Emphasis is placed on the modeling of data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete or 'censored' data encountered. Analytic approximation and simulation tools are covered here, but most of the emphasis is on Markov chain Monte Carlo methods, including the independent Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among the various optimization algorithms, the trust region method is found to be the best. In this paper, the TPGE model is also used to analyze the lifetime data in the Bayesian paradigm. Results are evaluated from the above-mentioned real survival data set. The analytic approximation and simulation methods are implemented using some software packages. It is clear from our findings that the simulation tools provide better results compared to those obtained by asymptotic approximation.
Keywords: Bayesian inference, JAGS, Laplace approximation, LaplacesDemon, posterior, R software, simulation
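The independent Metropolis algorithm mentioned above draws proposals from a fixed distribution rather than a random walk, accepting each with the usual Metropolis ratio corrected for the proposal density. A generic 1-D sketch (the paper runs this through JAGS; the exponential target in the usage below is an illustrative stand-in, not the TPGE posterior):

```python
import math
import random

def independent_metropolis(log_post, proposal_sample, proposal_logpdf,
                           n_iter=10000, seed=1):
    """Independence Metropolis sampler: proposals come from a fixed
    distribution q, and a move x -> y is accepted with probability
    min(1, pi(y) q(x) / (pi(x) q(y))), computed here in log space."""
    random.seed(seed)
    x = proposal_sample()
    chain = []
    for _ in range(n_iter):
        y = proposal_sample()
        log_alpha = (log_post(y) - log_post(x)
                     + proposal_logpdf(x) - proposal_logpdf(y))
        if math.log(random.random()) < log_alpha:
            x = y
        chain.append(x)
    return chain
```

For example, with an unnormalized Exp(2) target (log density -2x on x > 0) and an Exp(1) proposal, the chain mean settles near the true mean 0.5.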
Procedia PDF Downloads 535
1204 Need for Privacy in the Technological Era: An Analysis in the Indian Perspective
Authors: Amrashaa Singh
Abstract:
In the digital age and the large cyberspace, data protection and privacy have become major issues in this technological era. There was a time when social media and online shopping websites were treated as a blessing for the people. But now the tables have turned, and people have started to look at them with suspicion. They are becoming aware of the privacy implications, and they do not feel as safe as they initially did. The picture became clear for the people when Edward Snowden informed the world about the snooping the United States security agencies had been doing. After the Cambridge Analytica case, where the data of Facebook users were stored without their consent, doubts arose in the minds of people about how safe they actually are. In India, the case of the spyware Pegasus also raised a lot of concerns. It was used to snoop on many human rights activists and lawyers, and the company which invented the spyware claims that it only sells it to governments. The paper deals with the privacy concerns from the Indian perspective with an analytical methodology. The Supreme Court of India recently declared the right to privacy a Fundamental Right under Article 21 of the Constitution of India. Further, the Government is also working on the Data Protection Bill. The point to note is that India is still a developing country, and with the bill, the government aims at data localization. But there are doubts in the minds of many people that the Government would actually be snooping on the data of individuals. It looks more like an attempt to curb dissenters ‘lawfully’. The focus of the paper is on these issues in India in light of the European Union (EU) General Data Protection Regulation (GDPR). The Indian Data Protection Bill is also said to be loosely based on the EU GDPR. But how helpful these laws would actually be is another concern, since the economic and social conditions in both countries are very different.
The paper aims at discussing these concerns, how good or bad the intention of the government behind the bill is, and how nations can act together and draft common regulations so that there is some uniformity in the laws and their application.
Keywords: Article 21, data protection, dissent, fundamental right, India, privacy
Procedia PDF Downloads 114
1203 Numerical Investigation of Beam-Columns Subjected to Non-Proportional Loadings under Ambient Temperature Conditions
Authors: George Adomako Kumi
Abstract:
The response of structural members, when subjected to various forms of non-proportional loading, plays a major role in the overall stability and integrity of a structure. This research presents the outcome of a finite element investigation conducted using the finite element software ABAQUS to validate the experimental results on the elastic and inelastic behavior and strength of beam-columns subjected to axial loading, biaxial bending, and torsion under ambient temperature conditions. The rigorous ABAQUS finite element model accounts for material nonlinearity, geometric nonlinearity, deformations, and, more specifically, the contact behavior between the beam-columns and support surfaces. Comparisons of the three-dimensional model with the results of actual tests conducted, and with results from a solution algorithm developed through the use of the finite difference method, are established in order to verify the developed model. The results of this research seek to provide structural engineers with much-needed knowledge about the behavior of steel beam-columns and their response to various non-proportional loading conditions under ambient temperature conditions.
Keywords: beam-columns, axial loading, biaxial bending, torsion, ABAQUS, finite difference method
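The finite difference comparison algorithm mentioned above can be illustrated on the textbook case it generalizes: discretizing EI·v'' = -M(x) for a simply supported beam under uniform load with central differences and solving the resulting tridiagonal system (Thomas algorithm). This is a minimal sketch of the method, not the paper's beam-column algorithm, and the numbers in the usage are illustrative.

```python
def beam_deflection_fd(L, w, EI, n=200):
    """Central-difference solution of EI*v'' = -M(x), M = w*x*(L-x)/2,
    v(0) = v(L) = 0, on n intervals. Returns the maximum deflection
    magnitude (at midspan for a uniform load)."""
    h = L / n
    x = [i * h for i in range(1, n)]                      # interior nodes
    rhs = [-w * xi * (L - xi) / (2 * EI) * h * h for xi in x]
    # Tridiagonal system: v[i-1] - 2*v[i] + v[i+1] = rhs[i]
    a, b, c = -2.0, 1.0, 1.0                              # diag, sub, super
    m = n - 1
    cp, dp = [0.0] * m, [0.0] * m
    cp[0], dp[0] = c / a, rhs[0] / a
    for i in range(1, m):                                 # forward sweep
        denom = a - b * cp[i - 1]
        cp[i] = c / denom
        dp[i] = (rhs[i] - b * dp[i - 1]) / denom
    v = [0.0] * m
    v[-1] = dp[-1]
    for i in range(m - 2, -1, -1):                        # back-substitution
        v[i] = dp[i] - cp[i] * v[i + 1]
    return max(abs(vi) for vi in v)
```

The result can be checked against the closed-form midspan deflection 5wL⁴/(384EI), which the discrete solution reproduces to within the O(h²) truncation error.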
Procedia PDF Downloads 180
1202 Numerical Analysis of a Pilot Solar Chimney Power Plant
Authors: Ehsan Gholamalizadeh, Jae Dong Chung
Abstract:
The solar chimney power plant is a feasible solar thermal system which produces electricity from the Sun. The objective of this study is to investigate buoyancy-driven flow and heat transfer through a built pilot solar chimney system called the 'Kerman Project'. The system has a chimney with a height and diameter of 60 m and 3 m, respectively; the average radius of its solar collector is about 20 m, and its average collector height is about 2 m. A three-dimensional simulation was conducted to analyze the system, using computational fluid dynamics (CFD). In this model, the radiative transfer equation was solved using the discrete ordinates (DO) radiation model, taking into account non-gray radiation behavior. In order to model solar irradiation from the sun’s rays, the solar ray tracing algorithm was coupled to the computation via a source term in the energy equation. The model was validated by comparison with the experimental data of the Manzanares prototype and the performance of the built pilot system. Then, based on the numerical simulations, the velocity and temperature distributions through the system, the temperature profile of the ground surface, and the system performance were presented. The analysis accurately shows the flow and heat transfer characteristics through the pilot system and predicts its performance.
Keywords: buoyancy-driven flow, computational fluid dynamics, heat transfer, renewable energy, solar chimney power plant
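Before running a full CFD model like the one above, the output of a solar chimney plant is often bounded with the classic first-order model P ≈ η_coll · η_turb · (gH / (c_p·T_amb)) · G·A_coll. The sketch below plugs in the Kerman pilot geometry from the abstract (H = 60 m, collector radius ≈ 20 m); the efficiencies and irradiance are illustrative assumptions, not values from the paper.

```python
import math

def chimney_power(G, r_coll, H, eta_coll=0.5, eta_turb=0.8,
                  T_amb=293.15, g=9.81, c_p=1005.0):
    """First-order solar chimney output estimate:
    P = eta_coll * eta_turb * (g*H / (c_p*T_amb)) * G * A_coll,
    with G the solar irradiance [W/m^2] and A_coll the collector area."""
    A = math.pi * r_coll ** 2
    return eta_coll * eta_turb * (g * H / (c_p * T_amb)) * G * A
```

The g·H/(c_p·T_amb) factor is the ideal chimney efficiency, which is why tall chimneys dominate the performance of such plants; for the 60 m pilot it is only about 0.2%, giving an output on the order of a kilowatt under strong sun.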
Procedia PDF Downloads 262
1201 Morphometric Study of Human Anterior and Posterior Meniscofemoral Ligaments of the Knee Joint on Thiel Embalmed Cadavers
Authors: Mohammad Alobaidy, David Nicoll, Tracey Wilkinson
Abstract:
Background: Many patients suffer postoperative knee instability after total knee arthroplasty (joint replacement) involving posterior cruciate ligament (PCL) sacrificing or retaining, but it is not clear whether the meniscofemoral ligaments (MFLs) are retained during these procedures; their function in terms of knee stability is not well established in the literature. Purpose: A macroscopic, detailed, morphometric investigation of the anterior and posterior MFLs of the knee joint was undertaken to assist understanding of knee stability after total knee arthroplasty and ligament reconstruction. Methods: Dissection of eighty Thiel embalmed knees from 19 male and 21 female cadavers was conducted, mean age 77 (range 47-99 years). The origin and insertion of the anterior and posterior MFLs were measured using high-accuracy, calibrated digital Vernier calipers reading to 0.01 mm. Results: The means were: anterior meniscofemoral ligament (aMFL) length 28.4 ± 2.7 mm; posterior meniscofemoral ligament (pMFL) length 29 ± 3.7 mm; aMFL femoral width 6.4 ± 1.7 mm, mid-distance ligament width 4 ± 1.1 mm, meniscal ligament width 3.9 ± 1.2 mm; pMFL femoral width 5.6 ± 1.5 mm, mid-distance ligament width 4.1 ± 1.1 mm, meniscal ligament width 4.1 ± 1.3 mm. Some of the male measurements were larger than the female ones, with significant differences in the aMFL femoral length p<0.01 and pMFL femoral length p<0.007, and in the width of the pMFL mid-distance p<0.04. Conclusion: This study may help explore the role of the meniscofemoral ligaments in knee stability after total knee arthroplasty with a posterior cruciate ligament retaining prosthesis. Anatomical information from Thiel embalmed knees may aid orthopaedic surgeons in ligament reconstruction.
Keywords: anterior and posterior meniscofemoral ligaments, morphometric analysis, Thiel embalmed knees, knee arthroplasty
Procedia PDF Downloads 377
1200 A Policy Strategy for Building Energy Data Management in India
Authors: Shravani Itkelwar, Deepak Tewari, Bhaskar Natarajan
Abstract:
Energy consumption data plays a vital role in energy efficiency policy design, implementation, and impact assessment. Any demand-side energy management intervention's success relies on the availability of accurate, comprehensive, granular, and up-to-date data on energy consumption. The building sector, including residential and commercial, is one of the largest consumers of energy in India after the industrial sector. With economic growth and increasing urbanization, the building sector is projected to grow at an unprecedented rate, resulting in a 5.6-fold escalation in energy consumption by 2047 compared to 2017. Therefore, energy efficiency interventions will play a vital role in decoupling floor area growth from the associated energy demand, thereby increasing the need for robust data. In India, multiple institutions are involved in the collection and dissemination of data. This paper focuses on energy consumption data management in the building sector in India for both the residential and commercial segments. It evaluates the robustness of data available through administrative and survey routes to estimate the key performance indicators and identify critical data gaps for making informed decisions. The paper explores several issues in the data, such as lack of comprehensiveness, non-availability of disaggregated data, discrepancies between different data sources, inconsistent building categorization, and others. The identified data gaps are justified with appropriate examples. Moreover, the paper prioritizes the required data in order of relevance to policymaking and groups it into 'available', 'easy to get', and 'hard to get' categories.
The paper concludes with recommendations to address the data gaps by leveraging digital initiatives, strengthening institutional capacity, institutionalizing exclusive building energy surveys, and standardizing building categorization, among others, to strengthen the management of building sector energy consumption data.
Keywords: energy data, energy policy, energy efficiency, buildings
Procedia PDF Downloads 185
1199 New Stratigraphy Profile of Classic Nihewan Basin Beds, Hebei, Northern China
Authors: Arya Farjand
Abstract:
The Nihewan Basin is a critical region for understanding the Plio-Pleistocene paleoenvironment and its fauna in Northern China. The rich fossiliferous, fluvial-lacustrine sediments around Nihewan Village hosted the specimens known as the Classic Nihewan Fauna. The primary excavations in the 1920s-30s produced more than 2000 specimens, housed in the Tianjin and Paris museums. Nevertheless, the exact localities of the excavations, the fossil beds, and reliable ages remained ambiguous until recent paleomagnetic studies and extensive work at conjunction sites. In this study, for the first time, we successfully relocated some of the original excavation sites. We re-examined more than 1500 specimens held in the Tianjin Museum and cited their locality numbers and properties. During the field seasons of 2017-2019, we visited the Xiashagou Valley. By reading the descriptions of the original sites, using satellite imagery, and comparing them with the current geomorphology of the area, we established the exact locations of 26 of these sites and 17 fossil layers. Furthermore, by applying the latest technologies, such as GPS, compass, digital barometers, laser measurers, and an Abney level, we ensured the accuracy of the measurements. We surveyed a 133-meter thickness of the deposits. Ultimately, by applying the available paleomagnetic data for this section, we estimated the ages of the different horizons. The combination of our new data and previously published research presents a unique age control for the Classic Nihewan Fauna. These findings prove the hypothesis that the Classic Nihewan Fauna belongs to different horizons, ranging from before the Réunion up to after the Olduvai geomagnetic excursion (2.2-1.7 Mya).
Keywords: classic Nihewan basin fauna, Olduvai excursion, Pleistocene, stratigraphy
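The age estimation step described above typically boils down to linear age-depth interpolation between paleomagnetic tie points of known age. A minimal sketch; the tie-point depths in the usage below are hypothetical, and the ages are illustrative boundary ages for subchrons in the 2.2-1.7 Ma window.

```python
def interpolate_age(depth, tie_points):
    """Linear age-depth interpolation: tie_points is a list of
    (depth_m, age_Ma) pairs at dated horizons (e.g. paleomagnetic
    reversal boundaries); returns the interpolated age at `depth`."""
    pts = sorted(tie_points)  # sort by increasing depth
    for (d0, a0), (d1, a1) in zip(pts, pts[1:]):
        if d0 <= depth <= d1:
            return a0 + (a1 - a0) * (depth - d0) / (d1 - d0)
    raise ValueError("depth lies outside the dated interval")
```

A fossil layer logged at a measured depth between two reversal horizons can then be assigned a model age, which is how individual horizons of the Classic Fauna are bracketed.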
Procedia PDF Downloads 141
1198 Fault Detection and Isolation in Sensors and Actuators of Wind Turbines
Authors: Shahrokh Barati, Reza Ramezani
Abstract:
Due to countries' growing attention to renewable energy production, the demand for energy from renewable sources has gone up; among renewable energy sources, wind energy has shown the fastest growth in recent years. In this regard, in order to increase the availability of wind turbines, the use of a Fault Detection and Isolation (FDI) system is necessary. Wind turbines are subject to various faults, such as sensor faults, actuator faults, network connection faults, mechanical faults, and faults in the generator subsystem. Although sensors and actuators account for a large number of faults in wind turbines, they have been discussed less in the literature. Therefore, in this work, we focus our attention on designing a sensor and actuator fault detection and isolation algorithm and a fault-tolerant control system (FTCS) for wind turbines. The aim of this research is to propose a comprehensive fault detection and isolation system for the sensors and actuators of wind turbines based on data-driven approaches. To achieve this goal, the features of measurable signals in a real wind turbine are extracted under all conditions. The next step is feature selection among the extracted features. Features that lead to maximum separation are selected; the classifier networks are implemented in parallel, and the results of the classifiers are fused together. In order to maximize the reliability of the fault decision, the property of fault repeatability is used.
Keywords: FDI, wind turbines, sensors and actuators faults, renewable energy
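The final fusion step above, where parallel classifiers vote on the fault decision, can be sketched as decision-level majority voting. The tie-break toward the fail-safe label is an assumption made here (an ambiguous fused decision is treated as a fault, which favours availability-oriented reliability); the paper does not specify its fusion rule.

```python
from collections import Counter

def fuse_classifiers(votes, fail_safe='fault'):
    """Decision-level fusion of parallel fault classifiers by majority
    vote; ties are resolved toward the fail-safe label."""
    tally = Counter(votes)
    best = max(tally.values())
    winners = [label for label, n in tally.items() if n == best]
    return fail_safe if fail_safe in winners else winners[0]
```

Fault repeatability would then be exploited by requiring the fused label to persist over several consecutive windows before a fault is declared.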
Procedia PDF Downloads 400
1197 ID + PD: Training Instructional Designers to Foster and Facilitate Learning Communities in Digital Spaces
Authors: Belkis L. Cabrera
Abstract:
Contemporary technological innovations have reshaped possibility, interaction, communication, engagement, education, and training. Indeed, today, a high-quality technology enhanced learning experience can be transformative as much for the learner as for the educator-trainer. As innovative technologies continue to facilitate, support, foster, and enhance collaboration, problem-solving, creativity, adaptiveness, multidisciplinarity, and communication, the field of instructional design (ID) also continues to develop and expand. Shifting its focus from media to the systematic design of instruction, or rather from the gadgets and devices themselves to the theories, models, and impact of implementing educational technology, the evolution of ID marks a restructuring of the teaching, learning, and training paradigms. However, with all of its promise, this latter component of ID remains underdeveloped. The majority of ID models are crafted and guided by learning theories and, therefore, most models are constructed around student and educator roles rather than trainer roles. Thus, when these models or systems are employed for training purposes, they usually have to be re-fitted, tweaked, and stretched to meet the training needs. This paper is concerned with the training or professional development (PD) facet of instructional design and how ID models built on teacher-to-teacher interaction and dialogue can support the creation of professional learning communities (PLCs) or communities of practice (CoPs), which can augment learning and PD experiences for all. Just as technology is changing the face of education, so too can it change the face of PD within the educational realm. 
This paper not only provides a new ID model but, using innovative technologies such as Padlet and Thinkbinder, also presents a concrete example of how a traditional face-to-face, brick-and-mortar learning community can be transferred and transformed into the online context.
Keywords: communities of practice, e-learning, educational reform, instructional design, professional development, professional learning communities, technology, training
Procedia PDF Downloads 340