Search results for: computer based instruction
27856 Quantitative Analysis of Camera Setup for Optical Motion Capture Systems
Authors: J. T. Pitale, S. Ghassab, H. Ay, N. Berme
Abstract:
Biomechanics researchers commonly use marker-based optical motion capture (MoCap) systems to extract human body kinematic data. These systems use cameras to detect passive or active markers placed on the subject. The cameras use triangulation methods to form images of the markers, which typically require each marker to be visible by at least two cameras simultaneously. Cameras in a conventional optical MoCap system are mounted at a distance from the subject, typically on walls, ceiling as well as fixed or adjustable frame structures. To accommodate for space constraints and as portable force measurement systems are getting popular, there is a need for smaller and smaller capture volumes. When the efficacy of a MoCap system is investigated, it is important to consider the tradeoff amongst the camera distance from subject, pixel density, and the field of view (FOV). If cameras are mounted relatively close to a subject, the area corresponding to each pixel reduces, thus increasing the image resolution. However, the cross section of the capture volume also decreases, causing reduction of the visible area. Due to this reduction, additional cameras may be required in such applications. On the other hand, mounting cameras relatively far from the subject increases the visible area but reduces the image quality. The goal of this study was to develop a quantitative methodology to investigate marker occlusions and optimize camera placement for a given capture volume and subject postures using three-dimension computer-aided design (CAD) tools. We modeled a 4.9m x 3.7m x 2.4m (LxWxH) MoCap volume and designed a mounting structure for cameras using SOLIDWORKS (Dassault Systems, MA, USA). The FOV was used to generate the capture volume for each camera placed on the structure. A human body model with configurable posture was placed at the center of the capture volume on CAD environment. We studied three postures; initial contact, mid-stance, and early swing. The human body CAD model was adjusted for each posture based on the range of joint angles. Markers were attached to the model to enable a full body capture. The cameras were placed around the capture volume at a maximum distance of 2.7m from the subject. We used the Camera View feature in SOLIDWORKS to generate images of the subject as seen by each camera and the number of markers visible to each camera was tabulated. The approach presented in this study provides a quantitative method to investigate the efficacy and efficiency of a MoCap camera setup. This approach enables optimization of a camera setup through adjusting the position and orientation of cameras on the CAD environment and quantifying marker visibility. It is also possible to compare different camera setup options on the same quantitative basis. The flexibility of the CAD environment enables accurate representation of the capture volume, including any objects that may cause obstructions between the subject and the cameras. With this approach, it is possible to compare different camera placement options to each other, as well as optimize a given camera setup based on quantitative results.Keywords: motion capture, cameras, biomechanics, gait analysis
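The resolution/coverage tradeoff described in the abstract can be made concrete with a little geometry. The sketch below is a minimal illustration, not the SOLIDWORKS-based workflow used in the study: it estimates the per-pixel footprint at the subject plane from an assumed camera distance, field of view, and sensor resolution, and all numeric values are hypothetical.
```python
import math

def pixel_footprint_mm(distance_m: float, fov_deg: float, pixels: int) -> float:
    """Approximate width covered by one pixel (in mm) at a given distance.

    The visible width at the subject plane is 2 * d * tan(FOV / 2);
    dividing by the pixel count gives the per-pixel footprint.
    """
    visible_width_m = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    return visible_width_m / pixels * 1000.0

# Illustrative comparison: the same camera mounted close to vs. far from the subject.
for d in (1.5, 2.7):  # camera-to-subject distance in metres (2.7 m is the maximum used in the study)
    fp = pixel_footprint_mm(distance_m=d, fov_deg=56.0, pixels=2048)
    width = 2.0 * d * math.tan(math.radians(56.0) / 2.0)
    print(f"d = {d:.1f} m: visible width = {width:.2f} m, footprint = {fp:.2f} mm/pixel")
```
Moving the camera closer shrinks the footprint per pixel (better marker resolution) while shrinking the visible cross-section, which is exactly the tradeoff the CAD-based analysis quantifies.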
Procedia PDF Downloads 312
27855 Dynamic Ad-hoc Topologies for Mobile Robot Navigation Based on Non-Uniform Grid Maps
Authors: Peter Sauer, Thomas Hinze, Petra Hofstedt
Abstract:
To avoid obstacles in the surrounding environment and to navigate to a given target belong to the most important tasks for mobile robots. According to these tasks different data structures are suitable. To avoid near obstacles, occupancy grid maps are an ideal representation of the surroundings. For less fine grained tasks, such as navigating from one room to another in an apartment, pure grid maps are inappropriate. Grid maps are very detailed, calculating paths to navigate between rooms based on grid maps would take too long. Instead, graph-based data structures, so-called topologies, turn out to be a proper choice for such tasks. In this paper we present two methods to dynamically create topologies from grid maps. Both methods are based on non-uniform grid maps. The topologies are generated on-the-fly and can easily be modified to represent changes in the environment. This allows a hybrid approach to control mobile robots, where, depending on the situation and the current task, either the grid map or the generated topology may be used.Keywords: robot navigation, occupancy grids, topological maps, dynamic map creation
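A minimal sketch of one way to derive a coarse topology from an occupancy grid is shown below. It uses a uniform grid and simple 4-connectivity purely to illustrate the general grid-to-graph idea; the paper's methods operate on non-uniform grids and generate topologies on-the-fly, which this toy example does not reproduce. The occupancy threshold and toy map are assumptions.
```python
from typing import Dict, List, Tuple

Cell = Tuple[int, int]

def grid_to_topology(grid: List[List[float]], occ_threshold: float = 0.5) -> Dict[Cell, List[Cell]]:
    """Build an adjacency graph whose nodes are the free cells of an occupancy grid.

    grid[r][c] is the occupancy probability of a cell; cells at or above
    occ_threshold are treated as obstacles and excluded from the topology.
    """
    rows, cols = len(grid), len(grid[0])
    free = {(r, c) for r in range(rows) for c in range(cols) if grid[r][c] < occ_threshold}
    graph: Dict[Cell, List[Cell]] = {cell: [] for cell in free}
    for r, c in free:
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # 4-connectivity
            if (r + dr, c + dc) in free:
                graph[(r, c)].append((r + dr, c + dc))
    return graph

# Toy 3x3 map: 1.0 marks an obstacle in the middle column.
toy_grid = [[0.1, 1.0, 0.1],
            [0.1, 1.0, 0.1],
            [0.1, 0.2, 0.1]]
print(grid_to_topology(toy_grid)[(2, 0)])  # neighbours of the bottom-left free cell
```
Path planning on the resulting graph is much cheaper than planning on the raw grid, which is the motivation for the hybrid grid/topology approach described above.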
Procedia PDF Downloads 566
27854 Analyses of Adverse Drug Reactions Reported of Hospital in Taiwan
Authors: Yu-Hong Lin
Abstract:
Background: An adverse drug reaction (ADR) is an injury caused by taking a medicine. The severity of a reported ADR may be minor, but it can also be life-threatening. In order to provide healthcare professionals with a better reference for clinical practice, we collected and analyzed ADR reports from our hospital. Methods: This was a retrospective study of ADR reports filed from 2014 to 2015 in our hospital in Taiwan. We collected assessment items from the reports, including gender and age, reporting source, the Anatomical Therapeutic Chemical (ATC) classification of the suspected drugs, the type of adverse reaction, and the Naranjo score calculated with the Naranjo Adverse Drug Reaction Probability Scale. Results: The investigation included two hundred and seven ADR reports. Most ADRs occurred in the outpatient department (92%). The average age of the reported patients was 65.3 years, and patients younger than 65 years were in the majority (54%). Most reported patients were male (51%). According to the ATC classification system, the major classes of suspected drugs were cardiovascular system drugs (19%) and anti-infectives for systemic use (18%). Among the adverse reactions, dermatologic effects (35%) were the most common type. The Naranjo scores of most reports ranged from 1 to 4 points (91%), which indicates a possible correlation between the reported ADRs and the suspected drugs. Conclusions: ADR reports remain extremely important information for healthcare professionals. For that reason, we entered all reported ADR information into our hospital's computer system, which improves the safety of medication use by reminding prescribers of a patient's previously reported ADRs. No drug is administered without risk; therefore, all healthcare professionals have a responsibility to their patients, who themselves are becoming more aware of problems associated with drug therapy. Keywords: adverse drug reaction, Taiwan, healthcare professionals, safe use of medicines
Procedia PDF Downloads 235
27853 Inflation and Deflation of Aircraft's Tire with Intelligent Tire Pressure Regulation System
Authors: Masoud Mirzaee, Ghobad Behzadi Pour
Abstract:
An aircraft tire is designed to tolerate extremely heavy loads for a short duration. The number of tires increases with the weight of the aircraft, as it is needed to be distributed more evenly. Generally, aircraft tires work at high pressure, up to 200 psi (14 bar; 1,400 kPa) for airliners and higher for business jets. Tire assemblies for most aircraft categories provide a recommendation of compressed nitrogen that supports the aircraft’s weight on the ground, including a mechanism for controlling the aircraft during taxi, takeoff; landing; and traction for braking. Accurate tire pressure is a key factor that enables tire assemblies to perform reliably under high static and dynamic loads. Concerning ambient temperature change, considering the condition in which the temperature between the origin and destination airport was different, tire pressure should be adjusted and inflated to the specified operating pressure at the colder airport. This adjustment superseding the normal tire over an inflation limit of 5 percent at constant ambient temperature is required because the inflation pressure remains constant to support the load of a specified aircraft configuration. On the other hand, without this adjustment, a tire assembly would be significantly under/over-inflated at the destination. Due to an increase of human errors in the aviation industry, exorbitant costs are imposed on the airlines for providing consumable parts such as aircraft tires. The existence of an intelligent system to adjust the aircraft tire pressure based on weight, load, temperature, and weather conditions of origin and destination airports, could have a significant effect on reducing the aircraft maintenance costs, aircraft fuel and further improving the environmental issues related to the air pollution. An intelligent tire pressure regulation system (ITPRS) contains a processing computer, a nitrogen bottle with 1800 psi, and distribution lines. Nitrogen bottle’s inlet and outlet valves are installed in the main wheel landing gear’s area and are connected through nitrogen lines to main wheels and nose wheels assy. Controlling and monitoring of nitrogen will be performed by a computer, which is adjusted according to the calculations of received parameters, including the temperature of origin and destination airport, the weight of cargo loads and passengers, fuel quantity, and wind direction. Correct tire inflation and deflation are essential in assuring that tires can withstand the centrifugal forces and heat of normal operations, with an adequate margin of safety for unusual operating conditions such as rejected takeoff and hard landings. ITPRS will increase the performance of the aircraft in all phases of takeoff, landing, and taxi. Moreover, this system will reduce human errors, consumption materials, and stresses imposed on the aircraft body.Keywords: avionic system, improve efficiency, ITPRS, human error, reduced cost, tire pressure
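The temperature correction that motivates such a system follows from the gas law for a fixed tire volume: at constant volume, pressure scales with absolute temperature. The sketch below is a simplified illustration of that adjustment, not the ITPRS control logic itself; the 200 psi rating and 5 percent band come from the abstract, while the temperatures are invented for the example.
```python
def adjusted_pressure_psi(p_origin_psi: float, t_origin_c: float, t_dest_c: float) -> float:
    """Pressure the tire settles at when the gas temperature changes,
    assuming constant volume (Gay-Lussac's law): P2 = P1 * T2 / T1 in kelvin."""
    t1_k = t_origin_c + 273.15
    t2_k = t_dest_c + 273.15
    return p_origin_psi * t2_k / t1_k

def needs_servicing(p_nominal_psi: float, p_actual_psi: float, tolerance: float = 0.05) -> bool:
    """Flag the tire if it drifts outside the +/- 5 percent band around the rated pressure."""
    return abs(p_actual_psi - p_nominal_psi) > tolerance * p_nominal_psi

# Example: a 200 psi tire inflated at a 30 C origin airport and flown to a -5 C destination.
p_dest = adjusted_pressure_psi(200.0, 30.0, -5.0)
print(f"Pressure at destination: {p_dest:.1f} psi, service needed: {needs_servicing(200.0, p_dest)}")
```
The ~23 psi drop in this example is the kind of under-inflation an automated system would detect and correct before the next departure.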
Procedia PDF Downloads 256
27852 Ontology-Based Backpropagation Neural Network Classification and Reasoning Strategy for NoSQL and SQL Databases
Authors: Hao-Hsiang Ku, Ching-Ho Chi
Abstract:
Big data applications have become an imperative in many fields, and many researchers have devoted themselves to increasing classification accuracy and reducing time complexity. Hence, this study designs and proposes an ontology-based backpropagation neural network classification and reasoning strategy for NoSQL big data applications, called ON4NoSQL. ON4NoSQL is responsible for enhancing the performance of classification in NoSQL and SQL databases in order to build mass behavior models. The mass behavior models are built with MapReduce techniques and the Hadoop Distributed File System on a Hadoop service platform. The inference engine of ON4NoSQL is the ontology-based backpropagation neural network classification and reasoning strategy. Simulation results indicate that ON4NoSQL can efficiently construct a high-performance environment for data storing, searching, and retrieving. Keywords: Hadoop, NoSQL, ontology, backpropagation neural network, Hadoop Distributed File System
Procedia PDF Downloads 265
27851 The Importance of Developing Pedagogical Agency Capacities in Initial Teacher Formation: A Critical Approach to Advance in Social Justice
Authors: Priscilla Echeverria
Abstract:
This paper addresses initial teacher formation as a formative space in which pedagogy students develop a pedagogical agency capacity to contribute to social justice, considering ethical, political, and epistemic dimensions. This paper is structured by discussing first the concepts of agency, pedagogical interaction, and social justice from a critical perspective; and continues offering preliminary results on the capacity of pedagogical agency in novice teachers after the analysis of critical incidents as a research methodology. This study is motivated by the concern that responding to the current neoliberal scenario, many initial teacher formation (ITF) programs have reduced the meaning of education to instruction, and pedagogy to methodology, favouring the formation of a technical professional over a reflective or critical one. From this concern, this study proposes that the restitution of the subject is an urgent task in teacher formation, so it is essential to enable him in his capacity for action and advance in eliminating institutionalized oppression insofar as it affects that capacity. Given that oppression takes place in human interaction, through this work, I propose that initial teacher formation develops sensitivity and educates the gaze to identify oppression and take action against it, both in pedagogical interactions -which configure political, ethical, and epistemic subjectivities- as in the hidden and official curriculum. All this from the premise that modelling democratic and dialogical interactions are basic for any program that seeks to contribute to a more just and empowered society. The contribution of this study lies in the fact that it opens a discussion in an area about which we know little: the impact of the type of interactions offered by university teaching at ITF on the capacity of future teachers to be pedagogical agents. For this reason, this study seeks to gather evidence of the result of this formation, analysing the capacity of pedagogical agency of novice teachers, or, in other words, how capable the graduates of secondary pedagogies are in their first pedagogical experiences to act and make decisions putting the formative purposes that they are capable of autonomously defining before technical or bureaucratic issues imposed by the curriculum or the official culture. This discussion is part of my doctoral research, "The importance of developing the capacity for ethical-political-epistemic agency in novice teachers during initial teacher formation to contribute to social justice", which I am currently developing in the Educational Research program of the University of Lancaster, United Kingdom, as a Conicyt fellow for the 2019 cohort.Keywords: initial teacher formation, pedagogical agency, pedagogical interaction, social justice, hidden curriculum
Procedia PDF Downloads 103
27850 A Comparative Analysis of E-Government Quality Models
Authors: Abdoullah Fath-Allah, Laila Cheikhi, Rafa E. Al-Qutaish, Ali Idri
Abstract:
Many quality models have been used to measure the quality of e-government portals. However, the absence of an international consensus on e-government portal quality models results in many differences in terms of quality attributes and measures. The aim of this paper is to compare and analyze the existing e-government quality models proposed in the literature (those that are based on ISO standards and those that are not) in order to propose guidelines for building a good and useful e-government portal quality model. Our findings show that there is no e-government portal quality model based on the new international standard ISO 25010. Moreover, the existing quality models are not based on a best-practice model that would allow agencies both to measure e-government portal quality and to identify missing best practices for those portals. Keywords: e-government, portal, best practices, quality model, ISO, standard, ISO 25010, ISO 9126
Procedia PDF Downloads 562
27849 A High Quality Factor Filter Based on Quasi-Periodic Photonic Structure
Authors: Hamed Alipour-Banaei, Farhad Mehdizadeh
Abstract:
We report the design and characterization of an ultra-high-quality-factor filter based on a one-dimensional photonic-crystal Thue-Morse sequence structure. The behavior of the aperiodic photonic crystal array is numerically investigated, and we show that by changing the angle of the incident wave, the desired wavelengths can be tuned, so a tunable filter is realized. It is also shown that a high-quality-factor filter can be achieved in the telecommunication window around 1550 nm with a device based on the Thue-Morse structure. Simulation results show that the proposed structure has a quality factor of more than 100,000 and is suitable for DWDM communication applications. Keywords: Thue-Morse, filter, quality factor, photonic
Procedia PDF Downloads 575
27848 Film Therapy on Adolescent Body Image: A Pilot Study
Authors: Sonia David, Uma Warrier
Abstract:
Background: Film therapy is the use of commercial or non-commercial films to enhance healing for therapeutic purposes. Objectives: The mixed-method study aims to evaluate the effect of film-based counseling on body image dissatisfaction among adolescents to precisely ascertain the cause of the alteration in body image dissatisfaction due to the said intervention. Method: The one group pre-test post-test research design study using inferential statistics and thematic analysis is based on a pre-test post-test design conducted on 44 school-going adolescents between 13 and 17. The Body Shape Questionnaire (BSQ- 34) was used as a pre-test and post-test measure. The film-based counseling intervention model was used through individual counseling sessions. The analysis involved paired sample t-test used to examine the data quantitatively, and thematic analysis was used to evaluate qualitative data. Findings: The results indicated that there is a significant difference between the pre-test and post-test means. Since t(44)= 9.042 is significant at a 99% confidence level, it is ascertained that film-based counseling intervention reduces body image dissatisfaction. The five distinct themes from the thematic analysis are “acceptance, awareness, empowered to change, empathy, and reflective.” Novelty: The paper originally contributes to the repertoire of research on film therapy as a successful counseling intervention for addressing the challenges of body image dissatisfaction. This study also opens avenues for considering alteration of teaching pedagogy to include video-based learning in various subjects.Keywords: body image dissatisfaction, adolescents, film-based counselling, film therapy, acceptance and commitment therapy
Procedia PDF Downloads 299
27847 Dealing with the Spaces: Ultra Conservative Approach from Childhood to Adulthood
Authors: Maryam Firouzmandi, Moosa Miri
Abstract:
Common reasons for early tooth loss are trauma, extraction due to caries or periodontal disease and congenital missing. The remaining space after tooth loss may cause functional and esthetic problems. Therefore restorative dentists should attempt to manage these spaces using conservative methods. The goal is to restore the lost esthetic and function, prevent phonetic, self-esteem and personality problems and tongue habits. Preserving alveolar bone is also of great importance during the growth stage. Purpose: When deciding about the management of the missing tooth, space implants are contradicted until the completion of dentoalveolar development. Even in adulthood, due to systemic or periodontal problems or biological and economic issues, the implant might not be indicated. In this article, the alternative conservative restorative methods of space maintenance are going to be discussed. Essix retainers are made chair-side as easy as forming a custom bleaching tray with some modifications. They are esthetically acceptable and not expensive. These temporaries provide support for the lips but could not be used during function. Mini-screw-supported temporaries are another option for maintaining the space, especially after orthodontic treatment when there is a time lag between the termination of orthodontic treatment and definitive restoration. Two techniques will be presented for this kind of restoration: Denture tooth pontic or a composite crown. The benefits are alveolar bone preservation, Physiologic pressure on the alveolar ridge to increase its density and even can be retained until the completion of the definitive treatment. Bonded fixed partial denture includes Maryland bridge, fiber-reinforced composite bridge, resin-bonded bridge, and ceramic bonded bridge. These types of bridges are recommended to be used after a pubertal growth spurt and a recent meta-analysis considered their clinical success similar to conventional FDPs and implant-supported crowns. However, they have several advantages that are going to be discussed by presenting some clinical examples. Practical instruction on how to construct an FRC bridge and a novel chair-side Maryland bridge will be given by means of clinical cases. Clinical relevance: minimally invasive options should always be considered and destruction of healthy enamel and dentin during the preparation phase should be avoided as much as possible.Keywords: tooth missing, fiber-reinforced composite, Maryland, Essix retainers, screw-retained restoration
Procedia PDF Downloads 200
27846 Off-Policy Q-learning Technique for Intrusion Response in Network Security
Authors: Zheni S. Stefanova, Kandethody M. Ramachandran
Abstract:
With our increasing dependency on computer devices, we face the necessity of adequate, efficient, and effective mechanisms for protecting our networks. There are two main problems that Intrusion Detection Systems (IDS) attempt to solve: 1) detecting an attack by analyzing the incoming traffic and inspecting the network (intrusion detection), and 2) producing a prompt response when the attack occurs (intrusion prevention). It is critical to create an intrusion detection model that detects a breach in the system in time, and it is also challenging to make it provide an automatic response with an acceptable delay at every stage of the monitoring process. We cannot afford to adopt security measures that demand high computational power, and we cannot accept a mechanism that reacts with a long delay. In this paper, we propose an intrusion response mechanism based on artificial intelligence and, more precisely, reinforcement learning techniques (RLT). The RLT helps us create a decision agent that controls the process of interacting with the undetermined environment. The goal is to find an optimal policy that represents the intrusion response; therefore, we solve the reinforcement learning problem using a Q-learning approach. Our agent produces an optimal immediate response while evaluating the network traffic. This Q-learning approach establishes the balance between exploration and exploitation and provides a unique, self-learning, and strategic artificial intelligence response mechanism for IDS. Keywords: cyber security, intrusion prevention, optimal policy, Q-learning
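For readers unfamiliar with the technique, the tabular Q-learning update at the core of such an agent is sketched below. This is a generic illustration, not the authors' agent: the states, response actions, reward values, and hyper-parameters are placeholders invented for the example.
```python
import random
from collections import defaultdict

ACTIONS = ["allow", "rate_limit", "block_ip", "raise_alert"]  # hypothetical response actions
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount factor, exploration rate

Q = defaultdict(float)  # Q[(state, action)] -> expected return

def choose_action(state: str) -> str:
    """Epsilon-greedy policy: balances exploration and exploitation."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state: str, action: str, reward: float, next_state: str) -> None:
    """Off-policy Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

# One illustrative interaction: traffic flagged as a port scan, agent blocks it, gets rewarded.
s = "port_scan_suspected"
a = choose_action(s)
update(s, a, reward=1.0 if a == "block_ip" else -0.5, next_state="normal_traffic")
print(a, Q[(s, a)])
```
The update is off-policy because it bootstraps from the greedy value of the next state regardless of which action the epsilon-greedy policy actually takes, which matches the off-policy framing in the title.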
Procedia PDF Downloads 244
27845 Social Collaborative Learning Model Based on Proactive Involvement to Promote the Global Merit Principle in Cultivating Youths' Morality
Authors: Wera Supa, Panita Wannapiroon
Abstract:
This paper reports on the design of a social collaborative learning model based on proactive involvement to promote the global merit principle in cultivating youths' morality. The research proceeded in two phases: the first phase was to design the social collaborative learning model based on proactive involvement, and the second was to evaluate it. The sample group in this study consists of 15 experts in proactive participation, the moral merit principle, and the cultivation of youths' morality, drawn from the executive level, lecturers, and information and communication technology professionals, and selected using the purposive sampling method. Data were analyzed by arithmetic mean and standard deviation. The study found four significant factors in promoting hands-on collaboration within the global merit scheme in order to instill virtues in adolescents: 1) information and communication technology usage; 2) proactive involvement; 3) morality cultivation policy; and 4) the global merit principle. The experts agree that the social collaborative learning model based on proactive involvement is highly appropriate. Keywords: social collaborative learning, proactive involvement, global merit principle, morality
Procedia PDF Downloads 391
27844 Faculty Use of Geospatial Tools for Deep Learning in Science and Engineering Courses
Authors: Laura Rodriguez Amaya
Abstract:
Advances in science, technology, engineering, and mathematics (STEM) are viewed as important to countries’ national economies and their capacities to be competitive in the global economy. However, many countries experience low numbers of students entering these disciplines. To strengthen the professional STEM pipelines, it is important that students are retained in these disciplines at universities. Scholars agree that to retain students in universities’ STEM degrees, it is necessary that STEM course content shows the relevance of these academic fields to their daily lives. By increasing students’ understanding on the importance of these degrees and careers, students’ motivation to remain in these academic programs can also increase. An effective way to make STEM content relevant to students’ lives is the use of geospatial technologies and geovisualization in the classroom. The Geospatial Revolution, and the science and technology associated with it, has provided scientists and engineers with an incredible amount of data about Earth and Earth systems. This data can be used in the classroom to support instruction and make content relevant to all students. The purpose of this study was to find out the prevalence use of geospatial technologies and geovisualization as teaching practices in a USA university. The Teaching Practices Inventory survey, which is a modified version of the Carl Wieman Science Education Initiative Teaching Practices Inventory, was selected for the study. Faculty in the STEM disciplines that participated in a summer learning institute at a 4-year university in the USA constituted the population selected for the study. One of the summer learning institute’s main purpose was to have an impact on the teaching of STEM courses, particularly the teaching of gateway courses taken by many STEM majors. The sample population for the study is 97.5 of the total number of summer learning institute participants. Basic descriptive statistics through the Statistical Package for the Social Sciences (SPSS) were performed to find out: 1) The percentage of faculty using geospatial technologies and geovisualization; 2) Did the faculty associated department impact their use of geospatial tools?; and 3) Did the number of years in a teaching capacity impact their use of geospatial tools? Findings indicate that only 10 percent of respondents had used geospatial technologies, and 18 percent had used geospatial visualization. In addition, the use of geovisualization among faculty of different disciplines was broader than the use of geospatial technologies. The use of geospatial technologies concentrated in the engineering departments. Data seems to indicate the lack of incorporation of geospatial tools in STEM education. The use of geospatial tools is an effective way to engage students in deep STEM learning. Future research should look at the effect on student learning and retention in science and engineering programs when geospatial tools are used.Keywords: engineering education, geospatial technology, geovisualization, STEM
Procedia PDF Downloads 255
27843 Moderating Effects of Future Career Interest in Science and Gender on Students' Achievement in Basic Science in Oyo State, Nigeria
Authors: Segun Jacob Ogunkunle
Abstract:
The study examined the moderating effects of future career interest in science and gender on achievement in basic science of students taught in a simulated laboratory and enriched laboratory guide material environments. It adopted the pretest-posttest control group quasi experimental design with a 3x2x2 factorial matrix. A total of 277 (130 males, 147 females; ± 17 years) junior secondary three students randomly selected from six purposively selected secondary schools based on availability of functional computer and physics laboratories participated in the study. Data were collected using achievement test in basic science (r=0.87) and future career interest in science (r=0.99) while analysis of covariance and estimated marginal means were used to test three hypotheses at 0.05 level of significance. The findings of the study show that future career interest in science had significant effect on students’ achievement in basic science whereas gender did not. The interaction effect of future career interest in science and gender on students’ achievement in basic science was not significant. It is therefore recommended that prior knowledge of students’ future career interest in science could be used to improve participation in basic science practical in order to enhance achievement in biology, chemistry, and physics at the post-basic education level in Nigeria.Keywords: future career interest in science, basic science, simulated laboratory, enriched laboratory guide materials, achievement in science
Procedia PDF Downloads 163
27842 Sampling Two-Channel Nonseparable Wavelets and Its Applications in Multispectral Image Fusion
Authors: Bin Liu, Weijie Liu, Bin Sun, Yihui Luo
Abstract:
In order to solve the problem of lower spatial resolution and block effect in the fusion method based on separable wavelet transform in the resulting fusion image, a new sampling mode based on multi-resolution analysis of two-channel non separable wavelet transform, whose dilation matrix is [1,1;1,-1], is presented and a multispectral image fusion method based on this kind of sampling mode is proposed. Filter banks related to this kind of wavelet are constructed, and multiresolution decomposition of the intensity of the MS and panchromatic image are performed in the sampled mode using the constructed filter bank. The low- and high-frequency coefficients are fused by different fusion rules. The experiment results show that this method has good visual effect. The fusion performance has been noted to outperform the IHS fusion method, as well as, the fusion methods based on DWT, IHS-DWT, IHS-Contourlet transform, and IHS-Curvelet transform in preserving both spectral quality and high spatial resolution information. Furthermore, when compared with the fusion method based on nonsubsampled two-channel non separable wavelet, the proposed method has been observed to have higher spatial resolution and good global spectral information.Keywords: image fusion, two-channel sampled nonseparable wavelets, multispectral image, panchromatic image
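The coefficient-level fusion step described above — decompose both inputs, merge low- and high-frequency coefficients with different rules, then reconstruct — can be sketched as follows. Note that this sketch uses an ordinary separable 2D DWT from PyWavelets as a stand-in for the paper's two-channel nonseparable wavelet with dilation matrix [1,1;1,-1], together with simple averaging and max-absolute fusion rules; it illustrates the fusion framework only, not the proposed transform or filter banks.
```python
import numpy as np
import pywt

def fuse_images(intensity: np.ndarray, pan: np.ndarray, wavelet: str = "db2", level: int = 2) -> np.ndarray:
    """Fuse two registered images of equal size at the wavelet-coefficient level."""
    c_int = pywt.wavedec2(intensity, wavelet, level=level)
    c_pan = pywt.wavedec2(pan, wavelet, level=level)

    # Low-frequency (approximation) coefficients: averaging preserves spectral content.
    fused = [(c_int[0] + c_pan[0]) / 2.0]

    # High-frequency (detail) coefficients: keep the larger magnitude to preserve spatial detail.
    for det_int, det_pan in zip(c_int[1:], c_pan[1:]):
        fused.append(tuple(np.where(np.abs(di) >= np.abs(dp), di, dp)
                           for di, dp in zip(det_int, det_pan)))
    return pywt.waverec2(fused, wavelet)

# Toy example with random 64x64 "images" standing in for the MS intensity and panchromatic bands.
rng = np.random.default_rng(0)
ms_intensity, panchromatic = rng.random((64, 64)), rng.random((64, 64))
print(fuse_images(ms_intensity, panchromatic).shape)
```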
Procedia PDF Downloads 445
27841 Sleep Scheduling Schemes Integrating Relay Node and User Equipment in LTE-A
Authors: Chun-Chuan Yang, Jeng-Yueng Chen, Yi-Ting Mai, Hsieh-Hua Liu
Abstract:
By introduction of Relay Nodes (RNs), LTE-Advanced can provide enhanced coverage and capacity at cell edges and hot-spot areas. The authors have been researching the issue of power saving in mobile communications technology such as WiMax and LTE for some years. Based on the idea of Load-Based Power Saving (LBPS), three efficient power saving schemes for the user equipment (UE) were proposed in the authors’ previous work. In this paper, three revised schemes of the previous work in order to integrate RN and UE in power saving are proposed. Simulation study shows the proposed schemes can achieve significantly better power saving efficiency than the standard based scheme at the cost of moderately increased delay.Keywords: DRX, LTE-A, power saving, RN
Procedia PDF Downloads 527
27840 Performance Comparison of Wideband Covariance Matrix Sparse Representation (W-CMSR) with Other Wideband DOA Estimation Methods
Authors: Sandeep Santosh, O. P. Sahu
Abstract:
In this paper, the performance of the wideband covariance matrix sparse representation (W-CMSR) method is compared with that of other existing wideband direction-of-arrival (DOA) estimation methods. W-CMSR relies less on a priori knowledge of the number of incident signals than ordinary subspace-based methods. Consider the perturbation-free covariance matrix of the wideband array output: its diagonal elements are contaminated by the unknown noise variance, and the matrix is conjugate symmetric, i.e., its upper-right triangular elements can be represented by the lower-left triangular ones. Since the main diagonal elements are contaminated by the unknown noise variance, they are skipped, and the lower-left triangular elements are aligned column by column to obtain a measurement vector. Simulation results for W-CMSR are compared with those of other wideband DOA estimation methods such as the coherent signal subspace method (CSSM), Capon, l1-SVD, and JLZA-DOA. W-CMSR separates two signals very clearly, whereas CSSM, Capon, l1-SVD, and JLZA-DOA fail to separate the two signals clearly, and a number of pseudo peaks appear in the spectrum of l1-SVD. Keywords: W-CMSR, wideband direction of arrival (DOA), covariance matrix, electrical and computer engineering
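The construction of the measurement vector described above — skip the noise-contaminated diagonal and stack the strictly lower-triangular covariance elements column by column — can be written in a few lines. This is a generic sketch of that single step, assuming an already-estimated array covariance matrix; it is not the full W-CMSR algorithm.
```python
import numpy as np

def measurement_vector(R: np.ndarray) -> np.ndarray:
    """Stack the strictly lower-triangular elements of a covariance matrix column by column,
    skipping the diagonal (which carries the unknown noise variance)."""
    m = R.shape[0]
    return np.concatenate([R[j + 1:, j] for j in range(m - 1)])

# Example: sample covariance of a 4-sensor array output (random snapshots for illustration).
rng = np.random.default_rng(1)
snapshots = rng.standard_normal((4, 200)) + 1j * rng.standard_normal((4, 200))
R_hat = snapshots @ snapshots.conj().T / snapshots.shape[1]
y = measurement_vector(R_hat)
print(y.shape)  # (6,) = m*(m-1)/2 entries for m = 4 sensors
```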
Procedia PDF Downloads 474
27839 An Analytical Approach of Computational Complexity for the Method of Multifluid Modelling
Authors: A. K. Borah, A. K. Singh
Abstract:
In this paper we deal with the building blocks of the computer simulation of multiphase flows. The whole simulation procedure can be viewed as two super-procedures: the implementation of the VOF method and the solution of the Navier-Stokes equations. Moreover, a sequential code for a Navier-Stokes solver has been studied. Keywords: bi-conjugate gradient stabilized (Bi-CGSTAB), ILUT function, Krylov subspace, multifluid flows, preconditioner, SIMPLE algorithm
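Since the keywords name Bi-CGSTAB with an ILUT-type preconditioner as the Krylov-subspace building block, a minimal sketch of that linear-solve step is given below using SciPy's sparse solvers. The matrix here is a generic 1D Poisson-like system standing in for the actual discretized pressure equation of the VOF/Navier-Stokes solver, and the drop tolerance and fill factor are arbitrary example values.
```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Assemble a small sparse Poisson-like system as a stand-in for the pressure equation.
n = 200
A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Incomplete LU factorization with thresholding, used as a preconditioner.
ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
M = spla.LinearOperator(A.shape, matvec=ilu.solve)

# Preconditioned Bi-CGSTAB iteration.
x, info = spla.bicgstab(A, b, M=M)
print("converged" if info == 0 else f"info = {info}", np.linalg.norm(A @ x - b))
```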
Procedia PDF Downloads 530
27838 "I" on the Web: Social Penetration Theory Revised
Authors: Dr. Dionysis Panos, Department of Communication and Internet Studies, Cyprus University of Technology
Abstract:
The widespread use of New Media and particularly Social Media, through fixed or mobile devices, has changed in a staggering way our perception about what is “intimate" and "safe" and what is not, in interpersonal communication and social relationships. The distribution of self and identity-related information in communication now evolves under new and different conditions and contexts. Consequently, this new framework forces us to rethink processes and mechanisms, such as what "exposure" means in interpersonal communication contexts, how the distinction between the "private" and the "public" nature of information is being negotiated online, how the "audiences" we interact with are understood and constructed. Drawing from an interdisciplinary perspective that combines sociology, communication psychology, media theory, new media and social networks research, as well as from the empirical findings of a longitudinal comparative research, this work proposes an integrative model for comprehending mechanisms of personal information management in interpersonal communication, which can be applied to both types of online (Computer-Mediated) and offline (Face-To-Face) communication. The presentation is based on conclusions drawn from a longitudinal qualitative research study with 458 new media users from 24 countries for almost over a decade. Some of these main conclusions include: (1) There is a clear and evidenced shift in users’ perception about the degree of "security" and "familiarity" of the Web, between the pre- and the post- Web 2.0 era. The role of Social Media in this shift was catalytic. (2) Basic Web 2.0 applications changed dramatically the nature of the Internet itself, transforming it from a place reserved for “elite users / technical knowledge keepers" into a place of "open sociability” for anyone. (3) Web 2.0 and Social Media brought about a significant change in the concept of “audience” we address in interpersonal communication. The previous "general and unknown audience" of personal home pages, converted into an "individual & personal" audience chosen by the user under various criteria. (4) The way we negotiate the nature of 'private' and 'public' of the Personal Information, has changed in a fundamental way. (5) The different features of the mediated environment of online communication and the critical changes occurred since the Web 2.0 advance, lead to the need of reconsideration and updating the theoretical models and analysis tools we use in our effort to comprehend the mechanisms of interpersonal communication and personal information management. Therefore, is proposed here a new model for understanding the way interpersonal communication evolves, based on a revision of social penetration theory.Keywords: new media, interpersonal communication, social penetration theory, communication exposure, private information, public information
Procedia PDF Downloads 376
27837 Detection of Intentional Attacks in Images Based on Watermarking
Authors: Hazem Munawer Al-Otum
Abstract:
In this work, an efficient watermarking technique is proposed and can be used for detecting intentional attacks in RGB color images. The proposed technique can be implemented for image authentication and exhibits high robustness against unintentional common image processing attacks. It deploys two measures to discern between intentional and unintentional attacks based on using a quantization-based technique in a modified 2D multi-pyramidal DWT transform. Simulations have shown high accuracy in detecting intentionally attacked regions while exhibiting high robustness under moderate to severe common image processing attacks.Keywords: image authentication, copyright protection, semi-fragile watermarking, tamper detection
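The quantization-based embedding that such schemes rely on can be illustrated with plain quantization index modulation (QIM) on an array of transform coefficients. This is a generic sketch of the quantization idea only; the paper's method operates in a modified 2D multi-pyramidal DWT and adds two measures for discriminating intentional from unintentional attacks, none of which is reproduced here, and the step size is an arbitrary choice.
```python
import numpy as np

DELTA = 8.0  # quantization step; larger values are more robust but more visible

def qim_embed(coeffs: np.ndarray, bits: np.ndarray) -> np.ndarray:
    """Embed one bit per coefficient by snapping it to one of two interleaved lattices."""
    offsets = bits * (DELTA / 2.0)
    return np.round((coeffs - offsets) / DELTA) * DELTA + offsets

def qim_extract(coeffs: np.ndarray) -> np.ndarray:
    """Recover each bit by checking which lattice the coefficient is closer to."""
    d0 = np.abs(coeffs - np.round(coeffs / DELTA) * DELTA)
    d1 = np.abs(coeffs - (np.round((coeffs - DELTA / 2.0) / DELTA) * DELTA + DELTA / 2.0))
    return (d1 < d0).astype(int)

rng = np.random.default_rng(2)
coefficients = rng.uniform(-100, 100, size=64)   # stand-in for mid-frequency DWT coefficients
watermark = rng.integers(0, 2, size=64)
marked = qim_embed(coefficients, watermark)
print(np.array_equal(qim_extract(marked), watermark))  # True in the attack-free case
```
Tampered regions break the lattice structure, so mismatches between embedded and extracted bits localize the attacked blocks, which is the basic mechanism behind semi-fragile tamper detection.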
Procedia PDF Downloads 260
27836 A Cross-Sectional Study on Evaluation of Studies Conducted on Women in Turkey
Authors: Oya Isik, Filiz Yurtal, Kubilay Vursavus, Muge K. Davran, Metehan Celik, Munire Akgul, Olcay Karacan
Abstract:
This study aimed to discuss the causes of women's problems by bringing together the different disciplines engaged in women's studies. Other objectives were to solve these problems, to share information and experience about women across disciplines, and to reach the relevant task areas and decision-making mechanisms in practice. For this purpose, the proceedings presented at the Second Congress of Women's Studies, held in Adana, Turkey, on 28-30 November 2018, were evaluated. The document analysis model, one of the qualitative research methods, was used to evaluate the congress proceedings. A total of 86 papers were presented at the congress, and their topic distribution was determined. At the evaluation stage, the papers were classified according to their subjects and analyzed descriptively. According to the analysis, 64% of the 86 papers presented at the congress were review-based and 36% were research-based studies. When the distribution of the papers was examined by subject, the biggest share, 34.9% (13 review-based and 17 research-based papers), addressed women's issues through sociology, psychology, and philosophy. This was followed by economy, employment, organization, and non-governmental organizations with 20.9% (9 review-based and 9 research-based papers), arts and literature with 17.4% (15 review-based papers), and law with 12.8% (11 review-based papers). The smallest shares belonged to politics with one review-based paper (1.2%), health with two research-based papers (2.3%), history with two review-based papers (2.3%), religion with two review-based and one research-based papers (3.5%), and media-communication with two compilations and two research-based papers (4.7%). In the papers categorized under the main headings, women were examined in terms of gender and gender roles. According to the results, discrimination against women continues, changes in the law have not been put into practice sufficiently, women's levels of education and economic independence are insufficient, and violence against women continues to increase. To eliminate these problems and raise social awareness, it was concluded that scientific studies should be supported. Furthermore, support policies should be implemented jointly for women and men to make women visible in public life, and no tolerance or mitigation should be granted, for any reason or to any group, in cases of harassment and assault against women. Overall, it was determined that women in Turkey should be in a better position in the social, cultural, psychological, economic, and educational spheres, and that future studies should be carried out to improve women's rights and create a positive perspective. Keywords: gender, gender roles, sociology, psychology and philosophy, women studies
Procedia PDF Downloads 151
27835 Self-Tuning Robot Control Based on Subspace Identification
Authors: Mathias Marquardt, Peter Dünow, Sandra Baßler
Abstract:
The paper describes the use of subspace-based identification methods for auto-tuning of a state-space control system. The plant is an unstable but self-balancing transport robot. Because of the unstable character of the process, it has to be identified from closed-loop input-output data. Based on the identified model, a state-space controller combined with an observer is calculated. The subspace identification algorithm and the controller design procedure are combined into an auto-tuning method. The capability of the approach was verified in simulation experiments under different process conditions. Keywords: auto tuning, balanced robot, closed loop identification, subspace identification
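Once a discrete-time model (A, B, C) has been identified, the controller-plus-observer stage can be sketched with standard pole placement. The model matrices and pole locations below are invented placeholders; the sketch only shows how an observer-based state-feedback law u = -K x̂ is assembled from an identified model, not the paper's actual tuning procedure.
```python
import numpy as np
from scipy.signal import place_poles

# Hypothetical identified discrete-time model (unstable open loop, as for the balancing robot).
A = np.array([[1.05, 0.10],
              [0.00, 0.95]])
B = np.array([[0.0],
              [0.1]])
C = np.array([[1.0, 0.0]])

K = place_poles(A, B, [0.7, 0.75]).gain_matrix          # state-feedback gain
L = place_poles(A.T, C.T, [0.4, 0.45]).gain_matrix.T    # observer gain (faster poles)

x = np.array([[1.0], [0.0]])      # true plant state
x_hat = np.zeros((2, 1))          # observer estimate
for _ in range(50):
    u = -K @ x_hat                                     # control law from the estimated state
    y = C @ x                                          # measured output
    x = A @ x + B @ u                                  # plant update
    x_hat = A @ x_hat + B @ u + L @ (y - C @ x_hat)    # Luenberger observer update
print(np.round(x.ravel(), 4))  # state driven toward the origin
```
In an auto-tuning loop, the identified (A, B, C) would be refreshed from closed-loop data and the gains recomputed, which is the combination the abstract describes.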
Procedia PDF Downloads 385
27834 Multiscale Edge Detection Based on Nonsubsampled Contourlet Transform
Authors: Enqing Chen, Jianbo Wang
Abstract:
It is well known that the wavelet transform provides a very effective framework for multiscale edges analysis. However, wavelets are not very effective in representing images containing distributed discontinuities such as edges. In this paper, we propose a novel multiscale edge detection method in nonsubsampled contourlet transform (NSCT) domain, which is based on the dominant multiscale, multidirection edge expression and outstanding edge location of NSCT. Through real images experiments, simulation results demonstrate that the proposed method is better than other edge detection methods based on Canny operator, wavelet and contourlet. Additionally, the proposed method also works well for noisy images.Keywords: edge detection, NSCT, shift invariant, modulus maxima
Procedia PDF Downloads 494
27833 Survivable IP over WDM Network Design Based on 1 ⊕ 1 Network Coding
Authors: Nihed Bahria El Asghar, Imen Jouili, Mounir Frikha
Abstract:
Inter-datacenter transport network is very bandwidth and delay demanding. The data transferred over such a network is also highly QoS-exigent mostly because a huge volume of data should be transported transparently with regard to the application user. To avoid the data transfer failure, a backup path should be reserved. No re-routing delay should be observed. A dedicated 1+1 protection is however not applicable in inter-datacenter transport network because of the huge spare capacity. In this context, we propose a survivable virtual network with minimal backup based on network coding (1 ⊕ 1) and solve it using a modified Dijkstra-based heuristic.Keywords: network coding, dedicated protection, spare capacity, inter-datacenters transport network
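The 1 ⊕ 1 idea — code the protected flows together so that one shared backup copy can restore either of them without dedicated 1+1 spare capacity or re-routing delay — reduces at the data level to XOR recovery. The sketch below is a toy byte-level illustration of that recovery property for an assumed two-flow scenario; it is not a model of the proposed virtual-network design or of the Dijkstra-based heuristic.
```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two equal-length payloads."""
    return bytes(x ^ y for x, y in zip(a, b))

# Two working flows share one protection path that carries their XOR.
flow_a = b"datacenter-A payload"
flow_b = b"datacenter-B payload"
protection = xor_bytes(flow_a, flow_b)

# Suppose the working path of flow A fails: the destination still has flow_b
# and the coded copy, so flow A is recovered immediately, with no re-routing delay.
recovered_a = xor_bytes(protection, flow_b)
print(recovered_a == flow_a)  # True
```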
Procedia PDF Downloads 450
27832 Introducing a Dynamic Factor-Based Predictive Maintenance Model for Optimizing Resource Allocation in Complex Systems
Authors: Joel Leonard, Johann Wannenburg
Abstract:
Instead of relying on predetermined schedules or calendar intervals, Usage-Based Maintenance (UBM) is a proactive maintenance method that initiates maintenance operations based on the actual usage of the equipment. This is in contrast to traditional maintenance methods of corrective maintenance and time-based preventive maintenance. The precision and applicability of a usage-based model rely on its exactness when modelling the actual system. However, excessive simplification of usage-based maintenance models may omit all active failure modes that arise during equipment use, leading to inaccurate predictions and ineffective interventions. This paper presents a unique exponential-based predictive maintenance model that simultaneously considers the impact of multiple failure modes during equipment usage. The model integrates time-dependent deterioration dynamics and operational thresholds derived from basic principles. The proposed model integrates a growth-modulating factor governed by a baseline parameter against a system-specific usage threshold, which calculates the time to failure of the component under usage. Within the context of a chemical processing facility, a case study application of the formula is applied to data from the operating history of essential components subjected to four (4) different failure modes. These failure modes include fatigue, corrosion, erosion and wear. The study's findings demonstrate the formula's practical implementation and illustrate the impact that numerous failure modes working simultaneously on a single component might have due to operational stress on the time to failure. This technique provides a dynamic framework, allowing for predicting failure probability, optimizing maintenance schedules, and improving resource allocation in critical systems such as those in the chemical, aerospace, energy, and manufacturing industries.Keywords: usage-based maintenance (UBM), failure modes, predictive maintenance model, operational thresholds, and maintenance optimization
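To make the multi-failure-mode idea concrete, the sketch below combines several exponential degradation rates acting on one component and solves for the usage at which the combined degradation crosses an operational threshold. The functional form, baseline rates, and threshold value are illustrative assumptions, not the exact formula proposed in the paper; the point is only that adding simultaneous failure modes shortens the predicted time to failure.
```python
import math

# Hypothetical baseline degradation rates per usage-hour for each active failure mode.
FAILURE_MODE_RATES = {"fatigue": 2.0e-4, "corrosion": 1.2e-4, "erosion": 0.8e-4, "wear": 1.5e-4}

def usage_to_failure(rates: dict, threshold: float = 0.63) -> float:
    """Usage u at which the combined exponential degradation D(u) = 1 - exp(-(sum of rates) * u)
    reaches the operational threshold. More simultaneous modes -> shorter usable life."""
    total_rate = sum(rates.values())
    return -math.log(1.0 - threshold) / total_rate

print(f"All four modes active: {usage_to_failure(FAILURE_MODE_RATES):.0f} usage-hours")
print(f"Fatigue only:          {usage_to_failure({'fatigue': FAILURE_MODE_RATES['fatigue']}):.0f} usage-hours")
```
A maintenance planner would compare the predicted usage-to-failure against the planned operating profile to schedule interventions, which is the resource-allocation use case the abstract targets.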
Procedia PDF Downloads 4
27831 Thermal Conductivity of Al2O3/Water-Based Nanofluids: Revisiting the Influences of pH and Surfactant
Authors: Nizar Bouguerra, Ahmed Khabou, Sébastien Poncet, Saïd Elkoun
Abstract:
The present work focuses on the preparation and the stabilization of Al2O3-water based nanofluids. Though they have been widely considered in the past, to the best of our knowledge, there is no clear consensus about a proper way to prepare and stabilize them by the appropriate surfactant. In this paper, a careful experimental investigation is performed to quantify the combined influence of pH and the surfactant on the stability of Al2O3-water based nanofluids. Two volume concentrations of nanoparticles and three nanoparticle sizes have been considered. The good preparation and stability of these nanofluids are evaluated through thermal conductivity measurements. The results show that the optimum value for the thermal conductivity is obtained mainly by controlling the pH of the mixture and surfactants are not necessary to stabilize the solution.Keywords: nanofluid, thermal conductivity, pH, transient hot wire, surfactant, Al2O3, stability, dispersion, preparation
Procedia PDF Downloads 362
27830 Lightweight Hardware Firewall for Embedded System Based on Bus Transactions
Authors: Ziyuan Wu, Yulong Jia, Xiang Zhang, Wanting Zhou, Lei Li
Abstract:
The Internet of Things (IoT) is a rapidly evolving field involving a large number of interconnected embedded devices. In the design of an embedded System-on-Chip (SoC), the key issues are power consumption, performance, and security. However, easy-to-implement software and untrustworthy third-party IP cores may threaten the safety of hardware assets. Considering that illegal accesses and malicious attacks against SoC resources pass through the bus that integrates the IPs, we propose a Lightweight Hardware Firewall (LHF) to protect the SoC; it monitors and blocks offending bus transactions based on their physical addresses. Under the LHF architecture, this paper further refines two types of firewalls: the Destination Hardware Firewall (DHF) and the Source Hardware Firewall (SHF). The former is oriented toward fine-grained detection and configuration, and its core technology is the method of dynamic grading units. The SHF, in contrast, is designed around static entries to remain lightweight. Finally, we evaluate the hardware cost of the proposed method on both a Field-Programmable Gate Array (FPGA) and an IC. Compared with existing efforts, LHF introduces a bus latency of zero clock cycles for every read or write transaction when implemented on Xilinx Kintex-7 FPGAs. Meanwhile, DC synthesis results based on a TSMC 90 nm process show that the area is reduced by about 25% compared with the previous method. Keywords: IoT, security, SoC, bus architecture, lightweight hardware firewall, FPGA
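The core check such a firewall performs — compare the physical address of each bus transaction against a table of permitted ranges for the issuing master — is simple to express in software, even though the paper implements it in hardware with zero added bus latency. The sketch below is a behavioural model in the spirit of the static-entry SHF, with invented master names, address ranges, and permissions; it is not the RTL design.
```python
from typing import Dict, List, Tuple

# Static permission entries: master -> list of (base, limit, allowed operations).
PERMISSIONS: Dict[str, List[Tuple[int, int, str]]] = {
    "cpu":    [(0x0000_0000, 0x3FFF_FFFF, "rw")],
    "dma":    [(0x2000_0000, 0x2FFF_FFFF, "rw")],
    "3rd_ip": [(0x3000_0000, 0x300F_FFFF, "r")],   # untrusted IP gets a small read-only window
}

def transaction_allowed(master: str, address: int, op: str) -> bool:
    """Return True if the read ('r') or write ('w') falls inside a permitted range."""
    for base, limit, ops in PERMISSIONS.get(master, []):
        if base <= address <= limit and op in ops:
            return True
    return False

print(transaction_allowed("3rd_ip", 0x3000_1000, "r"))   # True: inside its window
print(transaction_allowed("3rd_ip", 0x3000_1000, "w"))   # False: write not permitted
print(transaction_allowed("dma",    0x4000_0000, "r"))   # False: outside its range
```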
Procedia PDF Downloads 68
27829 A Study of Learning to Enhance Ability Career Skills Consistent With Disruptive Innovation in Creative Strategies for Advertising Course
Authors: Kornchanok Chidchaisuwan
Abstract:
This project studies experience-based learning activities for enhancing the career skills and technical abilities of undergraduate students in a creative strategies for advertising course. The instructional model consisted of two learning approaches: 1) simulation-based learning, used to create virtual learning activity plans that mirror work at advertising companies, and 2) project-based learning, in which students carry out the actual creative process and focus on producing creative works to present on new media channels. The results of the learning management showed effects on the students in several areas: 1) the learners gained step-by-step experience of the advertising work process, and 2) the learners acquired skills from the actual work (learning by doing), improving their ability to create, present, and produce campaigns and to publish them on online media. Keywords: technical, advertising, presentation, career skills, experience, simulation based learning
Procedia PDF Downloads 93
27828 The Importance of Visual Communication in Artificial Intelligence
Authors: Manjitsingh Rajput
Abstract:
Visual communication plays an important role in artificial intelligence (AI) because it enables machines to understand and interpret visual information, similar to how humans do. This abstract explores the importance of visual communication in AI and emphasizes applications such as computer vision, object recognition, image classification, and autonomous systems. Going deeper into the deep learning techniques and neural networks that shape visual understanding, the abstract also discusses challenges facing visual interfaces for AI, such as data scarcity, domain optimization, and interpretability. The integration of visual communication with other modalities, such as natural language processing and speech recognition, is also explored. Overall, this abstract highlights the critical role that visual communication plays in advancing AI capabilities and enabling machines to perceive and understand the world around them, and it emphasizes its potential to enhance the effectiveness and accessibility of AI development and implementation by providing a comprehensive approach to integrating visual elements into AI systems, making them more user-friendly and efficient. In conclusion, visual communication is crucial in AI systems for object recognition, facial analysis, and augmented reality, but challenges such as data quality, interpretability, and ethics must be addressed. Visual communication enhances user experience, decision-making, accessibility, and collaboration, and developers can integrate visual elements to build efficient and accessible AI systems. Keywords: visual communication AI, computer vision, visual aid in communication, essence of visual communication
Procedia PDF Downloads 101
27827 'Performance-Based' Seismic Methodology and Its Application in Seismic Design of Reinforced Concrete Structures
Authors: Jelena R. Pejović, Nina N. Serdar
Abstract:
This paper presents an analysis of the performance-based seismic design method, intended to overcome the perceived disadvantages and limitations of the existing force-based seismic design approach in engineering practice. Bearing in mind the specificity of the earthquake as a load, and the fact that the seismic resistance of a structure depends solely on its behaviour in the nonlinear range, the traditional seismic design approach based on force and linear analysis is not adequate. The performance-based seismic design method is based on nonlinear analysis and can be used in everyday engineering practice. The paper presents the application of this method to an eight-story reinforced concrete building with a combined structural system (a reinforced concrete frame system in one direction and a reinforced concrete ductile wall system in the other). Nonlinear time-history analysis is performed on a spatial model of the structure using the program Perform 3D, with the structure subjected to forty real earthquake records. A large number of results were obtained for the considered building. It was concluded that, using this method, structural behaviour under earthquakes can be evaluated with a high degree of reliability. Significant differences were obtained in the response of the structure to the various earthquake records. The analysis also showed that the frame structural system did not perform well under earthquake records on soils such as sand and gravel, while the ductile wall system behaved satisfactorily on different soil types. Keywords: ductile wall, frame system, nonlinear time-history analysis, performance-based methodology, RC building
Procedia PDF Downloads 369