Search results for: wavelet domain
927 High School Gain Analytics From National Assessment Program – Literacy and Numeracy and Australian Tertiary Admission Rank Linkage
Authors: Andrew Laming, John Hattie, Mark Wilson
Abstract:
Nine Queensland Independent high schools provided deidentified student-matched ATAR and NAPLAN data for all 1217 ATAR graduates since 2020 who also sat NAPLAN at the school. Graduating cohorts from the nine schools contained a mean 100 ATAR graduates with previous NAPLAN data from their school. Excluded were vocational students (mean=27) and any ATAR graduates without NAPLAN data (mean=20). Based on Index of Community Socio-Educational Advantage (ICSEA) prediction, all schools had larger than predicted proportions of their students graduating with ATARs. There were an additional 173 students not releasing their ATARs to their school (14%), requiring these data to be inferred by schools. Gain was established by first converting each student's strongest NAPLAN domain to a statewide percentile, then subtracting this result from the final ATAR. The resulting 'percentile shift' was corrected for plausible ATAR participation at each NAPLAN level. The strongest NAPLAN domain had the highest correlation with ATAR (R²=0.58). RESULTS School mean NAPLAN scores fitted ICSEA closely (R²=0.97). Schools achieved a mean cohort gain of two ATAR rankings, but only 66% of students gained. This ranged from 46% of top-NAPLAN-decile students gaining, rising to 75% achieving gains outside the top decile. The 54% of top-decile students whose ATAR fell short of prediction lost a mean 4.0 percentiles (or 6.2 percentiles prior to correction for regression to the mean). 71% of students in smaller schools gained, compared to 63% in larger schools. NAPLAN variability in each of the 13 ICSEA1100 cohorts was 17%, with both intra-school and inter-school variation of these values extremely low (0.3% to 1.8%). Mean ATAR change between years in each school was just 1.1 ATAR ranks. This suggests consecutive school cohorts and ICSEA-similar schools share very similar distributions and outcomes over time. Quantile analysis of the NAPLAN/ATAR relationship revealed heteroscedasticity, but splines offered little additional benefit over simple linear regression. The NAPLAN/ATAR R² was 0.33. DISCUSSION Standardised data like NAPLAN and ATAR offer educators a simple no-cost progression metric to analyse performance in conjunction with their internal test results. Change is expressed in percentiles, or ATAR shift per student, which is intuitive to laypeople. Findings may also reduce ATAR/vocational stream mismatch, reveal the proportions of cohorts meeting or falling short of expectation, and demonstrate by how much. Finally, 'crashed' ATARs well below expectation are revealed, which schools can reasonably work to minimise. The percentile shift method is neither value-add nor a growth percentile. In the absence of exit NAPLAN testing, this metric is unable to discriminate academic gain from legitimate ATAR-maximising strategies. But by controlling for ICSEA, ATAR proportion variation and student mobility, it uncovers progression-to-ATAR metrics which are not currently publicly available. However achieved, ATAR maximisation is a sought-after private good. So long as standardised nationwide data are available, this analysis offers useful analytics for educators and reasonable predictivity when counselling subsequent cohorts about their ATAR prospects.
Keywords: NAPLAN, ATAR, analytics, measurement, gain, performance, data, percentile, value-added, high school, numeracy, reading comprehension, variability, regression to the mean
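A minimal numpy/pandas sketch of the 'percentile shift' gain metric described above: the strongest NAPLAN domain is converted to a statewide percentile and subtracted from the ATAR. All student data, the statewide distribution, and the flat participation adjustment below are invented for illustration; the study's actual correction for ATAR participation is not specified in the abstract.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical data standing in for the matched NAPLAN/ATAR records.
rng = np.random.default_rng(0)
statewide_naplan = rng.normal(600, 70, 50_000)          # assumed statewide scores
students = pd.DataFrame({
    "strongest_naplan": rng.normal(640, 60, 100),        # strongest-domain score
    "atar": np.clip(rng.normal(82, 10, 100), 30, 99.95),
})

# Convert each student's strongest NAPLAN domain to a statewide percentile.
students["naplan_pct"] = [
    stats.percentileofscore(statewide_naplan, s) for s in students["strongest_naplan"]
]

# Raw percentile shift: ATAR (itself a percentile-like rank) minus NAPLAN percentile.
students["shift_raw"] = students["atar"] - students["naplan_pct"]

# Placeholder correction for plausible ATAR participation at each NAPLAN level.
participation_adjustment = 2.0
students["shift_corrected"] = students["shift_raw"] - participation_adjustment

print(students[["naplan_pct", "atar", "shift_corrected"]].describe())
print("share of students gaining:", (students["shift_corrected"] > 0).mean())
```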
Procedia PDF Downloads 68
926 Electromagnetic Wave Propagation Equations in 2D by Finite Difference Method
Authors: N. Fusun Oyman Serteller
Abstract:
In this paper, techniques to solve time-dependent electromagnetic wave propagation equations based on the Finite Difference Method (FDM) are proposed by comparing the results with the Finite Element Method (FEM) in 2D, while discussing some special simulation examples. Here, 2D dynamical wave equations for lossy media, even with a constant source, are discussed for establishing symbolic manipulation of wave propagation problems. The main objective of this contribution is to introduce a comparative study of two suitable numerical methods and to show that both methods can be applied effectively and efficiently to all types of wave propagation problems, both linear and nonlinear cases, by using symbolic computation. However, the results show that the FDM is more appropriate for solving the nonlinear cases in the symbolic solution. Furthermore, some specific complex-domain examples of the comparison of electromagnetic wave equations are considered. Calculations are performed through Mathematica software by making some useful contributions to the programme and leveraging symbolic evaluations of FEM and FDM.
Keywords: finite difference method, finite element method, linear-nonlinear PDEs, symbolic computation, wave propagation equations
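A minimal explicit finite-difference sketch of the kind of lossy 2D wave equation with a constant source discussed above. The grid, damping coefficient, and boundary treatment are illustrative assumptions, not the paper's Mathematica setup.

```python
import numpy as np

# Explicit FDM for u_tt + sigma*u_t = c^2*(u_xx + u_yy) + f with constant source f.
nx, ny, nt = 101, 101, 300
dx = dy = 1e-2
c, sigma, f = 1.0, 0.5, 1.0
dt = 0.4 * dx / c                      # within the 2D CFL stability limit

u_prev = np.zeros((nx, ny))
u = np.zeros((nx, ny))
for _ in range(nt):
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) - 2 * u) / dx**2 \
        + (np.roll(u, 1, 1) + np.roll(u, -1, 1) - 2 * u) / dy**2
    # Central-difference time stepping with a first-order loss term.
    u_next = (2 * u - (1 - sigma * dt / 2) * u_prev
              + dt**2 * (c**2 * lap + f)) / (1 + sigma * dt / 2)
    u_next[0, :] = u_next[-1, :] = u_next[:, 0] = u_next[:, -1] = 0.0  # Dirichlet walls
    u_prev, u = u, u_next

print("max field value after", nt, "steps:", u.max())
```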
Procedia PDF Downloads 147
925 Investigating Customer Engagement through the Prism of Congruity Theory
Authors: Jamid Ul Islam, Zillur Rahman
Abstract:
The impulse for customer engagement research in online brand communities (OBCs) is largely acknowledged in the literature. Applying congruity theory, this study proposes a model of customer engagement by examining how two congruities, viz. self-brand image congruity and value congruity, influence customers' engagement in online brand communities. The consequent effect of customer engagement on brand loyalty is also studied. This study collected data through a questionnaire survey of 395 students of a higher educational institute in India, who were active on Facebook and followed at least one brand community. The data were analyzed using structural equation modelling. The results revealed that both types of congruity, i.e., self-brand image congruity and value congruity, significantly affect customer engagement. A positive effect of customer engagement on brand loyalty was also affirmed by the results. This study integrates and broadens extant explanations of different congruity effects on consumer behavior, an area that has received little attention. This study is expected to add new trends to engage customers in online brand communities and offer realistic insights to the domain of social media marketing.
Keywords: congruity theory, customer engagement, Facebook, online brand communities
Procedia PDF Downloads 349
924 Connecting Students and Faculty Research Efforts through the Research and Projects Portal
Authors: Havish Nalapareddy, Mark V. Albert, Ranak Bansal, Avi Udash, Lin Lin
Abstract:
Students engage in many course projects during their degree programs. However, impactful projects often need a time frame longer than a single semester. Ideally, projects are documented and structured to be readily accessible to future students who may choose to continue the project, with features that emphasize the local community, university, or course structure. The Research and Project Portal (RAPP) is a place where students can post both their completed and ongoing projects with all the resources and tools used. This portal allows students to see what other students have done in the past, in the same university environment, related to their domain of interest. Computer science instructors or students selecting projects can use this portal to assign or choose an incomplete project. Additionally, this portal allows non-computer science faculty and industry collaborators to document their project ideas for students in courses to prototype directly, rather than directly soliciting the help of instructors in engaging students. RAPP serves as a platform linking students across classes and faculty both in and out of computer science courses on joint projects to encourage long-term project efforts across semesters or years.
Keywords: education, technology, research, academic portal
Procedia PDF Downloads 137
923 A Hybrid Digital Watermarking Scheme
Authors: Nazish Saleem Abbas, Muhammad Haris Jamil, Hamid Sharif
Abstract:
Digital watermarking is a technique that allows an individual to add and hide secret information, a copyright notice, or another verification message inside a digital audio, video, or image. Today, with the advancement of technology, modern healthcare systems manage patients' diagnostic information in a digital way in many countries. When transmitted between hospitals through the internet, the medical data becomes vulnerable to attacks and requires security and confidentiality. Digital watermarking techniques are used in order to ensure the authenticity, security and management of medical images and related information. This paper proposes a watermarking technique that embeds a watermark in medical images imperceptibly and securely. In this work, digital watermarking on medical images is carried out using the Least Significant Bit (LSB) with the Discrete Cosine Transform (DCT). The proposed embedding and extraction of the watermark are performed in the frequency domain using LSB substitution with an XOR operation. The quality of the watermarked medical image is measured by the peak signal-to-noise ratio (PSNR). It was observed that the watermarked medical image, obtained by performing the XOR operation between DCT and LSB, survived a compression attack with a PSNR of up to 38.98.
Keywords: watermarking, image processing, DCT, LSB, PSNR
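A hedged sketch of the general DCT + LSB-XOR idea outlined above, with PSNR measurement. The coefficient positions, quantisation step, and extraction rule are assumptions for illustration; the abstract does not specify the authors' exact embedding scheme.

```python
import numpy as np
from scipy.fft import dctn, idctn

def psnr(a, b, peak=255.0):
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return 10 * np.log10(peak**2 / mse)

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, (64, 64)).astype(float)      # stand-in medical image
watermark_bits = rng.integers(0, 2, 64)                    # 64 secret bits
key_bits = rng.integers(0, 2, 64)                          # shared XOR key

coeffs = dctn(cover, norm="ortho")
flat = coeffs.flatten()
positions = np.arange(100, 100 + 64)                       # assumed mid-frequency slots

# Embed: quantise each chosen coefficient and force its LSB to (bit XOR key).
q = 8.0
for p, b, k in zip(positions, watermark_bits, key_bits):
    level = int(round(flat[p] / q))
    level = (level & ~1) | (b ^ k)
    flat[p] = level * q

stego = idctn(flat.reshape(coeffs.shape), norm="ortho")
print("PSNR of watermarked image:", round(psnr(cover, stego), 2), "dB")

# Extract: recover the LSBs of the quantised coefficients and undo the XOR key.
flat2 = dctn(stego, norm="ortho").flatten()
recovered = np.array([(int(round(flat2[p] / q)) & 1) ^ k
                      for p, k in zip(positions, key_bits)])
print("bits recovered correctly:", int((recovered == watermark_bits).sum()), "/ 64")
```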
Procedia PDF Downloads 47
922 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets
Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe
Abstract:
Data are the primary asset of biomedical researchers, and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools to analyze such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g., supercomputers, GPU clusters, etc.), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain. Much attention has been given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for research subjects that reside in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research, leveraging existing high-performance computing resources and analysis techniques currently available or being developed. It builds these into The Ark, an open-source web-based system designed to manage medical data. SPARK provides a next-generation biomedical data management solution that is based upon a novel Micro-Service architecture and Big Data technologies. The system serves to demonstrate the applicability of Micro-Service architectures for the development of high-performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as insert (i.e., importing a GWAS dataset) and the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating non-relational NoSQL databases that have been driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets, and enabling cutting-edge analysis approaches that have previously been out of reach for many medical researchers.
Keywords: biomedical research, genomics, information systems, software
Procedia PDF Downloads 270
921 Sentiment Analysis of Ensemble-Based Classifiers for E-Mail Data
Authors: Muthukumarasamy Govindarajan
Abstract:
Detection of unwanted, unsolicited mails called spam from email is an interesting area of research. It is necessary to evaluate the performance of any new spam classifier using standard data sets. Recently, ensemble-based classifiers have gained popularity in this domain. In this research work, an efficient email filtering approach based on ensemble methods is addressed for developing an accurate and sensitive spam classifier. The proposed approach employs Naive Bayes (NB), Support Vector Machine (SVM) and Genetic Algorithm (GA) as base classifiers along with different ensemble methods. The experimental results show that the ensemble classifier performed with greater accuracy than the individual classifiers, and the hybrid model results were found to be better than those of the combined models for the e-mail dataset. The proposed ensemble-based classifiers turn out to be good in terms of classification accuracy, which is considered to be an important criterion for building a robust spam classifier.
Keywords: accuracy, arcing, bagging, genetic algorithm, Naive Bayes, sentiment mining, support vector machine
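A minimal scikit-learn sketch of combining the NB and SVM base classifiers named above with bagging and soft voting on a bag-of-words e-mail representation. The toy corpus is invented, and the GA-based hybridisation described in the abstract is not reproduced; only the ensemble part is illustrated.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import SVC
from sklearn.ensemble import BaggingClassifier, VotingClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Toy stand-in for an e-mail corpus (the study's dataset is not reproduced here).
emails = [
    "win a free prize now click here", "cheap meds limited offer buy now",
    "lowest price guaranteed claim your reward", "urgent claim your lottery winnings",
    "meeting agenda attached for tomorrow", "please review the project report",
    "lunch at noon with the team", "minutes from yesterday's call attached",
] * 10
labels = np.array(([1] * 4 + [0] * 4) * 10)  # 1 = spam, 0 = ham

nb = MultinomialNB()
svm = SVC(kernel="linear", probability=True)
ensemble = make_pipeline(
    TfidfVectorizer(),
    VotingClassifier(
        estimators=[
            ("bagged_nb", BaggingClassifier(estimator=nb, n_estimators=10, random_state=0)),
            ("bagged_svm", BaggingClassifier(estimator=svm, n_estimators=10, random_state=0)),
        ],
        voting="soft",
    ),
)

scores = cross_val_score(ensemble, emails, labels, cv=5)
print("cross-validated accuracy on the toy corpus:", scores.mean().round(3))
```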
Procedia PDF Downloads 142
920 R Data Science for Technology Management
Authors: Sunghae Jun
Abstract:
Technology management (TM) is an important issue for a company seeking to improve its competitiveness. Among the many activities of TM, technology analysis (TA) is an important factor, because most decisions for the management of technology are made based on the results of TA. TA analyzes the development results of a target technology using statistics or the Delphi method. TA based on Delphi depends on the experts' domain knowledge; in comparison, TA by statistics and machine learning algorithms uses objective data such as patents or papers instead of the experts' knowledge. Many quantitative TA methods based on statistics and machine learning have been studied, and these have been used for technology forecasting, technological innovation, and management of technology. They applied diverse computing tools and many analytical methods case by case. It is not easy to select the suitable software and statistical method for a given TA task. So, in this paper, we propose a methodology for quantitative TA using the statistical computing software R and data science to construct a general framework of TA. From the result of a case study, we also show how our methodology is applied in a real field. This research contributes to R&D planning and technology valuation in TM areas.
Keywords: technology management, R system, R data science, statistics, machine learning
Procedia PDF Downloads 458
919 System Identification and Controller Design for a DC Electrical Motor
Authors: Armel Asongu Nkembi, Ahmad Fawad
Abstract:
The aim of this paper is to determine in a concise way the transfer function that characterizes a DC electrical motor with a helix. In practice, it can be obtained by applying a particular input to the system and then, based on the observation of its output, determining an approximation to the transfer function of the system. In our case, we use a step input and find the transfer function parameters that give the simulated first-order time response. The simulation of the system is done using MATLAB/Simulink. In order to determine the parameters, we assume a first-order system and use the Broida approximation to determine the parameters and then its mean square error (MSE). Furthermore, we design a PID controller for the control process, first in the continuous time domain, and tune it using the Ziegler-Nichols open-loop method. We then digitize the controller to obtain a digital controller, since most systems are implemented using computers, which are digital in nature.
Keywords: transfer function, step input, MATLAB, Simulink, DC electrical motor, PID controller, open-loop process, mean square error, digital controller, Ziegler-Nichols
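A numerical sketch of the identification-and-tuning chain outlined above: simulate a step response, fit a first-order-plus-dead-time model with the Broida point method, and apply textbook open-loop Ziegler-Nichols PID rules. The motor parameters are invented, and the constants used are the common textbook forms, which may differ in detail from the authors' implementation.

```python
import numpy as np

# Invented "true" first-order-plus-dead-time response to a unit step: K/(tau*s+1) with delay L.
K_true, tau_true, L_true = 2.0, 0.8, 0.3
t = np.linspace(0, 6, 2000)
y = np.where(t > L_true, K_true * (1 - np.exp(-(t - L_true) / tau_true)), 0.0)

# Broida identification: read the times at 28% and 40% of the final value.
y_final = y[-1]
t1 = t[np.searchsorted(y, 0.28 * y_final)]
t2 = t[np.searchsorted(y, 0.40 * y_final)]
K_hat = y_final                      # unit step -> gain equals the final value
tau_hat = 5.5 * (t2 - t1)            # Broida time constant
delay_hat = 2.8 * t1 - 1.8 * t2      # Broida apparent dead time

# Mean square error between the identified model and the "measured" response.
y_model = K_hat * (1 - np.exp(-np.clip(t - delay_hat, 0, None) / tau_hat))
mse = np.mean((y - y_model) ** 2)

# Ziegler-Nichols open-loop PID tuning for the identified model.
Kp = 1.2 * tau_hat / (K_hat * delay_hat)
Ti = 2.0 * delay_hat
Td = 0.5 * delay_hat

print(f"identified: K={K_hat:.2f}, tau={tau_hat:.2f}, delay={delay_hat:.3f}, MSE={mse:.2e}")
print(f"ZN PID: Kp={Kp:.2f}, Ti={Ti:.2f}, Td={Td:.3f}")
```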
Procedia PDF Downloads 55
918 Quantum Entangled States and Image Processing
Authors: Sanjay Singh, Sushil Kumar, Rashmi Jain
Abstract:
Quantum computing is a new trend in computational theory, and a quantum mechanical system has several useful properties, such as entanglement. We aim to store information concerning the structure and content of a simple image in a quantum system. Consider an array of n qubits which we propose to use as our memory storage. In recent years, classical image processing has been extended to quantum image processing. Quantum image processing is an elegant approach to overcome the problems of its classical counterparts. Image storage, retrieval and processing on quantum machines is an emerging area. Although quantum machines do not yet exist in physical reality, theoretical algorithms developed based on quantum entangled states give new insights into processing classical images in the quantum domain. In the present work, we give a brief overview of how entangled states can be useful for quantum image storage and retrieval. We discuss the properties of the tripartite Greenberger-Horne-Zeilinger and W states and their usefulness for storing shapes that consist of three vertices. We also propose techniques to store shapes having more than three vertices.
Keywords: Greenberger-Horne-Zeilinger, image storage and retrieval, quantum entanglement, W states
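The tripartite GHZ and W states discussed above can be written out explicitly as 8-dimensional state vectors. A small numpy sketch shows their amplitudes and measurement probabilities; the mapping of the W-state basis states to triangle vertices at the end is purely an illustrative assumption, not the authors' encoding.

```python
import numpy as np

# Tripartite GHZ and W states as 8-dimensional vectors in the |q2 q1 q0> basis.
ghz = np.zeros(8); ghz[0b000] = ghz[0b111] = 1 / np.sqrt(2)        # (|000>+|111>)/sqrt(2)
w   = np.zeros(8); w[0b001] = w[0b010] = w[0b100] = 1 / np.sqrt(3)  # (|001>+|010>+|100>)/sqrt(3)

def probabilities(state):
    """Measurement probabilities of each computational basis state with nonzero amplitude."""
    return {format(i, "03b"): round(abs(a) ** 2, 3) for i, a in enumerate(state) if abs(a) > 0}

print("GHZ outcomes:", probabilities(ghz))
print("W outcomes:  ", probabilities(w))

# Illustrative (assumed) encoding: associate the three excited basis states of the
# W state with three stored vertices of a triangle.
vertices = {0b001: (0, 0), 0b010: (4, 0), 0b100: (2, 3)}
stored = [vertices[i] for i, a in enumerate(w) if abs(a) > 0]
print("vertices associated with the W-state support:", stored)
```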
Procedia PDF Downloads 306
917 Step Height Calibration Using Hamming Window: Band-Pass Filter
Authors: Dahi Ghareab Abdelsalam Ibrahim
Abstract:
Calibration of step heights with high accuracy is needed for many applications in industry. In general, a step height consists of three bands: a pass band, a transition band (roll-off), and a stop band. Abdelsalam used a convolution of the transfer functions of both Chebyshev type 2 and elliptic filters with the WFF of the Fresnel transform in the frequency domain to produce a steeper roll-off with the removal of ripples in the pass and stop bands. In this paper, we use a new method based on the Hamming window band-pass filter for calibration of step heights in terms of perfect adjustment of the pass band, roll-off, and stop band. The method is applied to calibrate a nominal step height of 40 cm. The step height is measured first by asynchronous dual-wavelength phase-shift interferometry. The measured step height is then calibrated by the simulation of the Hamming window band-pass filter. The spectrum of the simulated band-pass filter is simulated at N = 881 and f0 = 0.24. We can conclude that the proposed method can calibrate any step height by adjusting only two factors, which are N and f0.
Keywords: optical metrology, step heights, Hamming window, band-pass filter
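A minimal scipy sketch of a Hamming-windowed FIR band-pass filter using the two stated parameters, N = 881 taps and f0 = 0.24 (interpreted here as a normalised centre frequency). The band edges placed around f0 are an assumption, since the abstract does not give the bandwidth.

```python
import numpy as np
from scipy.signal import firwin, freqz

N = 881                 # number of taps (from the abstract)
f0 = 0.24               # centre frequency, assumed normalised with Nyquist = 0.5
bw = 0.04               # assumed half-bandwidth around f0
taps = firwin(N, [f0 - bw, f0 + bw], window="hamming", pass_zero=False, fs=1.0)

# Inspect the pass band, roll-off and stop band of the designed filter.
w, h = freqz(taps, worN=4096, fs=1.0)
mag = 20 * np.log10(np.maximum(np.abs(h), 1e-12))
print("gain at f0     : %.2f dB" % mag[np.argmin(np.abs(w - f0))])
print("gain at 0.5*f0 : %.2f dB" % mag[np.argmin(np.abs(w - 0.5 * f0))])
print("gain at 1.5*f0 : %.2f dB" % mag[np.argmin(np.abs(w - 1.5 * f0))])
```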
Procedia PDF Downloads 83
916 Numerical Study of Sloshing in a Flexible Tank
Authors: Wissem Tighidet, Faïçal Naït Bouda, Moussa Allouche
Abstract:
This work presents a numerical study of the fluid-structure interaction (FSI) in a partially filled flexible tank subjected to a horizontal harmonic excitation motion. It is investigated by using two-way fluid-structure interaction in a flexible tank, coupling the Transient Structural (Mechanical) and Fluid Flow (Fluent) solvers in the ANSYS Workbench Student version. The Arbitrary Lagrangian-Eulerian (ALE) formulation is adopted to solve, with the finite volume method, the Navier-Stokes equations in two phases in a moving domain. The Volume of Fluid (VOF) method is applied to track the free surface. The equations of the dynamics of the structure are solved with the finite element method, assuming a linear elastic behavior. To conclude, the fluid-structure interaction (FSI) has a vital role in the analysis of the dynamic behavior of the rectangular tank. The results indicate that the flexibility of the tank walls has a significant impact on the amplitude of tank sloshing and the deformation of the free surface, as well as on the effect of liquid sloshing on wall deformation.
Keywords: arbitrary Lagrangian-Eulerian, fluid-structure interaction, sloshing, volume of fluid
Procedia PDF Downloads 105
915 Knowledge Reactor: A Contextual Computing Work in Progress for Eldercare
Authors: Scott N. Gerard, Aliza Heching, Susann M. Keohane, Samuel S. Adams
Abstract:
The worldwide population of people over 60 years of age is growing rapidly. The explosion is placing increasingly onerous demands on individual families, multiple industries and entire countries. Current, human-intensive approaches to eldercare are not sustainable, but IoT and AI technologies can help. The Knowledge Reactor (KR) is a contextual, data fusion engine built to address this and other similar problems. It fuses and centralizes IoT and System of Record/Engagement data into a reactive knowledge graph. Cognitive applications and services are constructed with its multiagent architecture. The KR can scale up and scale down, because it exploits container-based, horizontally scalable services for graph store (JanusGraph) and pub-sub (Kafka) technologies. While the KR can be applied to many domains that require IoT and AI technologies, this paper describes how the KR specifically supports the challenging domain of cognitive eldercare. Rule- and machine learning-based analytics infer activities of daily living from IoT sensor readings. KR scalability, adaptability, flexibility and usability are demonstrated.
Keywords: ambient sensing, AI, artificial intelligence, eldercare, IoT, internet of things, knowledge graph
Procedia PDF Downloads 175
914 Estimating Occupancy in Residential Context Using Bayesian Networks for Energy Management
Authors: Manar Amayri, Hussain Kazimi, Quoc-Dung Ngo, Stephane Ploix
Abstract:
A general approach is proposed to determine occupant behaviour (occupancy and activity) in residential buildings and to use these estimates for improved energy management. Occupant behaviour is modelled with a Bayesian Network in an unsupervised manner. This algorithm makes use of domain knowledge gathered via questionnaires and recorded sensor data for motion detection, power, and hot water consumption as well as indoor CO₂ concentration. Two case studies are presented which show the real-world applicability of estimating occupant behaviour in this way. Furthermore, experiments integrating occupancy estimation and hot water production control show that energy efficiency can be increased by roughly 5% over known optimal control techniques and more than 25% over rule-based control while maintaining the same occupant comfort standards. The efficiency gains are strongly correlated with occupant behaviour and accuracy of the occupancy estimates.
Keywords: energy, management, control, optimization, Bayesian methods, learning theory, sensor networks, knowledge modelling and knowledge based systems, artificial intelligence, buildings
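A small discrete, Bayesian-network-style calculation in the spirit of the approach above: inferring occupancy from motion, power, and CO₂ observations by combining a prior with conditional probability tables. All probability values are invented for illustration; the study learns its model in an unsupervised way from questionnaires and sensor data.

```python
import numpy as np

p_occupied = 0.4                                   # assumed prior P(occupied)

# Conditional probability tables P(sensor reading | occupancy state), all invented.
p_motion = {True: 0.7, False: 0.05}                # P(motion detected | occ)
p_power  = {True: 0.8, False: 0.30}                # P(high power draw | occ)
p_co2    = {True: 0.6, False: 0.10}                # P(elevated CO2     | occ)

def posterior_occupied(motion, power, co2):
    """Fuse the three sensor observations assuming conditional independence."""
    def lik(occ):
        terms = [
            p_motion[occ] if motion else 1 - p_motion[occ],
            p_power[occ]  if power  else 1 - p_power[occ],
            p_co2[occ]    if co2    else 1 - p_co2[occ],
        ]
        return np.prod(terms)
    joint_occ = p_occupied * lik(True)
    joint_free = (1 - p_occupied) * lik(False)
    return joint_occ / (joint_occ + joint_free)

print("motion + high power + CO2 :", round(posterior_occupied(True, True, True), 3))
print("no motion, high power     :", round(posterior_occupied(False, True, False), 3))
print("all sensors quiet         :", round(posterior_occupied(False, False, False), 3))
```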
Procedia PDF Downloads 370
913 Modeling of Transformer Winding for Transients: Frequency-Dependent Proximity and Skin Analysis
Authors: Yazid Alkraimeen
Abstract:
Precise prediction of dielectric stresses and high voltages of power transformers requires the accurate calculation of frequency-dependent parameters. A lack of accuracy can result in severe damage to transformer windings. Transient conditions are studied using digital computers, which require the implementation of accurate models. This paper analyzes the computation of frequency-dependent skin and proximity losses included in the transformer winding model, using analytical equations and the Finite Element Method (FEM). A modified formula to calculate the proximity and skin losses is presented. The results of the frequency-dependent parameter calculations are verified using the Finite Element Method. The time-domain transient voltages are obtained using the Numerical Inverse Laplace Transform. The results show that the classical formula for proximity losses overestimates the transient voltages when compared with the results obtained from the modified method on a simple transformer geometry.
Keywords: fast front transients, proximity losses, transformer winding modeling, skin losses
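A short numerical illustration of the frequency dependence at the heart of such winding models: the classical skin depth and the resulting per-unit-length AC resistance of a round copper conductor. The conductor dimensions are illustrative, and the paper's modified proximity-loss formula is not reproduced here.

```python
import numpy as np

# Classical frequency-dependent skin effect for a round copper conductor.
mu0 = 4e-7 * np.pi          # permeability of free space, H/m
sigma_cu = 5.8e7            # copper conductivity, S/m
radius = 1.0e-3             # illustrative strand radius, m

freqs = np.logspace(2, 7, 6)                                   # 100 Hz ... 10 MHz
delta = np.sqrt(2.0 / (2 * np.pi * freqs * mu0 * sigma_cu))    # skin depth

# DC resistance per metre, and a high-frequency approximation in which the
# current flows in an annulus one skin depth thick (valid when delta << radius).
r_dc = 1.0 / (sigma_cu * np.pi * radius**2)
area_ac = np.pi * radius**2 - np.pi * np.maximum(radius - delta, 0.0)**2
r_ac = 1.0 / (sigma_cu * area_ac)

for f, d, r in zip(freqs, delta, r_ac):
    print(f"f = {f:9.0f} Hz   skin depth = {d*1e3:7.4f} mm   R_ac/R_dc = {r/r_dc:6.2f}")
```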
Procedia PDF Downloads 139
912 Collaborative and Context-Aware Learning Approach Using Mobile Technology
Authors: Sameh Baccari, Mahmoud Neji
Abstract:
In recent years, the rapid developments in mobile devices and wireless technologies have enabled new capabilities for the learning domain. This dimension facilitates people's daily activities and shortens the distances between individuals. When these technologies are used in learning, a new paradigm emerges, giving birth to mobile learning. Because of the mobility feature, m-learning courses have to be adapted dynamically to the learner's context. The main challenge in context-aware mobile learning is to develop an approach that builds the best learning resources according to dynamic learning situations. In this paper, we propose a context-aware mobile learning system called the Collaborative and Context-aware Mobile Learning System (CCMLS). It takes into account the requirements of mobility, collaboration and context-awareness. This system is based on the semantic modeling of the learning context and the learning content. The adaptation part of this approach is made up of adaptation rules to propose and select relevant resources, learning partners and learning activities based not only on the user's needs, but also on the user's current context.
Keywords: mobile learning, mobile technologies, context-awareness, collaboration, semantic web, adaptation engine, adaptation strategy, learning object, learning context
Procedia PDF Downloads 308
911 Product Features Extraction from Opinions According to Time
Authors: Kamal Amarouche, Houda Benbrahim, Ismail Kassou
Abstract:
Nowadays, e-commerce shopping websites have experienced noticeable growth. These websites have gained consumers' trust. After purchasing a product, many consumers share comments in which opinions about the given product are usually embedded. Research on the automatic management of opinions that gives suggestions to potential consumers and portrays an image of the product to manufacturers has been growing recently. Just after a product is launched in the market, the reviews generated around it usually contain only generic opinions rather than helpful information about the product (e.g. telephone: great phone...), in the sense that the product is still in its launching phase in the market. With time, the product matures. Consumers then perceive the advantages/disadvantages of each specific product feature and generate comments that contain their sentiments about these features. In this paper, we present an unsupervised method to extract the different product features hidden in the opinions which influence its purchase, one that combines Time Weighting (TW), which depends on the time opinions were expressed, with Term Frequency-Inverse Document Frequency (TF-IDF). We conduct several experiments using two different datasets about cell phones and hotels. The results show the effectiveness of our automatic feature extraction, as well as its domain-independent characteristic.
Keywords: opinion mining, product feature extraction, sentiment analysis, SentiWordNet
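A compact sketch combining TF-IDF term scores with a time-dependent weight on each review, in the spirit of the TW + TF-IDF combination described above. The decay form, the combination rule (per-review multiplication), and the toy reviews are assumptions; the authors' exact TW definition is not given in the abstract.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy reviews with their age in days; newer (launch-phase) reviews are generic,
# older ones mention specific features. The data is invented for illustration.
reviews = [
    "great phone",                                   # day 1
    "amazing phone love it",                         # day 3
    "battery life is excellent but camera is weak",  # day 120
    "screen resolution and battery impressed me",    # day 200
    "camera autofocus slow, battery still great",    # day 300
]
age_days = np.array([1, 3, 120, 200, 300])

# TF-IDF term scores per review.
vec = TfidfVectorizer(stop_words="english")
tfidf = vec.fit_transform(reviews).toarray()

# Assumed time weighting: give more weight to feature-rich, mature-phase reviews.
tw = 1.0 - np.exp(-age_days / 100.0)
scores = (tfidf * tw[:, None]).sum(axis=0)

ranked = sorted(zip(vec.get_feature_names_out(), scores), key=lambda x: -x[1])
print("candidate product features:", [term for term, s in ranked[:5]])
```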
Procedia PDF Downloads 410
910 Cognitive eTransformation Framework for Education Sector
Authors: A. Hol
Abstract:
The 21st century brought waves of business and industry eTransformations. The impact of change is also being seen in education. To identify the extent of this, a scenario analysis methodology was utilised with the aim of assessing business transformations across industry sectors ranging from craftsmanship, medicine, finance and manufacture to innovations and adoptions of new technologies and business models. Firstly, scenarios were drafted based on the current eTransformation models and their dimensions. Following this, the eTransformation framework was utilised with the aim of deriving the key eTransformation parameters, the essential characteristics that have enabled eTransformations across the sectors. The identified key parameters were then mapped to the transforming domain: education. The mapping assisted in deriving a cognitive eTransformation framework for the education sector. The framework highlights the importance of context and the notion that education today needs not only to deliver content to students but also to be able to meet the dynamically changing demands of specific student and industry groups. Furthermore, it pinpoints that for such processes to be supported, specific technology is required, so that instant, on-demand and periodic feedback as well as flexible, dynamically expanding study content can be sought and received via multiple education media.
Keywords: education sector, business transformation, eTransformation model, cognitive model, cognitive systems, eTransformation
Procedia PDF Downloads 136
909 Genetically Encoded Tool with Time-Resolved Fluorescence Readout for the Calcium Concentration Measurement
Authors: Tatiana R. Simonyan, Elena A. Protasova, Anastasia V. Mamontova, Eugene G. Maksimov, Konstantin A. Lukyanov, Alexey M. Bogdanov
Abstract:
Here, we describe two variants of calcium indicators based on the GCaMP sensitive core and the BrUSLEE fluorescent protein (GCaMP-BrUSLEE and GCaMP-BrUSLEE-145). In contrast to the conventional GCaMP6-family indicators, these fluorophores are characterized by the well-marked responsiveness of their fluorescence decay kinetics to external calcium concentration both in vitro and in cellulo. Specifically, we show that the purified GCaMP-BrUSLEE and GCaMP-BrUSLEE-145 exhibit three-component fluorescence decay kinetics, with the amplitude-normalized lifetime component (t3*A3) of GCaMP-BrUSLEE-145 changing four-fold (500-2000 a.u.) in response to a Ca²⁺ concentration shift in the range of 0-350 nM. Time-resolved fluorescence microscopy of live cells displays a two-fold change of the GCaMP-BrUSLEE-145 mean lifetime upon histamine-stimulated calcium release. The aforementioned Ca²⁺ dependence calls for considering GCaMP-BrUSLEE-145 as a prospective Ca²⁺ indicator with signal read-out in the time domain.
Keywords: calcium imaging, fluorescence lifetime imaging microscopy, fluorescent proteins, genetically encoded indicators
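A brief scipy sketch of the kind of read-out described above: fitting a three-component fluorescence decay by least squares and reporting the amplitude-normalised third lifetime component t3*A3. The synthetic decay parameters are invented and only illustrate the fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def tri_exp(t, a1, t1, a2, t2, a3, t3):
    """Three-component fluorescence decay model."""
    return a1 * np.exp(-t / t1) + a2 * np.exp(-t / t2) + a3 * np.exp(-t / t3)

# Synthetic decay in arbitrary units; the "true" parameters below are invented.
t = np.linspace(0.05, 25, 400)            # ns
true = (0.5, 0.4, 0.3, 1.8, 0.2, 6.0)     # a1, t1, a2, t2, a3, t3
rng = np.random.default_rng(2)
signal = tri_exp(t, *true) + rng.normal(0, 0.003, t.size)

# Fit with positivity bounds so amplitudes and lifetimes stay physical.
p0 = (0.4, 0.5, 0.3, 2.0, 0.2, 5.0)
popt, _ = curve_fit(tri_exp, t, signal, p0=p0, bounds=(0, np.inf))
a3_fit, t3_fit = popt[4], popt[5]

# Amplitude-normalised third lifetime component, analogous to t3*A3.
print(f"fitted t3 = {t3_fit:.2f} ns, A3 = {a3_fit:.3f}, t3*A3 = {t3_fit * a3_fit:.3f}")
```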
Procedia PDF Downloads 158
908 Interior Design Pedagogy in the 21st Century: Personalised Design Process
Authors: Roba Zakariah Shaheen
Abstract:
In the 21st century, interior design pedagogy has developed rapidly due to social and economic factors. Socially, this paper presents research findings that show a significant relationship between educators and students in interior design education. It shows that students' personal traits, design process, and thinking process are significantly interrelated. Constructively, this paper presents how personal traits can guide educators in the interior design education domain to develop students' thinking processes. At the same time, it demonstrates how students should use their own personal traits to create their own design process. Constructivism was the theory underpinning this research, as it supports grounded theory, which is the methodological approach of this research. Moreover, the Myers-Briggs Type Indicator strategy was used to investigate personality traits scientifically, as a psychological strategy related to cognitive ability. Conclusions from this research strongly recommend that educators and students should utilize their personal traits to foster interior design education.
Keywords: interior design, pedagogy, constructivism, grounded theory, personality traits, creativity
Procedia PDF Downloads 207
907 Electromyography Pattern Classification with Laplacian Eigenmaps in Human Running
Authors: Elnaz Lashgari, Emel Demircan
Abstract:
Electromyography (EMG) is one of the most important interfaces between humans and robots for rehabilitation. Decoding this signal helps to recognize muscle activation and convert it into smooth motion for the robots. Detecting each muscle's pattern during walking and running is vital for improving the quality of a patient's life. In this study, EMG data from 10 muscles in 10 subjects at 4 different speeds were analyzed. EMG signals are nonlinear with high dimensionality. To deal with this challenge, we extracted some features in the time-frequency domain and used manifold learning and the Laplacian Eigenmaps algorithm to find the intrinsic features that represent the data in low-dimensional space. We then used the Bayesian classifier to identify various patterns of EMG signals for different muscles across a range of running speeds. The best result, for the vastus medialis muscle, corresponds to 97.87±0.69 for sensitivity and 88.37±0.79 for specificity, with 97.07±0.29 accuracy, using the Bayesian classifier. The results of this study provide important insight into human movement and its application for robotics research.
Keywords: electromyography, manifold learning, ISOMAP, Laplacian Eigenmaps, locally linear embedding
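A minimal scikit-learn sketch of the processing chain named above: nonlinear dimensionality reduction with Laplacian Eigenmaps (SpectralEmbedding) followed by a Gaussian naive Bayes classifier, run on synthetic "EMG feature" data because the real recordings are not available here.

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

# Synthetic stand-in for time-frequency EMG features: two muscle-activation
# patterns in a 30-dimensional feature space.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 1.0, (150, 30)),
               rng.normal(1.5, 1.0, (150, 30))])
y = np.array([0] * 150 + [1] * 150)

# Laplacian Eigenmaps (SpectralEmbedding) finds a low-dimensional manifold.
embedding = SpectralEmbedding(n_components=3, n_neighbors=10, random_state=0)
X_low = embedding.fit_transform(X)

# Note: SpectralEmbedding has no transform() for unseen samples, so for this
# sketch the train/test split is done after embedding; the paper's exact
# pipeline may differ.
X_tr, X_te, y_tr, y_te = train_test_split(X_low, y, test_size=0.3, random_state=0)
clf = GaussianNB().fit(X_tr, y_tr)
print("held-out accuracy on synthetic data:", round(clf.score(X_te, y_te), 3))
```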
Procedia PDF Downloads 361
906 Complex Cooling Approach in Microchannel Heat Exchangers Using Solid and Hollow Fins
Authors: Nahum Yustus Godi
Abstract:
A three-dimensional numerical optimisation of combined microchannels with constructal solid, half-hollow, and hollow circular fins is documented in this paper. The technique seeks to minimize the peak temperature in the entire volume of the microchannel heat sink. The volume and axial length were both fixed, while the width of the microchannel could morph. A high-density heat flux was applied at the bottom wall of the microchannel. The coolant employed to remove the heat deposited at the bottom surface of the microchannel was a single-phase fluid (water) in a forced-convection laminar condition, and heat transfer was a conjugate problem. The unit-cell symmetrical computation domain was discretised, and the governing equations were solved using a computational fluid dynamics (CFD) code. The results reveal that the combined microchannel with hollow circular fins and solid fins performed better at different Reynolds numbers. The numerical study was validated for the single microchannel without fins and found to be in good agreement with previous studies.
Keywords: constructal fins, complex heat exchangers, cooling technique, numerical optimisation
Procedia PDF Downloads 225
905 Electrical Equivalent Analysis of Micro Cantilever Beams for Sensing Applications
Authors: B. G. Sheeparamatti, J. S. Kadadevarmath
Abstract:
Microcantilevers are basic MEMS devices which can be used as sensors and actuators, and electronics can be easily built into them. The detection principle of microcantilever sensors is based on the measurement of a change in cantilever deflection or a change in its resonance frequency. The objective of this work is to explore the analogies between the mechanical and electrical equivalents of microcantilever beams. Normally, scientists and engineers working in MEMS use expensive software like CoventorWare, IntelliSuite, ANSYS/Multiphysics, etc. This paper indicates the need for developing the electrical equivalent of the MEMS structure, with which one can have better insight into important parameters and their interrelations within the MEMS structure. In this work, considering the mechanical model of the microcantilever, the equivalent electrical circuit is drawn and, using a force-voltage analogy, it is analyzed with circuit simulation software. By doing so, one can gain access to a powerful set of intellectual tools that have been developed for understanding electrical circuits. Later, the analysis is performed using ANSYS/Multiphysics, software based on the finite element method (FEM). It is observed that both mechanical and electrical domain results for a rectangular microcantilever are in agreement with each other.
Keywords: electrical equivalent circuit analogy, FEM analysis, micro cantilevers, micro sensors
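The force-voltage analogy mentioned above maps the cantilever's lumped mass-spring-damper model onto a series RLC circuit (m→L, b→R, 1/k→C, F→V). A short numpy sketch, with illustrative parameter values, checks that both descriptions give the same resonance frequency and quality factor.

```python
import numpy as np

# Lumped mechanical model of a microcantilever (illustrative values).
m = 1.0e-11      # effective mass, kg
k = 0.5          # stiffness, N/m
b = 2.0e-8       # damping, N*s/m

# Force-voltage analogy: F <-> V, velocity <-> current, m <-> L, b <-> R, 1/k <-> C.
L = m
R = b
C = 1.0 / k

f_mech = np.sqrt(k / m) / (2 * np.pi)          # mechanical resonance, Hz
f_elec = 1.0 / (2 * np.pi * np.sqrt(L * C))    # series RLC resonance, Hz
Q_mech = np.sqrt(k * m) / b
Q_elec = np.sqrt(L / C) / R

print(f"resonance: mechanical {f_mech/1e3:.1f} kHz vs electrical {f_elec/1e3:.1f} kHz")
print(f"quality factor: mechanical {Q_mech:.1f} vs electrical {Q_elec:.1f}")
```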
Procedia PDF Downloads 397
904 1D Convolutional Networks to Compute Mel-Spectrogram, Chromagram, and Cochleogram for Audio Networks
Authors: Elias Nemer, Greg Vines
Abstract:
Time-frequency transformations and spectral representations of audio signals are commonly used in various machine learning applications. Training networks on frequency features such as the Mel-Spectrogram or Cochleogram has been proven more effective and convenient than training on time samples. In practical realizations, these features are created on a different processor and/or pre-computed and stored on disk, requiring additional effort and making it difficult to experiment with different features. In this paper, we provide a PyTorch framework for creating various spectral features as well as time-frequency transformations and time-domain filter banks using the built-in trainable conv1d() layer. This allows computing these features on the fly as part of a larger network and enables easier experimentation with various combinations and parameters. Our work extends the work in the literature developed to that end: first, by adding more of these features, and also by allowing the possibility of either starting from initialized kernels or training them from random values. The code is written as a template of classes and scripts that users may integrate into their own PyTorch classes or simply use as is and add more layers for various applications.
Keywords: neural networks, Mel-Spectrogram, chromagram, cochleogram, discrete Fourier transform, PyTorch conv1d()
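A minimal PyTorch sketch of the core idea: initialise a trainable conv1d() layer with windowed sine/cosine (DFT-like) kernels so the spectral transform runs on the fly inside the network and can be frozen or fine-tuned. The FFT size, hop, and window are placeholders; this is not the authors' published code.

```python
import math
import torch
import torch.nn as nn

class SpectrogramConv1d(nn.Module):
    """Magnitude spectrogram computed by a conv1d layer initialised with DFT kernels."""

    def __init__(self, n_fft=256, hop=128):
        super().__init__()
        self.conv = nn.Conv1d(1, 2 * (n_fft // 2 + 1), kernel_size=n_fft,
                              stride=hop, bias=False)
        window = torch.hann_window(n_fft)
        n = torch.arange(n_fft).float()
        kernels = []
        for k in range(n_fft // 2 + 1):
            kernels.append(window * torch.cos(2 * math.pi * k * n / n_fft))   # real part
            kernels.append(window * -torch.sin(2 * math.pi * k * n / n_fft))  # imaginary part
        with torch.no_grad():
            self.conv.weight.copy_(torch.stack(kernels).unsqueeze(1))
        # The weights stay trainable, so they can be kept fixed or fine-tuned.

    def forward(self, x):                      # x: (batch, samples)
        y = self.conv(x.unsqueeze(1))          # (batch, 2*(n_fft//2+1), frames)
        real, imag = y[:, 0::2, :], y[:, 1::2, :]
        return torch.sqrt(real ** 2 + imag ** 2 + 1e-12)

# Usage on a one-second 16 kHz test tone.
sig = torch.sin(2 * math.pi * 440 * torch.arange(16000) / 16000).unsqueeze(0)
spec = SpectrogramConv1d()(sig)
print("spectrogram shape (batch, bins, frames):", tuple(spec.shape))
```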
Procedia PDF Downloads 233
903 Voltage Problem Location Classification Using Performance of Least Squares Support Vector Machine LS-SVM and Learning Vector Quantization LVQ
Authors: M. Khaled Abduesslam, Mohammed Ali, Basher H. Alsdai, Muhammad Nizam Inayati
Abstract:
This paper presents voltage problem location classification using the performance of the Least Squares Support Vector Machine (LS-SVM) and Learning Vector Quantization (LVQ) in an electrical power system, for proper voltage problem location, implemented on the IEEE 39-bus New England system. The data were collected from time-domain simulation using the Power System Analysis Toolbox (PSAT). Outputs from the simulation data, such as voltage, phase angle, real power and reactive power, were taken as inputs to estimate voltage stability at particular buses based on the Power Transfer Stability Index (PTSI). The simulations were carried out on the IEEE 39-bus test system by considering increased load on the system's buses. To verify the proposed LS-SVM, its performance was compared to Learning Vector Quantization (LVQ). The results showed that LS-SVM is faster and better compared to LVQ. The results also demonstrated that the LS-SVM achieved 0% misclassification, whereas LVQ had 7.69% misclassification.
Keywords: IEEE 39 bus, least squares support vector machine, learning vector quantization, voltage collapse
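For readers unfamiliar with LS-SVM, a compact numpy sketch of the standard Suykens formulation, which replaces the usual QP with a single linear system. Synthetic two-class data stands in for the PTSI-labelled bus measurements; gamma, the RBF width, and the data are assumptions.

```python
import numpy as np

# LS-SVM classifier: solve  [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
# with Omega_kl = y_k * y_l * K(x_k, x_l).
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(-1, 0.6, (60, 4)), rng.normal(1, 0.6, (60, 4))])
y = np.hstack([-np.ones(60), np.ones(60)])

def rbf(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

gamma, sigma = 10.0, 1.0                     # assumed hyperparameters
n = len(y)
Omega = np.outer(y, y) * rbf(X, X, sigma)
A = np.zeros((n + 1, n + 1))
A[0, 1:], A[1:, 0] = y, y
A[1:, 1:] = Omega + np.eye(n) / gamma
rhs = np.hstack([0.0, np.ones(n)])
sol = np.linalg.solve(A, rhs)
b, alpha = sol[0], sol[1:]

def predict(Xnew):
    # Decision function: sign(sum_k alpha_k * y_k * K(x, x_k) + b).
    return np.sign(rbf(Xnew, X, sigma) @ (alpha * y) + b)

print("training accuracy:", (predict(X) == y).mean())
```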
Procedia PDF Downloads 441
902 A Model to Assist Military Mission Planners in Identifying and Assessing Variables Impacting Food Security
Authors: Lynndee Kemmet
Abstract:
The U.S. military plays an increasing role in supporting political stability efforts, and this includes efforts to prevent the food insecurity that can trigger political and social instability. This paper presents a model that assists military commanders in identifying variables that impact food production and distribution in their areas of operation (AO), in identifying connections between variables and in assessing the impacts of those variables on food production and distribution. Through use of the model, military units can better target their data collection efforts and can categorize and analyze data within the data categorization framework most widely used by military forces, PMESII-PT (Political, Military, Economic, Infrastructure, Information, Physical Environment and Time). The model provides flexibility of analysis in that commanders can target analysis to be highly focused on a specific PMESII-PT domain or variable or conduct analysis across multiple PMESII-PT domains. The model is also designed to assist commanders in mapping food systems in their AOs and then identifying components of those systems that must be strengthened or protected.
Keywords: food security, food system model, political stability, US Military
Procedia PDF Downloads 195
901 Reusing Assessment Tests by Generating Arborescent Test Groups Using a Genetic Algorithm
Authors: Ovidiu Domşa, Nicolae Bold
Abstract:
Using Information and Communication Technologies (ICT) notions in education and the three basic processes of education (teaching, learning and assessment) can bring benefits to pupils and to the professional development of teachers. In this matter, we refer to these notions as concepts taken from the informatics area and apply them to the domain of education. These notions refer to genetic algorithms and arborescent structures, used in the specific process of assessment or evaluation. This paper uses these kinds of notions to generate subtrees from a main tree of tests related to one another by their degree of difficulty. These subtrees, which are subtrees of the main tree, must contain the highest number of connections between the nodes and the lowest number of missing edges and, in the particular case of the non-existence of a subtree with no missing edges, the subtrees which have the lowest (minimal) number of missing edges between the nodes, where a node is a test and an edge is a direct connection between two tests which differ by one degree of difficulty. The subtrees are represented as sequences. The tests are the same (a number coding a test represents that test in every sequence) and they are reused for each sequence of tests.
Keywords: chromosome, genetic algorithm, subtree, test
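A heavily simplified genetic-algorithm sketch in the spirit of this approach: chromosomes are sequences of tests, and the fitness counts "missing edges", i.e. consecutive tests whose difficulty levels do not differ by exactly one degree. The test pool, operators and parameters are assumptions, since the abstract does not give them.

```python
import random

random.seed(0)
difficulty = {t: t % 5 + 1 for t in range(20)}     # 20 tests with difficulty 1..5 (invented)
SEQ_LEN, POP, GENS = 8, 60, 200

def missing_edges(seq):
    """Count consecutive pairs whose difficulties do not differ by one degree."""
    return sum(1 for a, b in zip(seq, seq[1:])
               if abs(difficulty[a] - difficulty[b]) != 1)

def random_seq():
    return random.sample(list(difficulty), SEQ_LEN)

def crossover(p1, p2):
    cut = random.randrange(1, SEQ_LEN)
    child = p1[:cut] + [t for t in p2 if t not in p1[:cut]]
    return child[:SEQ_LEN]

def mutate(seq):
    i, j = random.sample(range(SEQ_LEN), 2)
    seq[i], seq[j] = seq[j], seq[i]
    return seq

population = [random_seq() for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=missing_edges)
    parents = population[: POP // 2]                   # elitist selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = min(population, key=missing_edges)
print("best test sequence:", best, "missing edges:", missing_edges(best))
```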
Procedia PDF Downloads 324
900 Digital Image Steganography with Multilayer Security
Authors: Amar Partap Singh Pharwaha, Balkrishan Jindal
Abstract:
In this paper, a new method is developed for hiding an image in a digital image with multilayer security. In the proposed method, the secret image is first encrypted using a flexible matrix-based symmetric key to add the first layer of security. Then, another layer of security is added to the secret data by encrypting the ciphered data using the Pythagorean theorem method. The ciphered data bits (4 bits) produced after double encryption are then embedded within the digital image in the spatial domain using Least Significant Bit (LSB) substitution. To improve the image quality of the stego-image, an improved form of the pixel adjustment process is proposed. To evaluate the effectiveness of the proposed method, image quality metrics including Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), entropy, correlation, mean value and the Universal Image Quality Index (UIQI) are measured. It has been found experimentally that the proposed method provides higher security as well as robustness. In fact, the results of this study are quite promising.
Keywords: Pythagorean theorem, pixel adjustment, ciphered data, image hiding, least significant bit, flexible matrix
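Since the stego-image quality is assessed with PSNR, MSE, and related measures, here is a small numpy sketch of those metrics applied to a cover/stego pair after plain LSB substitution. The images and payload are synthetic, and the flexible-matrix and Pythagorean encryption layers are not reproduced.

```python
import numpy as np

def mse(a, b):
    return np.mean((a.astype(float) - b.astype(float)) ** 2)

def psnr(a, b, peak=255.0):
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * np.log10(peak**2 / m)

def entropy(img):
    hist = np.bincount(img.ravel(), minlength=256) / img.size
    hist = hist[hist > 0]
    return -np.sum(hist * np.log2(hist))

# Synthetic cover image and a random (already-encrypted, in the paper's scheme)
# payload embedded by simple LSB substitution.
rng = np.random.default_rng(5)
cover = rng.integers(0, 256, (128, 128), dtype=np.uint8)
payload_bits = rng.integers(0, 2, cover.size, dtype=np.uint8)
stego = (cover & 0xFE) | payload_bits.reshape(cover.shape)

print(f"MSE                 : {mse(cover, stego):.3f}")
print(f"PSNR                : {psnr(cover, stego):.2f} dB")
print(f"entropy cover/stego : {entropy(cover):.3f} / {entropy(stego):.3f} bits")
print(f"correlation         : {np.corrcoef(cover.ravel(), stego.ravel())[0, 1]:.5f}")
```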
Procedia PDF Downloads 337
899 A Predictive MOC Solver for Water Hammer Waves Distribution in Network
Authors: A. Bayle, F. Plouraboué
Abstract:
Water distribution networks (WDN) still suffer from a lack of knowledge about the prediction of fast pressure transient events, although the latter may considerably impact their durability. Accidental or planned operating activities indeed give rise to complex pressure interactions and may drastically modify the local pressure value, generating leaks and, in rare cases, pipe breaks. In this context, a numerical predictive analysis is conducted to prevent such events and optimize network management. A home-made Python/FORTRAN 90 software couple has been developed, using the Method of Characteristics (MOC) to solve the water-hammer equations. The solver is validated by direct comparison with theoretical results and experimental measurements in simple configurations and is afterward extended to network analysis. The algorithm's most costly steps are designed for parallel computation. A varied set of boundary conditions and energy loss models is considered for the network simulations. The results are analyzed in both the real and frequency domains and provide crucial information on the pressure distribution behavior within the network.
Keywords: energy loss models, method of characteristics, numerical predictive analysis, water distribution network, water hammer
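A bare-bones MOC interior-node update for a single pipe, showing the C+/C- characteristic equations this family of solvers is built on. The pipe data, friction model, and boundary conditions (upstream reservoir, instantaneously closed downstream valve) are illustrative choices, not the authors' configuration.

```python
import numpy as np

# Minimal MOC water-hammer sketch: reservoir upstream, valve closed instantly downstream.
g, a = 9.81, 1000.0            # gravity (m/s2), wave speed (m/s)
L, D, f = 500.0, 0.3, 0.02     # pipe length (m), diameter (m), Darcy friction factor
A = np.pi * D**2 / 4
nx = 21
dx = L / (nx - 1)
dt = dx / a                    # Courant number = 1
B = a / (g * A)
R = f * dx / (2 * g * D * A**2)

H0, Q0 = 50.0, 0.1             # steady head (m) and flow (m3/s)
H = np.full(nx, H0) - np.arange(nx) * R * Q0 * abs(Q0)   # steady hydraulic grade line
Q = np.full(nx, Q0)

for step in range(200):
    Hn, Qn = H.copy(), Q.copy()
    # Interior nodes: intersect the C+ and C- characteristics.
    Cp = H[:-2] + B * Q[:-2] - R * Q[:-2] * np.abs(Q[:-2])
    Cm = H[2:]  - B * Q[2:]  + R * Q[2:]  * np.abs(Q[2:])
    Hn[1:-1] = 0.5 * (Cp + Cm)
    Qn[1:-1] = (Cp - Cm) / (2 * B)
    # Upstream reservoir (fixed head) uses the C- characteristic from node 1.
    Hn[0] = H0
    Qn[0] = (Hn[0] - (H[1] - B * Q[1] + R * Q[1] * abs(Q[1]))) / B
    # Downstream closed valve (Q = 0) uses the C+ characteristic from node nx-2.
    Qn[-1] = 0.0
    Hn[-1] = H[-2] + B * Q[-2] - R * Q[-2] * abs(Q[-2])
    H, Q = Hn, Qn

print(f"head at the valve after {200*dt:.2f} s: {H[-1]:.1f} m (steady value was {H0:.1f} m)")
```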
Procedia PDF Downloads 232
898 Numerical Analysis of Laminar Flow around Square Cylinders with EHD Phenomenon
Authors: M. Salmanpour, O. Nourani Zonouz
Abstract:
In this research, a numerical simulation of an electrohydrodynamic (EHD) actuator's effects on the flow around a square cylinder has been carried out using a finite volume method. This is one of the newest ways of controlling fluid flows. Two plate electrodes are flush-mounted on the surface of the cylinder, and one wire electrode is placed on the line with zero angle of attack relative to the stagnation point and excited with a DC power supply. The discharge produces an electric force and changes the local momentum behavior in the fluid layers. For this purpose, after selecting a proper domain and boundary conditions, the electric field related to the problem has been analyzed, and then the results, in the form of an electrical body force, have been entered into the governing equations of the fluid field (the Navier-Stokes equations). The effect of the ionic wind resulting from the electrohydrodynamic actuator on the velocity, pressure and the wake behind the cylinder has been considered. According to the results, it is observed that the fluid flow accelerates near the wall of the frontal half of the cylinder, and the pressure difference between the front and rear of the cylinder is increased.
Keywords: CFD, corona discharge, electrohydrodynamics, flow around square cylinders, simulation
Procedia PDF Downloads 471