Multimodal Deep Learning for Human Activity Recognition
Authors: Ons Slimene, Aroua Taamallah, Maha Khemaja
Abstract:
In recent years, human activity recognition (HAR) has been a key area of research due to its diverse applications, and it has garnered increasing attention in the field of computer vision. HAR plays an important role in people’s daily lives as it has the ability to learn advanced knowledge about human activities from data. In HAR, activities are usually represented by exploiting different types of sensors, such as embedded sensors or visual sensors. However, these sensors have limitations, such as local obstacles, image-related obstacles, sensor unreliability, and consumer concerns. Recently, several deep learning-based approaches have been proposed for HAR; these approaches are classified into two categories based on the type of data used: vision-based approaches and sensor-based approaches. This research paper highlights the importance of multimodal fusion of skeleton data obtained from videos and data generated by embedded sensors, using deep neural networks, for achieving HAR. We propose a deep multimodal fusion network based on a two-stream architecture. The two streams use a Convolutional Neural Network combined with a Bidirectional LSTM (CNN-BiLSTM) to process the skeleton data and the data generated by embedded sensors, and fusion at the feature level is considered. The proposed model was evaluated on the public OPPORTUNITY++ dataset and produced an accuracy of 96.77%.
Keywords: human activity recognition, action recognition, sensors, vision, human-centric sensing, deep learning, context-awareness
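The feature-level fusion step described above can be sketched as follows. The real model uses CNN-BiLSTM streams; here each "stream" is a stand-in feature extractor (simple statistics, purely illustrative) so that the fusion step itself is runnable.

```python
# Minimal sketch of feature-level fusion in a two-stream HAR model.
# The stream functions below are hypothetical stand-ins for the paper's
# CNN-BiLSTM streams; only the concatenation-based fusion is the point.

def skeleton_stream(frames):
    """Stand-in for the stream over skeleton sequences:
    returns a fixed-length feature vector (here, simple statistics)."""
    flat = [v for frame in frames for v in frame]
    return [min(flat), max(flat), sum(flat) / len(flat)]

def sensor_stream(readings):
    """Stand-in for the stream over embedded-sensor sequences."""
    return [sum(readings) / len(readings), max(readings) - min(readings)]

def fuse(skeleton_features, sensor_features):
    """Feature-level fusion: concatenate the two streams' vectors
    before the final classification layer."""
    return skeleton_features + sensor_features

skel = skeleton_stream([[0.1, 0.2], [0.3, 0.4]])
sens = sensor_stream([1.0, 2.0, 3.0])
fused = fuse(skel, sens)
print(len(fused))  # 5 = 3 skeleton features + 2 sensor features
```

In the actual network the fused vector would feed a classification head; here it simply demonstrates that fusion happens on features, not on raw modalities.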
Research on Urban Thermal Environment Climate Map Based on GIS: Taking Shapingba District, Chongqing as an Example
Authors: Zhao Haoyue
Abstract:
Due to the combined effects of climate change, urban expansion, and population growth, various environmental issues arise, such as urban heat islands and pollution. Therefore, reliable information on the urban environmental climate is needed to address and mitigate these negative effects. The emergence of urban climate maps provides a practical basis for urban climate regulation and improvement. This article takes Shapingba District, Chongqing City, as an example to study the construction method of urban thermal environment climate maps based on GIS spatial analysis technology. A thermal load map, a ventilation potential analysis map, and a comprehensive thermal environment analysis map were obtained. Based on the classification criteria derived from the climate map, corresponding protection and planning mitigation measures are proposed.
Keywords: urban climate, GIS, heat island analysis, urban thermal environment
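The overlay logic behind such a comprehensive analysis map can be sketched as a cell-by-cell combination of a thermal-load layer and a ventilation-potential layer, followed by classification. The grids, weights, and class thresholds below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of GIS raster overlay for a thermal environment climate
# map: combine thermal load and ventilation potential per cell, then
# assign each cell a climate-map class. All numbers are hypothetical.

def combine(thermal_load, ventilation, w_heat=1.0, w_vent=1.0):
    """Comprehensive score per cell: heat load minus ventilation relief."""
    return [
        [w_heat * t - w_vent * v for t, v in zip(trow, vrow)]
        for trow, vrow in zip(thermal_load, ventilation)
    ]

def classify(score):
    """Map a score to a climate-map class (thresholds are assumptions)."""
    if score >= 2:
        return "mitigation priority"
    if score >= 0:
        return "moderate load"
    return "cooling resource"

thermal = [[3, 1], [2, 0]]   # toy thermal-load raster
vent = [[1, 2], [0, 1]]      # toy ventilation-potential raster
scores = combine(thermal, vent)
classes = [[classify(s) for s in row] for row in scores]
print(classes[0][0])  # cell with load 3, ventilation 1 -> score 2
```

In a real GIS workflow the same operation would run over large rasters with calibrated weights; the structure of the overlay is the same.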
Expression-Based Learning as a Starting Point to Promote Students’ Creativity in K-12 Schools in China
Authors: Yanyue Yuan
Abstract:
In this paper, the author shares the findings of a pilot study that examines students’ creative expressions and their perceptions of creativity when engaged in project-based learning. The study is based on an elective course that the author co-designed and co-taught with a colleague to sixteen grade six and seven students over the spring semester in 2019. Using the Little Prince story as the main prompt, they facilitated students’ original creation of a storytelling concert that integrated script writing, music production, lyrics, songs, and visual design as a result of both individual and collaborative work. The author shares the specific challenges met during the project, including the learning cultures of the school, class management, teachers’ and parents’ attitudes, a process-oriented versus product-oriented mindset, and facilities and logistical resources. The findings of this pilot study will inform an ongoing research initiative exploring how creative learning can be fostered in public schools in the Chinese context. While K-12 schools in China’s public education system are still dominated by exam-oriented and teacher-centered approaches, the author proposes that expression-based learning can be a starting point for promoting students’ creativity and can serve as an experimental effort to initiate incremental changes within the current education framework. The paper also touches upon insights gained from collaborations between universities and K-12 schools.
Keywords: creativity, expression-based learning, K-12, incremental changes
Decision Trees Constructing Based on K-Means Clustering Algorithm
Authors: Loai Abdallah, Malik Yousef
Abstract:
A domain space for the data should reflect the actual similarity between objects, since objects belonging to the same cluster usually share common traits even though their geometric distance might be relatively large. In general, the Euclidean distance between data points represented by a large number of features does not capture the actual relation between those points. In this study, we propose a new method to construct a different space, based on clustering, to form a new distance metric. The new distance space is based on ensemble clustering (EC). The EC distance space is defined by tracking the membership of the points over multiple runs of a clustering algorithm. Over this distance, we train a decision tree classifier (DT-EC). The results obtained by applying DT-EC to 10 datasets confirm our hypothesis that embedding the EC space as a distance metric improves performance.
Keywords: ensemble clustering, decision trees, classification, K nearest neighbors
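The ensemble-clustering distance described above can be sketched directly: run a clustering algorithm several times, record each point's cluster membership per run, and define the distance between two points as the fraction of runs in which they fall in different clusters. The label matrices below are hard-coded stand-ins for real k-means runs.

```python
# Sketch of the ensemble-clustering (EC) distance: co-membership
# disagreement over multiple clustering runs. In the real method the
# label rows would come from repeated k-means executions.

def ec_distance(labels_per_run, i, j):
    """Fraction of clustering runs in which points i and j disagree."""
    disagreements = sum(1 for labels in labels_per_run if labels[i] != labels[j])
    return disagreements / len(labels_per_run)

# Three hypothetical runs over four points (one label per point per run).
runs = [
    [0, 0, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]

print(ec_distance(runs, 0, 1))  # 0.0 -- always co-clustered
print(ec_distance(runs, 0, 3))  # 1.0 -- never co-clustered
print(ec_distance(runs, 1, 2))  # 2/3 -- co-clustered in one run only
```

A decision tree (or k-NN) trained over this metric sees cluster-structure similarity rather than raw Euclidean geometry, which is the paper's central idea.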
Deployment of Electronic Healthcare Records and Development of Big Data Analytics Capabilities in the Healthcare Industry: A Systematic Literature Review
Authors: Tigabu Dagne Akal
Abstract:
Electronic health records (EHRs) can help to store, maintain, and appropriately handle patient histories for proper treatment and decision-making. Merging EHRs with big data analytics (BDA) capabilities enables healthcare stakeholders to provide effective and efficient treatments for chronic diseases. Though there are huge opportunities and efforts in the deployment of EHRs and the development of BDA, there are challenges in addressing the resources and organizational capabilities required to achieve the competitive advantage and sustainability of EHRs and BDA. The resource-based view (RBV), information system (IS), and non-IS theories should be extended to examine the organizational capabilities and resources required for successful data analytics in the healthcare industry. The main purpose of this study is to develop a conceptual framework for the development of healthcare BDA capabilities based on past works, so that researchers can extend it. A research question was formulated to guide the search strategy, and study selection was then performed. Based on the selected studies, the conceptual framework for the development of BDA capabilities in healthcare settings was formulated.
Keywords: EHR, EMR, big data, big data analytics, resource-based view
Rheological Characterization of Gels Based on Medicinal Plant Extracts Mixture (Zingibar Officinale and Cinnamomum Cassia)
Authors: Zahia Aliche, Fatiha Boudjema, Benyoucef Khelidj, Selma Mettai, Zohra Bouriahi, Saliha Mohammed Belkebir, Ridha Mazouz
Abstract:
The purpose of this work is the study of the viscoelastic behaviour of gels formulated from plant extracts. Extracts of Zingibar officinale and Cinnamomum cassia were included in the gel at different concentrations in order to be applied in anti-inflammatory drugs. The yield of the ethanolic extraction of Zingibar o. is 3.98%, and that of the Cinnamomum c. essential oil obtained by hydrodistillation is 1.67%. The ethanolic extract of Zingibar o., the essential oil of Cinnamomum c., and their mixture showed anti-DPPH radical activity, with EC50 values of 11.32, 13.48 and 14.39 mg/ml, respectively. Gels based on different concentrations of these extracts were prepared. Microbiological tests conducted against Staphylococcus aureus and Escherichia coli showed moderate inhibition by the Cinnamomum c. gel and less by the gel based on Cinnamomum c./Zingibar o. (20/80). The yeast Candida albicans is resistant to the gels. The viscoelastic characterization of the formulation was carried out in dynamic and creep modes and modeled with the Kelvin-Voigt model. The influence of some parameters on the stability of the gel (time, temperature and applied stress) has been studied.
Keywords: Cinnamomum cassia, Zingibar officinale, antioxidant activity, antimicrobial activity, gel, viscoelastic behaviour
Development of a Context Specific Planning Model for Achieving a Sustainable Urban City
Authors: Jothilakshmy Nagammal
Abstract:
This research paper deals with different case studies where Form-Based Codes are adopted in general, and discusses the different implementation methods in particular, in order to develop a method for formulating a new planning model. The organizing principle of the Form-Based Codes, the transect, is used to zone the city into various context specific transects. An approach is adopted to develop the new planning model, the City Specific Planning Model (CSPM), as a tool to achieve sustainability for any city in general. A case study comparison in terms of the planning tools used, the code process adopted and the various control regulations implemented is done for thirty two different cities. The analysis shows that there are a variety of ways to implement form-based zoning concepts: specific plans, a parallel or optional form-based code, a transect-based code/smart code, required form-based standards or design guidelines. The case studies describe the positive and negative results from form-based zoning where it is implemented. From the different case studies on the method of the FBC, it is understood that the scale for formulating the Form-Based Code varies from parts of the city to the whole city. The regulating plan is prepared with the transect as the organizing principle in most of the cases. The various implementation methods adopted in these case studies for the formulation of Form-Based Codes are special districts like Transit Oriented Development (TOD), Traditional Neighbourhood Development (TND), specific plans, and street-based approaches. The implementation methods vary from mandatory to integrated and floating. To attain sustainability, the research takes the approach of developing a regulating plan using the transect as the organizing principle for the entire area of the city in general, and formulating the Form-Based Codes for the selected special districts in the study area in particular, on a street basis.
Planning is most powerful when it is embedded in the broader context of systemic change and improvement. Systemic is best thought of as holistic, contextualized and stakeholder-owned, while systematic can be thought of as more linear, generalisable, and typically top-down or expert driven. The systemic approach is a process based on system theory and system design principles, which are too often ill understood by the general population and policy makers. System theory embraces the importance of a global perspective, multiple components, interdependencies and interconnections in any system. In addition, the recognition that a change in one part of a system necessarily alters the rest of the system is a cornerstone of system theory. The proposed regulating plan, taking the transect as an organizing principle and using Form-Based Codes to achieve sustainability of the city, has to be a hybrid code, which is to be integrated within the existing system: a systemic approach with a systematic process. This approach of introducing a few form-based zones into a conventional code could be effective in the phased replacement of an existing code. It could also be an effective way of responding to the near-term pressure of physical change in “sensitive” areas of the community. With this approach and method, the creation of the new Context Specific Planning Model towards achieving sustainability is explained in detail in this research paper.
Keywords: context based planning model, form based code, transect, systemic approach
Heart-Rate Resistance Electrocardiogram Identification Based on Slope-Oriented Neural Networks
Authors: Tsu-Wang Shen, Shan-Chun Chang, Chih-Hsien Wang, Te-Chao Fang
Abstract:
For an electrocardiogram (ECG) biometrics system, it is a tedious process to pre-install users’ high-intensity heart rate (HR) templates. Based on only resting enrollment templates, it is a challenge to identify humans using ECG with the high-intensity HRs caused by exercise and stress. This research provides a heartbeat segmentation method with slope-oriented neural networks that is robust against the ECG morphology changes due to high-intensity HRs. The method achieves an overall system accuracy of 97.73% across six levels of HR intensity. A cumulative match characteristic curve is also used for comparison with other traditional ECG biometric methods.
Keywords: high-intensity heart rate, heart rate resistant, ECG human identification, decision based artificial neural network
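A hedged sketch of what "slope-oriented" features could look like: represent a heartbeat segment by the slopes between successive samples, which are plausibly less sensitive to the amplitude and duration changes induced by high heart rates than raw morphology. The sample values below are synthetic, and this is an interpretation of the idea, not the paper's exact feature extraction.

```python
# Illustrative slope features for an ECG heartbeat segment: first
# differences of the sampled waveform, i.e., slopes per sample step.
# The beat values are made up for demonstration.

def slope_features(samples, dt=1.0):
    """Slopes between successive samples of a heartbeat segment."""
    return [(b - a) / dt for a, b in zip(samples, samples[1:])]

beat = [0.0, 0.2, 1.0, -0.4, 0.1, 0.0]  # synthetic QRS-like segment
features = slope_features(beat)
print(features)  # five slopes for six samples
```

In the described system, vectors like these would feed the neural network instead of the raw amplitudes.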
PTFE Capillary-Based DNA Amplification within an Oscillatory Thermal Cycling Device
Authors: Jyh J. Chen, Fu H. Yang, Ming H. Liao
Abstract:
This study describes a capillary-based device integrated with heating and cooling modules for the polymerase chain reaction (PCR). The device consists of the reaction polytetrafluoroethylene (PTFE) capillary and aluminum blocks, and is equipped with two cartridge heaters, a thermoelectric (TE) cooler, a fan, and thermocouples for temperature control. The cartridge heaters are placed into the heating blocks and maintained at two different temperatures to achieve the denaturation and extension steps. Thermocouples inserted into the capillary are used to obtain the transient temperature profiles of the reaction sample during thermal cycles. A 483-bp DNA template is amplified successfully in both the designed system and a traditional thermal cycler. This work should be of interest to those involved in high-temperature-based reactions and genomics or cell analysis.
Keywords: polymerase chain reaction, thermal cycles, capillary, TE cooler
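The oscillatory cycling idea can be sketched as a setpoint schedule: the sample shuttles between temperature zones according to the standard PCR steps. The temperatures and cycle count below are typical textbook values, not the paper's protocol.

```python
# Hedged sketch of a PCR thermal-cycle setpoint schedule. Temperatures
# (denaturation / annealing / extension) are common illustrative values;
# the paper's device maintains two heated blocks and a TE cooler to
# realize such a schedule physically.

DENATURE, ANNEAL, EXTEND = 94.0, 55.0, 72.0

def pcr_schedule(cycles):
    """Setpoint sequence for a given number of three-step PCR cycles."""
    steps = []
    for _ in range(cycles):
        steps += [DENATURE, ANNEAL, EXTEND]
    return steps

schedule = pcr_schedule(30)
print(len(schedule))   # 90 setpoints for 30 three-step cycles
print(schedule[:3])    # [94.0, 55.0, 72.0]
```

A controller for the real device would track these setpoints against the thermocouple readings cycle by cycle.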
Industrial Process Mining Based on Data Pattern Modeling and Nonlinear Analysis
Authors: Hyun-Woo Cho
Abstract:
Unexpected events may occur with serious impacts on industrial processes. This work utilizes a data representation technique to model and analyze process data patterns for the purpose of diagnosis. The use of a triangular representation of process data is evaluated on a simulated process. Furthermore, the effects of different pre-treatment techniques, such as linear or nonlinear reduced spaces, were compared. This work extracts the fault pattern in the reduced space, not in the original data space. The results have shown that the nonlinear-technique-based diagnosis method produces more reliable results and outperforms the linear method.
Keywords: process monitoring, data analysis, pattern modeling, fault, nonlinear techniques
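The core idea of diagnosing in a reduced space rather than the original data space can be sketched minimally: project measurements onto a reference direction (a stand-in for a linear or nonlinear reduced space) and flag samples whose projected score deviates from normal operation. The direction, data, and threshold below are assumptions for illustration.

```python
# Hedged sketch of reduced-space fault detection: score each sample in a
# one-dimensional reduced space and threshold its deviation from the
# normal-operation mean. All numbers are illustrative.

def project(sample, direction):
    """Reduced-space score: dot product with the reference direction."""
    return sum(s * d for s, d in zip(sample, direction))

def diagnose(samples, normal_mean_score, direction, threshold):
    """Return True (fault) where the reduced-space score deviates."""
    return [abs(project(s, direction) - normal_mean_score) > threshold
            for s in samples]

direction = [0.8, 0.6]               # assumed dominant direction
normal = [[1.0, 1.0], [1.1, 0.9]]    # normal operating data
mean_score = sum(project(s, direction) for s in normal) / len(normal)
faulty = [[1.0, 1.05], [3.0, 2.5]]   # second sample is a process upset
print(diagnose(faulty, mean_score, direction, threshold=0.5))  # [False, True]
```

A nonlinear reduced space (e.g. a kernel mapping) would replace the linear projection, but the thresholding logic stays the same.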
PET Bearing Bio-Based Moieties
Authors: Majdi Abid
Abstract:
During the last few decades, great efforts have been made towards the development of innovative materials using vegetal biomass. This strategy is understandable for different reasons, including the predictable dwindling of the petrochemical feedstock and its price increase, as well as the counterbalancing of environmental problems. As novel bio-based monomers used in polyester synthesis, two families, namely 1,4:3,6-dianhydrohexitols and furanics, were prepared from saccharidic renewable resources. The present work deals with a detailed investigation of the synthesis of poly(ethylene-co-isosorbide terephthalate-co-furoate) (PEITF) by melt polycondensation of dimethyl terephthalate (DMT), 5,5’-isopropylidene-bis(ethyl 2-furoate) (DEF), ethane-1,2-diol (ED) and isosorbide (IS). Polycondensation was achieved in two steps: (i) the formation of a hydroxyethyl-terminated oligomer by reaction of the starting diester mixture with excess ED, and (ii) a polycondensation step with elimination of ED to obtain high molar mass copolyesters. Copolymers of various compositions were synthesized and characterized by 1H NMR, SEC, DSC and TGA. The resulting materials are amorphous polymers (Tg = 104–127 °C) with good thermal stability.
Keywords: bio-based monomers, furan, isosorbide, polycondensation
Suitability of Black Box Approaches for the Reliability Assessment of Component-Based Software
Authors: Anjushi Verma, Tirthankar Gayen
Abstract:
Although reliability is an important attribute of quality, especially for mission critical systems, there does not exist any versatile model even today for the reliability assessment of component-based software. The existing black box models are found to make various assumptions which may not always be realistic and may be quite contrary to the actual behaviour of software. They focus on observing the manner in which the system behaves without considering the structure of the system, the components composing the system, their interconnections, dependencies, usage frequencies, etc. As a result, the entropy (uncertainty) in assessment using these models is quite high. Though there are some models based on the operational profile, it sometimes becomes extremely difficult to obtain the exact operational profile concerned with a given operation. This paper discusses the drawbacks, deficiencies and limitations of black box approaches from the perspective of various authors and finally proposes a conceptual model for the reliability assessment of software.
Keywords: black box, faults, failure, software reliability
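The black-box view criticized above can be made concrete with the classic run-based estimate: reliability inferred purely from observed pass/fail executions, ignoring component structure entirely. The estimator form below is the standard Nelson-style one; the run log is made up.

```python
# Sketch of a pure black-box reliability estimate: R = 1 - failures/runs,
# computed from pass/fail observations only. Note what it ignores --
# component structure, interconnections, usage frequencies -- which is
# exactly the limitation the abstract discusses.

def black_box_reliability(outcomes):
    """Estimate reliability from a list of pass (True) / fail (False) runs."""
    failures = sum(1 for ok in outcomes if not ok)
    return 1.0 - failures / len(outcomes)

runs = [True] * 95 + [False] * 5   # hypothetical log: 100 runs, 5 failures
print(black_box_reliability(runs))  # 0.95
```

Two systems with identical pass/fail logs get the same estimate here even if their internal architectures make one far riskier, which illustrates the "high entropy" of the assessment.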
Extent of Applying Evidence Based Practices in Inclusion Programs for Pupils with Intellectual Disability
Authors: Faris Algahtani
Abstract:
The current study aimed to reveal the extent to which evidence-based practices are applied in programs to integrate students with intellectual disabilities, from the point of view of their teachers in Yanbu Governorate, and to reveal statistically significant differences in their application of evidence-based practices according to the following variables: gender, educational qualification, experience and training courses. The researcher used the descriptive approach; accordingly, she designed a questionnaire consisting of 22 items and applied it to a random sample of 97 teachers of intellectual disability in the integration programs of the Ministry of Education in the government sector in Yanbu Governorate, with 49 male teachers and 48 female teachers. The study showed that teachers of students with intellectual disabilities apply evidence-based practices in integration programs to a large extent. Among the most prominent of these practices, reinforcement came in first place, followed by using visual stimuli/aids, and in third place came starting with less complex or challenging skills, then moving to more difficult skills. The results also showed no statistically significant differences in the extent of application attributable to the variables of experience, qualification or training. On the other hand, there were statistically significant differences in the extent of application attributable to gender, in favor of females.
Keywords: evidence-based practices, intellectual disability, inclusion programs, teachers of students with intellectual disabilities
Proteomic Analysis of Excretory Secretory Antigen (ESA) from Entamoeba histolytica HM1: IMSS
Authors: N. Othman, J. Ujang, M. N. Ismail, R. Noordin, B. H. Lim
Abstract:
Amoebiasis is caused by Entamoeba histolytica and is still endemic in many parts of the tropical region worldwide. Currently, there is no available vaccine against amoebiasis; hence, there is an urgent need to develop one. The excretory secretory antigen (ESA) of E. histolytica is a suitable biomarker for a vaccine candidate since it can modulate the host immune response. The objective of this study is therefore to identify the proteome of the ESA towards finding a suitable vaccine candidate. Non-gel-based and gel-based proteomics analyses were performed to identify proteins. Two kinds of mass spectrometry with different ionization systems were utilized, i.e., LC-MS/MS (ESI) and MALDI-TOF/TOF. Functional protein classification analysis was then performed using the PANTHER software. The combination of LC-MS/MS for the non-gel-based approach and MALDI-TOF/TOF for the gel-based approach identified a total of 273 proteins from the ESA. Both systems identified 29 proteins in common, whereby 239 and 5 more proteins were identified by LC-MS/MS and MALDI-TOF/TOF, respectively. Functional classification analysis showed that the majority of proteins are involved in the metabolic process (24%), primary metabolic process (19%) and protein metabolic process (10%). Thus, this study has revealed the proteome of the E. histolytica ESA, and the identified proteins merit further investigation as vaccine candidates.
Keywords: E. histolytica, ESA, proteomics, biomarker
Multi-Walled Carbon Nanotube Based Water Filter for Virus Pathogen Removal
Authors: K. Domagala, D. Kata, T. Graule
Abstract:
Diseases caused by contaminated drinking water are a worldwide problem, leading to death and severe illness for hundreds of millions of people each year. There is an urgent need for efficient water treatment techniques for virus pathogen removal. The aim of the research was to develop a safe and economical solution to help with water treatment. In this study, the synthesis of copper-based multi-walled carbon nanotube composites is described. The proposed solution utilizes the combination of a low-cost material with a high active surface area and the antiviral properties of copper. Removal of viruses from water was made possible by adsorption based on electrostatic interactions of the negatively charged virus with a positively charged filter material.
Keywords: multi walled carbon nanotubes, water purification, virus removal, water treatment
A Case-Based Reasoning-Decision Tree Hybrid System for Stock Selection
Authors: Yaojun Wang, Yaoqing Wang
Abstract:
Stock selection is an important decision-making problem. Many machine learning and data mining technologies are employed to build automatic stock-selection systems. A profitable stock-selection system should consider both the stock’s investment value and the market timing. In this paper, we present a hybrid system that engages both for stock selection. This system uses a case-based reasoning (CBR) model to execute the stock classification and a decision-tree model to help with market timing and stock selection. The experiments show that the performance of this hybrid system is better than that of other techniques with regard to classification accuracy, average return and the Sharpe ratio.
Keywords: case-based reasoning, decision tree, stock selection, machine learning
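The CBR half of the hybrid can be sketched as nearest-case retrieval: find the most similar historical case by distance over feature vectors and reuse its decision. The feature data and labels below are synthetic; the real system layers a decision tree for market timing on top of this retrieval step.

```python
# Hedged sketch of the case-based reasoning step: retrieve the most
# similar stored case (stock) by Euclidean distance and reuse its label.
# Cases and features are made up for illustration.

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def cbr_classify(case_base, query):
    """Return the decision attached to the nearest stored case."""
    best = min(case_base, key=lambda case: distance(case[0], query))
    return best[1]

case_base = [
    ([0.2, 1.5], "buy"),    # (feature vector, past decision)
    ([0.9, 0.3], "hold"),
    ([1.8, 0.1], "sell"),
]
print(cbr_classify(case_base, [0.25, 1.4]))  # "buy": nearest to first case
```

In the hybrid design, this classification would only be acted on when the decision-tree timing model also signals a favorable market state.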
Methods of Improving Production Processes Based on Deming Cycle
Authors: Daniel Tochwin
Abstract:
Continuous improvement is an essential part of effective process performance management. In order to achieve continuous quality improvement, each organization must use an appropriate selection of tools and techniques. The basic condition for success is a proper understanding of the business need faced by the company and the selection of appropriate methods to improve a given production process. The main aim of this article is to analyze the methods of conduct that are popular in practice when implementing process improvements, and then to determine whether the examined methods share a repeatable systematics of approach, i.e., a similar sequence of the same or similar actions. Based on an extensive literature review, four methods of continuous improvement of production processes were selected: the A3 report, Gemba Kaizen, the PDCA cycle, and the Deming cycle. The research shows that all the frequently used improvement methods are generally based on the PDCA cycle, and the differences are due to "(re)interpretation" and the need to adapt the continuous improvement approach to the specific business process.
Keywords: continuous improvement, lean methods, process improvement, PDCA
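The shared PDCA systematics that the compared methods reduce to can be written as a loop: each iteration plans a change, applies it, checks the measured effect against a target, and either standardizes or adjusts. The "process metric" here is a toy number being driven toward a target; it is purely illustrative.

```python
# Illustrative PDCA (Plan-Do-Check-Act) loop over a toy process metric.
# Each cycle applies one planned countermeasure and checks progress
# toward the target, mirroring the repeatable systematics the article
# identifies behind A3, Gemba Kaizen, and the Deming cycle.

def pdca(metric, target, improve_step, max_cycles=10):
    cycles = 0
    while metric < target and cycles < max_cycles:
        plan = improve_step        # Plan: choose a countermeasure
        metric += plan             # Do: apply it to the process
        cycles += 1                # Check: the loop condition re-measures
    return metric, cycles          # Act: standardize once the target holds

final, n = pdca(metric=80.0, target=95.0, improve_step=5.0)
print(final, n)  # 95.0 after 3 cycles
```

Real deployments differ in how Plan and Check are instrumented (A3 reports, shop-floor observation), but the cycle skeleton is the same.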
Cost-Effective Hybrid Cloud Framework for HEI’s
Authors: Shah Muhammad Butt, Ahmed Masaud Ansari
Abstract:
The present financial crisis in Higher Educational Institutes (HEIs), with considerable budget cuts, makes it difficult to meet ever-growing IT-based research and learning needs; institutions are therefore rapidly planning and promoting cloud-based approaches for their academic and research needs. A cost-effective Hybrid Cloud framework for HEIs will provide educational services for campus or intercampus communication. The Hybrid Cloud framework comprises private and public cloud approaches. This paper proposes a framework based on open source cloud software (OpenNebula for virtualization, Eucalyptus for infrastructure, and Aneka as a programming development environment) combined with cloud service providers’ (CSPs’) services, which are delivered to the end-user via the Internet from public clouds.
Keywords: educational services, hybrid campus cloud, open source, electrical and systems sciences
Robust Variable Selection Based on Schwarz Information Criterion for Linear Regression Models
Authors: Shokrya Saleh A. Alshqaq, Abdullah Ali H. Ahmadini
Abstract:
The Schwarz information criterion (SIC) is a popular tool for selecting the best variables in regression datasets. However, SIC is defined using an unbounded estimator, namely the least-squares (LS) estimator, which is highly sensitive to outlying observations, especially bad leverage points. A method for robust variable selection based on SIC for linear regression models is thus needed. This study investigates the robustness properties of SIC by deriving its influence function and proposes a robust SIC based on the MM-estimation scale. The aim of this study is to produce a criterion that can effectively select accurate models in the presence of vertical outliers and high leverage points. The advantages of the proposed robust SIC are demonstrated through a simulation study and an analysis of a real dataset.
Keywords: influence function, robust variable selection, robust regression, Schwarz information criterion
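The motivation can be sketched numerically: in a SIC-type criterion of the form n·log(scale²) + p·log(n), replacing the least-squares residual scale with a robust scale keeps one gross outlier from dominating the comparison. MAD is used below as a simple stand-in for the paper's MM-estimation scale; the residuals are synthetic.

```python
# Hedged sketch of why a robust scale changes a SIC-type criterion.
# MAD here is an illustrative substitute for the MM-scale of the paper.

import math

def sic(residuals, n_params, scale):
    """SIC-type criterion: n*log(scale^2) + p*log(n)."""
    n = len(residuals)
    return n * math.log(scale(residuals) ** 2) + n_params * math.log(n)

def ls_scale(res):
    """Least-squares (RMS) residual scale -- unbounded in each residual."""
    return (sum(r * r for r in res) / len(res)) ** 0.5

def mad_scale(res):
    """Median absolute deviation, scaled for consistency at the normal."""
    med = sorted(res)[len(res) // 2]
    dev = sorted(abs(r - med) for r in res)
    return 1.4826 * dev[len(res) // 2]

res = [0.1, -0.2, 0.15, -0.1, 50.0]   # one gross outlier
print(sic(res, 2, ls_scale) > sic(res, 2, mad_scale))  # True: LS is inflated
```

With the LS scale, the single outlier inflates the criterion for an otherwise good model; the robust scale leaves the model comparison driven by the bulk of the data.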
Model Solutions for Performance-Based Seismic Analysis of an Anchored Sheet Pile Quay Wall
Authors: C. J. W. Habets, D. J. Peters, J. G. de Gijt, A. V. Metrikine, S. N. Jonkman
Abstract:
Conventional seismic designs of quay walls in ports are mostly based on pseudo-static analysis. A more advanced alternative is the Performance-Based Design (PBD) method, which evaluates permanent deformations and amounts of (repairable) damage under seismic loading. The aim of this study is to investigate the suitability of this method for anchored sheet pile quay walls that were not purposely designed for seismic loads. A research methodology is developed in which pseudo-static, permanent-displacement and finite element analyses are employed, calibrated with an experimental reference case that considers a typical anchored sheet pile wall. A reduction factor that accounts for deformation behaviour is determined for the pseudo-static analysis. A model to apply traditional permanent-displacement analysis to anchored sheet pile walls is proposed. Dynamic analysis is successfully carried out. From the research it is concluded that PBD evaluation can effectively be used for the seismic analysis and design of this type of structure.
Keywords: anchored sheet pile quay wall, simplified dynamic analysis, performance-based design, pseudo-static analysis
Early Detection of Breast Cancer in Digital Mammograms Based on Image Processing and Artificial Intelligence
Authors: Sehreen Moorat, Mussarat Lakho
Abstract:
A method of artificial intelligence using digital mammogram data is proposed in this paper for the detection of breast cancer. Many researchers have developed techniques for the early detection of breast cancer; early diagnosis helps to save many lives. The detection of breast cancer through mammography is an effective method, which detects the cancer before it can be felt and increases the survival rate. In this paper, we have proposed an image processing technique for enhancing the image to detect the graphical table data and markings. Texture features based on the Gray-Level Co-Occurrence Matrix and intensity-based features are extracted from the selected region. For classification purposes, a neural-network-based supervised classifier system has been used, which can discriminate between benign and malignant cases. Sixty-eight digital mammograms have been used to train the classifier. The obtained results proved that automated detection of breast cancer is beneficial for early diagnosis and increases the survival rates of breast cancer patients. The proposed system will help radiologists in the better interpretation of breast cancer.
Keywords: medical imaging, cancer, processing, neural network
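The texture step can be sketched with a minimal gray-level co-occurrence matrix (GLCM) for a single horizontal offset and one derived feature (contrast), as would be computed on the selected mammogram region. The tiny image below is synthetic, and only one of the usual offsets/angles is shown.

```python
# Minimal GLCM sketch: co-occurrence counts for pixel pairs one step to
# the right, plus the standard contrast feature. Real pipelines use
# several offsets/angles and more features (energy, homogeneity, ...).

def glcm(image, levels):
    """Co-occurrence counts for horizontally adjacent pixel pairs."""
    m = [[0] * levels for _ in range(levels)]
    for row in image:
        for a, b in zip(row, row[1:]):
            m[a][b] += 1
    return m

def contrast(m):
    """GLCM contrast: sum of (i-j)^2 weighted by normalized counts."""
    total = sum(sum(row) for row in m)
    return sum((i - j) ** 2 * m[i][j] / total
               for i in range(len(m)) for j in range(len(m)))

img = [[0, 0, 1],
       [1, 2, 2],
       [0, 1, 2]]   # toy 3-level image
g = glcm(img, levels=3)
print(contrast(g))  # moderate contrast: off-diagonal pairs present
```

Feature vectors built from such GLCM statistics, together with intensity features, are what the supervised neural classifier consumes.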
Procedia PDF Downloads 25927288 Statistical Inferences for GQARCH-It\^{o} - Jumps Model Based on The Realized Range Volatility
Authors: Fu Jinyu, Lin Jinguan
Abstract:
This paper introduces a novel approach that unifies two types of models: the continuous-time jump-diffusion used to model high-frequency data, and the discrete-time GQARCH employed to model low-frequency financial data, by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the “GQARCH-Itô-Jumps model.” We adopt the realized range-based threshold estimation for high-frequency financial data, rather than realized return-based volatility estimators, which entail the loss of intra-day information on the price movement. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for the parametric estimation. The asymptotic theories are mainly established for the proposed estimators in the case of finite activity jumps. Moreover, simulation studies are implemented to check the finite sample performance of the proposed methodology. Specifically, it is demonstrated how our proposed approaches can be practically used on financial data.
Keywords: Itô process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate
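The contrast with return-based estimators can be sketched with a simple Parkinson-type range estimator, sum((log(H/L))²) / (4·log 2), which uses intra-period highs and lows instead of close-to-close returns. This omits the paper's threshold truncation of jump-dominated intervals; the prices below are made up.

```python
# Hedged sketch of a range-based volatility estimator (Parkinson-type
# form). The paper's realized range-based *threshold* estimator further
# truncates intervals dominated by jumps; that refinement is omitted.

import math

def realized_range(highs, lows):
    """Range-based variance estimate over the given intra-day intervals."""
    return sum(math.log(h / l) ** 2
               for h, l in zip(highs, lows)) / (4 * math.log(2))

highs = [101.0, 102.5, 101.8]   # toy per-interval highs
lows = [99.5, 100.0, 100.9]     # toy per-interval lows
print(realized_range(highs, lows))  # small variance for this quiet sample
```

Because it uses the full high-low range of each interval, the estimator retains intra-day information that a close-to-close return estimator discards.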
Self-Regulated Learning: A Required Skill for Web 2.0 Internet-Based Learning
Authors: Pieter Conradie, M. Marina Moller
Abstract:
Web 2.0 Internet-based technologies have intruded into all aspects of human life. Presently, this phenomenon is especially evident in the educational context, with increased disruptive Web 2.0 technology infusions dramatically changing educational practice. The most prominent of these Web 2.0 intrusions are Massive Open Online Courses (Coursera, EdX), video and photo sharing sites (YouTube, Flickr, Instagram), and Web 2.0 online tools utilized to create Personal Learning Environments (PLEs) (Symbaloo (aggregator), Delicious (social bookmarking), PBWorks (collaboration), Google+ (social networks), WordPress (blogs), Wikispaces (wiki)). These Web 2.0 technologies have supported the realignment from a teacher-based pedagogy (didactic presentation) to a learner-based pedagogy (problem-based learning, project-based learning, blended learning), allowing greater learner autonomy. No longer is the educator the source of knowledge. Instead, the educator has become the facilitator and mediator for the learner, involved in developing learner competencies to support life-long (continuous) learning in the 21st century. In this study, the self-regulated learning skills of thirty first-year university learners were explored by utilizing the Online Self-regulated Learning Questionnaire. Implementing an action research method, an intervention was effected towards improving the self-regulation skill set of the participants. Statistically significant results were obtained, with increased self-regulated learning proficiency positively impacting learner performance. Goal setting, time management, environment structuring, help seeking, task (learning) strategies and self-evaluation skills were confirmed as determinants of improved learner success.
Keywords: andragogy, online self-regulated learning questionnaire, self-regulated learning, web 2.0
Procedia PDF Downloads 417
27286 Evaluation of Model-Based Code Generation for Embedded Systems–Mature Approach for Development in Evolution
Authors: Nikolay P. Brayanov, Anna V. Stoynova
Abstract:
The model-based development approach is gaining support and acceptance. Its higher abstraction level simplifies system description, allowing domain experts to do their best work without specialist programming knowledge. The different levels of simulation support rapid prototyping, verification and validation of the product even before it exists physically. Nowadays the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, bringing extra automation to the expensive device certification process, especially software qualification. Using it, some companies report cost savings and quality improvements, while others report no major changes or even cost increases. This publication demonstrates the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using The MathWorks, Inc. tools. The model, created with Simulink, Stateflow and MATLAB, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of the automatically generated embedded code with that of a manually developed one. The measurements show that, generally, the code generated by the automatic approach is no worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for future work.
Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development
Procedia PDF Downloads 244
27285 Comparative Analysis of Dissimilarity Detection between Binary Images Based on Equivalency and Non-Equivalency of Image Inversion
Authors: Adnan A. Y. Mustafa
Abstract:
Image matching is a fundamental problem that arises frequently in many aspects of robot and computer vision. It can become a time-consuming process when matching images against a database of hundreds of images, especially if the images are large. One approach to reducing the time complexity of the matching process is to reduce the search space in a pre-matching stage by quickly removing dissimilar images. The Probabilistic Matching Model for Binary Images (PMMBI) showed that dissimilarity detection between binary images can be accomplished quickly by random pixel mapping and is size-invariant. The model is based on the gamma binary similarity distance, which recognizes an image and its inverse as containing the same scene and hence considers them to be the same image. However, in many applications, an image and its inverse are treated not as the same but as dissimilar. In this paper, we present a comparative analysis of dissimilarity detection between PMMBI, based on the gamma binary similarity distance, and a modified PMMBI model based on a similarity distance that does distinguish an image from its inverse, treating them as dissimilar.
Keywords: binary image, dissimilarity detection, probabilistic matching model for binary images, image mapping
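A minimal sketch of the random-pixel-mapping idea described in the abstract (the function names and sampling scheme here are illustrative, not the paper's exact formulation): positions are sampled in normalised coordinates, which makes the comparison size-invariant, and the two distances differ only in how an image's inverse is treated.

```python
import numpy as np

def dissimilarity(a, b, n_samples=200, seed=0):
    """Estimate the mismatch rate between two binary images by random
    pixel mapping. Normalised coordinates are scaled to each image's
    dimensions, so differently sized images can be compared directly."""
    rng = np.random.default_rng(seed)
    ys = rng.random(n_samples)
    xs = rng.random(n_samples)
    pa = a[(ys * a.shape[0]).astype(int), (xs * a.shape[1]).astype(int)]
    pb = b[(ys * b.shape[0]).astype(int), (xs * b.shape[1]).astype(int)]
    return float(np.mean(pa != pb))

def gamma_distance(a, b, **kw):
    # Inversion-equivalent: an image and its inverse score as the same scene.
    r = dissimilarity(a, b, **kw)
    return min(r, 1.0 - r)

def modified_distance(a, b, **kw):
    # Inversion-distinguishing: the raw mismatch rate is kept as-is,
    # so an image and its inverse are maximally dissimilar.
    return dissimilarity(a, b, **kw)
```

Under the gamma distance, an image compared with its own inverse yields distance zero; under the modified distance it yields the maximum value, which is the behavioural difference the paper analyses.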
Procedia PDF Downloads 153
27284 Assessment-Assisted and Relationship-Based Financial Advising: Using an Empirical Assessment to Understand Personal Investor Risk Tolerance in Professional Advising Relationships
Authors: Jerry Szatko, Edan L. Jorgensen, Stacia Jorgensen
Abstract:
A crucial component of the success of any financial advising relationship is for the financial professional to understand the perceptions, preferences and thought processes of the financial clients they serve. Armed with this information, financial professionals can more quickly tailor their approach to best match the individual preferences and needs of each personal investor. Our research explores the use of a quantitative assessment tool in the financial services industry to assist in identifying the personal investor's consumer behaviors, especially in terms of financial risk tolerance, as they relate to financial decision making. Through this process, the Unitifi Consumer Insight Tool (UCIT) was created and refined to capture and categorize personal investor financial behavioral categories and the financial personality tendencies of individuals prior to the initiation of a financial advisement relationship. This paper discusses the use of this tool to place individuals in one of four behavior-based financial risk tolerance categories. Our research was aided by administering a web-based survey to over 1,000 individuals. Our findings indicate that it is possible to use a quantitative assessment tool to assist in predicting the behavioral tendencies of personal consumers when faced with financial risk and decisions.
Keywords: behavior-based advising, financial relationship building, risk capacity based on behavior, risk tolerance, systematic way to assist in financial relationship building
Procedia PDF Downloads 167
27283 Arabic Lexicon Learning to Analyze Sentiment in Microblogs
Authors: Mahmoud B. Rokaya
Abstract:
The study of opinion mining and sentiment analysis covers the analysis of opinions, sentiments, evaluations, attitudes, and emotions. The rapid growth of social media, social networks, reviews, forum discussions, microblogs, and Twitter has led to a parallel growth in the field of sentiment analysis, which seeks to develop effective tools for capturing the trends of people's opinions. There are two approaches in the field: lexicon-based and corpus-based methods. A lexicon-based method uses a sentiment lexicon that includes sentiment words and phrases with assigned numeric scores. These scores reveal whether sentiment phrases are positive or negative, their intensity, and/or their emotional orientations. Creating lexicons manually is hard, which brings the need for adaptive, automated methods of lexicon generation. The proposed method generates dynamic lexicons based on the corpus and then classifies text using these lexicons, combining different approaches to generate the lexicons from text. The method classifies tweets into five classes rather than just positive and negative. The sentiment classification problem is formulated as an optimization problem whose goal is to find the optimal sentiment lexicon; a genetic algorithm was written to find it. A meta-level feature was then extracted based on the optimal lexicon. Experiments were conducted on several datasets, and the results, in terms of accuracy, recall and F-measure, outperformed state-of-the-art methods from the literature on some of the datasets.
A better understanding of the Arabic language, the culture of Arab Twitter users, and the sentiment orientation of words in different contexts can be achieved based on the sentiment lexicons produced by the algorithm.
Keywords: social media, Twitter sentiment, sentiment analysis, lexicon, genetic algorithm, evolutionary computation
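The five-class, lexicon-based scoring step can be sketched as follows. The toy lexicon and the class cut-offs below are hypothetical stand-ins for the word scores and thresholds that the paper's genetic algorithm would learn; the GA's fitness function would be classification accuracy over a labelled corpus.

```python
# Hypothetical toy lexicon; in the paper, these numeric scores are the
# quantities optimised by the genetic algorithm.
LEXICON = {"great": 2.0, "good": 1.0, "ok": 0.0, "bad": -1.0, "awful": -2.0}

def score(tweet):
    # Sum the lexicon scores of the words appearing in the tweet;
    # out-of-lexicon words contribute zero.
    return sum(LEXICON.get(w, 0.0) for w in tweet.lower().split())

def classify(tweet, cuts=(-1.5, -0.5, 0.5, 1.5)):
    # Map the aggregate score onto 5 ordered classes
    # (0 = very negative .. 4 = very positive).
    s = score(tweet)
    return sum(s > c for c in cuts)
```

A GA individual would then be one candidate assignment of scores to lexicon entries, evaluated by running `classify` over the training tweets.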
Procedia PDF Downloads 189
27282 Demand for Index Based Micro-Insurance (IBMI) in Ethiopia
Authors: Ashenafi Sileshi Etefa, Bezawit Worku Yenealem
Abstract:
Micro-insurance is a relatively new concept that is just being introduced in Ethiopia. For an agrarian economy dominated by smallholder farming and vulnerable to natural disasters, mainly drought, the need for Index-Based Micro-Insurance (IBMI) is crucial. Since IBMI mitigates moral hazard, adverse selection, and access problems for poor clients, it is preferable to traditional insurance products. IBMI is being piloted in drought-prone areas of Ethiopia with the aim of learning from the pilots and expanding the service across the country. This article analyses the demand for IBMI and the barriers to demand, finds that demand has so far been constrained by lack of awareness, trust issues, costliness, and the level of basis risk, and recommends reducing the basis risk and increasing the role of government and farmer cooperatives.
Keywords: agriculture, index based micro-insurance (IBMI), drought, micro-finance institution (MFI)
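An index-based payout depends only on an observable index (for example, rainfall at a weather station), not on an individual farmer's actual loss, which is what removes moral hazard and adverse selection but also introduces basis risk. A minimal sketch, with hypothetical trigger, exit and payout values chosen purely for illustration:

```python
def index_payout(rain_mm, trigger=100.0, exit_=40.0, max_payout=500.0):
    """Illustrative rainfall-index payout schedule: no payout above the
    trigger, full payout at or below the exit level, and a linear
    interpolation in between. All parameter values are hypothetical."""
    if rain_mm >= trigger:
        return 0.0
    if rain_mm <= exit_:
        return max_payout
    return max_payout * (trigger - rain_mm) / (trigger - exit_)
```

Basis risk is visible directly in this schedule: a farmer who suffers a loss while station rainfall stays above the trigger receives nothing, which is one of the demand constraints the article identifies.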
Procedia PDF Downloads 290
27281 Scalable Learning of Tree-Based Models on Sparsely Representable Data
Authors: Fares Hedayatit, Arnauld Joly, Panagiotis Papadimitriou
Abstract:
Many machine learning tasks, such as text annotation, usually require training over very large datasets, e.g., millions of web documents, that can be represented in a sparse input space. State-of-the-art tree-based ensemble algorithms cannot scale to such datasets, since they include operations whose running time is a function of the input space size rather than of the number of non-zero input elements. In this paper, we propose an efficient splitting algorithm to leverage input sparsity within decision tree methods. Our algorithm improves training time over sparse datasets by more than two orders of magnitude, and it has been incorporated into the current version of scikit-learn.org, the most popular open-source Python machine learning library.
Keywords: big data, sparsely representable data, tree-based models, scalable learning
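Since the splitting algorithm has been incorporated into scikit-learn, sparse input can be passed to its tree ensembles directly as a SciPy CSR matrix. The sketch below (the dimensions, density, and random labels are arbitrary) shows the usage pattern:

```python
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.ensemble import RandomForestClassifier

# A wide, highly sparse matrix, as in text annotation over web documents:
# 1,000 samples, 10,000 features, 0.1% non-zero entries.
X = sparse_random(1000, 10000, density=0.001, format="csr", random_state=0)
y = np.random.default_rng(0).integers(0, 2, 1000)

# Passing the CSR matrix directly lets the splitter iterate over non-zero
# entries instead of the full input space.
clf = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)
```

Densifying `X` with `X.toarray()` would instead allocate all 10 million entries, which is exactly the input-space-sized cost the paper's algorithm avoids.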
Procedia PDF Downloads 263
27280 Combining the Dynamic Conditional Correlation and Range-GARCH Models to Improve Covariance Forecasts
Authors: Piotr Fiszeder, Marcin Fałdziński, Peter Molnár
Abstract:
The dynamic conditional correlation (DCC) model of Engle (2002) is one of the most popular multivariate volatility models. However, this model is based solely on closing prices, even though it has been documented in the literature that the high and low prices of the day can be used for efficient volatility estimation. We therefore suggest a model which incorporates high and low prices into the dynamic conditional correlation framework. Empirical evaluation of this model is conducted on three datasets: currencies, stocks, and commodity exchange-traded funds. Using realized variances and covariances as proxies for the true variances and covariances allows us to reach a strong conclusion: our model outperforms not only the standard dynamic conditional correlation model but also a competing range-based dynamic conditional correlation model.
Keywords: volatility, DCC model, high and low prices, range-based models, covariance forecasting
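One way to see why high and low prices improve volatility estimation is the classic Parkinson (1980) range-based estimator, sketched here next to the plain close-to-close estimator. This is a general illustration of range-based estimation, not the paper's Range-GARCH specification:

```python
import numpy as np

def parkinson_var(high, low):
    """Parkinson (1980) range-based daily variance estimate:
    sigma^2 = mean( ln(H/L)^2 ) / (4 ln 2)."""
    hl = np.log(np.asarray(high) / np.asarray(low))
    return float(np.mean(hl ** 2) / (4.0 * np.log(2.0)))

def close_to_close_var(close):
    # Standard estimator from closing prices only,
    # assuming zero-mean log returns.
    r = np.diff(np.log(close))
    return float(np.mean(r ** 2))
```

Because each day's high-low range summarises the whole intraday path rather than a single end-of-day point, the range-based estimate is substantially more efficient per observation, which is the information the proposed model feeds into the DCC framework.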
Procedia PDF Downloads 183