Search results for: rough sets
799 Investigations of Protein Aggregation Using Sequence and Structure Based Features
Authors: M. Michael Gromiha, A. Mary Thangakani, Sandeep Kumar, D. Velmurugan
Abstract:
The main cause of several neurodegenerative diseases such as Alzheimer's, Parkinson's, and the spongiform encephalopathies is the formation of amyloid fibrils and plaques from proteins. We have analyzed different sets of proteins and peptides to understand the influence of sequence-based features on the protein aggregation process. A comparison of 373 pairs of homologous mesophilic and thermophilic proteins showed that aggregation-prone regions (APRs) are present in both, but thermophilic protein monomers show a greater ability to ‘stow away’ the APRs in their hydrophobic cores and protect them from solvent exposure. A comparison of amyloid-forming and amorphous β-aggregating hexapeptides revealed distinct preferences for specific residues at each of the six positions, as well as for all possible combinations of nine residue pairs. The compositions of residues at different positions, and of residue pairs, were converted into energy potentials and used to distinguish between amyloid-forming and amorphous β-aggregating peptides. Our method correctly identified amyloid-forming peptides with an accuracy of 95-100% across different peptide datasets.
Keywords: aggregation, amyloids, thermophilic proteins, amino acid residues, machine learning techniques
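A minimal sketch of how position-specific residue preferences can be turned into an additive potential for classifying hexapeptides; the propensity values and threshold below are hypothetical placeholders, not the potentials derived in the study:

```python
# Hypothetical position-specific propensities (NOT the published potentials):
# log-odds of observing a residue at each of the six hexapeptide positions
# in amyloid-forming vs. amorphous beta-aggregating peptides.
PROPENSITY = {
    "V": [0.8, 0.5, 0.6, 0.6, 0.5, 0.7],
    "I": [0.7, 0.6, 0.5, 0.5, 0.6, 0.6],
    "F": [0.4, 0.7, 0.6, 0.6, 0.7, 0.3],
    "N": [-0.5, -0.3, -0.4, -0.4, -0.3, -0.5],
    "P": [-1.2, -1.0, -1.1, -1.1, -1.0, -1.2],
}

def aggregation_score(hexapeptide):
    """Sum position-wise log-odds; residues missing from the table add 0."""
    assert len(hexapeptide) == 6
    return sum(PROPENSITY.get(res, [0.0] * 6)[i]
               for i, res in enumerate(hexapeptide))

def is_amyloid_forming(hexapeptide, threshold=1.0):
    """Classify by thresholding the additive potential (threshold is a guess)."""
    return aggregation_score(hexapeptide) > threshold
```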
Procedia PDF Downloads 614
798 Comparison of Petrophysical Relationship for Soil Water Content Estimation at Peat Soil Area Using GPR Common-Offset Measurements
Authors: Nurul Izzati Abd Karim, Samira Albati Kamaruddin, Rozaimi Che Hasan
Abstract:
An appropriate petrophysical relationship is needed for soil water content (SWC) estimation, especially when using ground penetrating radar (GPR). GPR is a geophysical tool that provides the SWC parameter indirectly. This paper examines the performance of several published petrophysical relationships for obtaining SWC estimates from in-situ GPR common-offset survey measurements, together with gravimetric measurements, at a peat soil area. Gravimetric measurements were conducted to support the GPR measurements for the accuracy assessment. GPR with dual frequencies (250 MHz and 700 MHz) was used in the survey measurements to obtain the dielectric permittivity. Three empirical equations (Roth's equation, Schaap's equation, and Idi's equation) were selected for the study and used to compute the soil water content from the dielectric permittivity of the GPR profile. The results indicate that Schaap's equation provides a strong correlation between SWC measured by the GPR data sets and the gravimetric measurements.
Keywords: common-offset measurements, ground penetrating radar, petrophysical relationship, soil water content
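For context, converting a GPR-derived dielectric permittivity to volumetric soil water content typically uses an empirical polynomial; the widely cited Topp et al. (1980) relation is sketched below as an illustration (it is not one of the three equations evaluated here):

```python
def swc_from_permittivity_topp(eps_r):
    """Volumetric soil water content (cm^3/cm^3) from relative dielectric
    permittivity, using the classic Topp et al. (1980) polynomial. Shown
    for illustration only; Roth's, Schaap's, and Idi's equations from the
    abstract have their own forms and coefficients."""
    return (-5.3e-2
            + 2.92e-2 * eps_r
            - 5.5e-4 * eps_r ** 2
            + 4.3e-6 * eps_r ** 3)
```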
Procedia PDF Downloads 252
797 Cross Project Software Fault Prediction at Design Phase
Authors: Pradeep Singh, Shrish Verma
Abstract:
Software fault prediction models are built from source code, metrics computed from the same or a previous version of the code, and the associated fault data. Some companies do not store and track all the artifacts required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects is one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and the related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can serve as an initial guideline for projects where no previous fault data are available. We analyze seven data sets from the NASA Metrics Data Program, which offer design as well as code metrics. Overall, the cross-project results are comparable to within-company learning.
Keywords: software metrics, fault prediction, cross project, within project
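A compact sketch of the cross-project setup with a Gaussian Naïve Bayes classifier, the technique named above; the metric values below are toy numbers, not NASA MDP data:

```python
import math

def fit_gnb(X, y):
    """Fit Gaussian Naive Bayes: per-class prior plus per-feature
    mean/variance. X is a list of feature vectors, y a list of 0/1 labels."""
    model = {}
    for cls in set(y):
        rows = [x for x, label in zip(X, y) if label == cls]
        n, d = len(rows), len(rows[0])
        means = [sum(r[j] for r in rows) / n for j in range(d)]
        variances = [sum((r[j] - means[j]) ** 2 for r in rows) / n + 1e-9
                     for j in range(d)]
        model[cls] = (len(rows) / len(X), means, variances)
    return model

def predict_gnb(model, x):
    """Return the class maximizing the Gaussian log posterior."""
    def log_post(cls):
        prior, means, variances = model[cls]
        ll = math.log(prior)
        for xi, mu, var in zip(x, means, variances):
            ll += -0.5 * math.log(2 * math.pi * var) - (xi - mu) ** 2 / (2 * var)
        return ll
    return max(model, key=log_post)

# Cross-project use: train on design metrics of project A, predict on project B.
X_train = [[1.0, 2.0], [1.2, 1.8], [8.0, 9.0], [7.5, 8.5]]
y_train = [0, 0, 1, 1]  # 0 = fault-free module, 1 = faulty module
model = fit_gnb(X_train, y_train)
```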
Procedia PDF Downloads 344
796 Proteomic Analysis of Cytoplasmic Antigen from Brucella canis to Characterize Immunogenic Proteins Responded with Naturally Infected Dogs
Authors: J. J. Lee, S. R. Sung, E. J. Yum, S. C. Kim, B. H. Hyun, M. Her, H. S. Lee
Abstract:
Canine brucellosis, caused mainly by Brucella canis, is a critical problem in dogs, leading to reproductive disease. Its symptoms are, nonetheless, not clear-cut, so the infection may go unnoticed in most cases. Serodiagnosis of canine brucellosis has not been standardized; moreover, it faces substantial difficulties due to broad cross-reactivity between the rough cell wall antigens of B. canis and heterospecific antibodies present in normal, uninfected dogs. Thus, this study was conducted to characterize the immunogenic proteins in the cytoplasmic antigen (CPAg) of B. canis that define the antigenic sensitivity of the humoral antibody response in B. canis-infected dogs. For the analysis of B. canis CPAg, we first extracted and purified the cytoplasmic proteins from cultured B. canis by hot-saline inactivation, ultrafiltration, sonication, and ultracentrifugation, step by step, according to the sonicated antigen extraction method. To characterize this antigen, we examined the identity and size range of each protein on SDS-PAGE and verified the immunogenic proteins reacting with antisera of B. canis-infected dogs. Selected immunodominant proteins were identified using MALDI-MS/MS. As a result, in an immunoproteomic assay, several polypeptides in CPAg on one- or two-dimensional electrophoresis (DE) reacted specifically to antisera from B. canis-infected dogs but not from non-infected dogs. Polypeptides of approximately 150, 80, 60, 52, 33, 26, 17, 15, 13, and 11 kDa on 1-DE were dominantly recognized by antisera from B. canis-infected dogs. In the immunoblot profiles on 2-DE, ten immunodominant proteins in CPAg were detected with antisera of infected dogs between pI 3.5-6.5 at approximately 35 to 10 kDa, without any nonspecific reaction with sera from non-infected dogs.
The ten immunodominant proteins identified by MALDI-MS/MS were superoxide dismutase, bacterioferritin, an amino acid ABC transporter substrate-binding protein, an extracellular solute-binding protein (family 3), transaldolase, a 26 kDa periplasmic immunogenic protein, a rhizopine-binding protein, enoyl-CoA hydratase, arginase, and type 1 glyceraldehyde-3-phosphate dehydrogenase. Most of these proteins have cytoplasmic or periplasmic localization and metabolic or transporter functions. Consequently, this study discovered and identified the prominent immunogenic proteins in B. canis CPAg, highlighting that these antigenic proteins may enable a specific serodiagnosis for canine brucellosis. Furthermore, we will evaluate these immunodominant proteins for application in advanced diagnostic methods with high specificity and accuracy.
Keywords: Brucella canis, canine brucellosis, cytoplasmic antigen, immunogenic proteins
Procedia PDF Downloads 147
795 A Linear Programming Approach to Assist Roster Construction Under a Salary Cap
Authors: Alex Contarino
Abstract:
Professional sports leagues often have a “free agency” period, during which teams may sign players with expiring contracts. To promote parity, many leagues operate under a salary cap that limits the amount teams can spend on players' salaries in a given year. Similarly, in fantasy sports leagues, salary cap drafts are a popular method for selecting players. In order to sign a free agent in either setting, teams must bid against one another to buy the player's services while ensuring the sum of their players' salaries stays below the salary cap. This paper models the bidding process for a free agent as a constrained optimization problem that can be solved using linear programming. The objective is to determine the largest bid that a team should offer the player, subject to the constraint that the value of signing the player must exceed the value of using the salary cap elsewhere. Iteratively solving this optimization problem for each available free agent provides teams with an effective framework for maximizing the talent on their rosters. The utility of this approach is demonstrated for team sport roster construction and fantasy sport drafts, using recent data sets from both settings.
Keywords: linear programming, optimization, roster management, salary cap
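A stripped-down sketch of the bidding ceiling idea (not the authors' exact linear program): treat the best alternative use of cap space as a constant value per dollar and bid only while the player's value covers that opportunity cost.

```python
def max_bid(player_value, cap_remaining, alt_value_per_dollar):
    """Largest bid worth offering a free agent, under a simplifying
    assumption: signing at bid b yields player_value and consumes b of cap
    space that could otherwise generate alt_value_per_dollar * b of value
    elsewhere. The bid is worthwhile while
        player_value >= alt_value_per_dollar * b,
    so the ceiling is player_value / alt_value_per_dollar, truncated by
    the remaining cap."""
    if alt_value_per_dollar <= 0:
        return cap_remaining
    return min(cap_remaining, player_value / alt_value_per_dollar)
```

A full roster-level version, with one bid variable per free agent and a cap constraint across the whole roster, would be posed as a linear program (e.g. via `scipy.optimize.linprog`).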
Procedia PDF Downloads 111
794 Organizational Climate being Knowledge Sharing Oriented: A Fuzzy-Set Analysis
Authors: Paulo Lopes Henriques, Carla Curado
Abstract:
According to the literature, knowledge sharing behaviors are influenced by organizational values and structures, namely organizational climate. The manuscript examines the antecedents of a knowledge sharing oriented organizational climate. Following theoretical expectations, the study adopts the following explanatory conditions: knowledge sharing costs, knowledge sharing incentives, perceptions of knowledge sharing contributing to performance, and tenure. The study compares results across two groups of firms: non-digital (firms without an intranet) vs. digital (firms with an intranet). The paper applies the fsQCA technique, using the fsQCA 2.5 software (www.fsqca.com) to test several conditional arguments explaining the outcome variable. The main results strengthen claims about the relevance of knowledge sharing's contribution to performance. Secondly, the evidence brings tenure – an explanatory condition associated with organizational memory – into the spotlight. The study provides an original contribution not previously addressed in the literature: it identifies the sufficient condition sets for a knowledge sharing oriented organizational climate using fsQCA, which is, to our knowledge, a novel application of the technique.
Keywords: fsQCA, knowledge sharing oriented organizational climate, knowledge sharing costs, knowledge sharing incentives
Procedia PDF Downloads 328
793 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data
Authors: S. Nickolas, Shobha K.
Abstract:
The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications and are an unavoidable problem in big data management and analysis. Numerous techniques, like discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary algorithms or optimized techniques, and hot deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling in missing values when it is necessary to use all records in the data rather than discard records with missing values. In this paper we propose a novel artificial neural network based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly. It carries out model-based clustering by using competitive learning and a self-steadying mechanism in a dynamic environment, without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing
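A much simpler stand-in for the ART2-based imputation described above, filling each missing entry from the most similar complete record (distances computed over the observed features only):

```python
def distance(a, b):
    """Euclidean distance over the features observed (non-None) in row a."""
    return sum((x - y) ** 2 for x, y in zip(a, b) if x is not None) ** 0.5

def impute_missing(rows):
    """Impute None entries from the closest complete row. This is a toy
    donor-based scheme, not the ART2 competitive-learning clustering of
    the abstract; it only illustrates similarity-driven imputation."""
    complete = [r for r in rows if None not in r]
    out = []
    for r in rows:
        if None not in r:
            out.append(list(r))
            continue
        donor = min(complete, key=lambda c: distance(r, c))
        out.append([d if v is None else v for v, d in zip(r, donor)])
    return out
```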
Procedia PDF Downloads 274
792 A Semiotic Approach to the Construction of Classical Identity in Indian Classical Music Videos
Authors: Jayakrishnan Narayanan, Sengamalam Periyasamy Dhanavel
Abstract:
Indian classical (Karnatik) music videos across various media platforms have followed an audio-visual pattern that conforms to the genre's socio-cultural and quasi-religious identity. The present paper analyzes the semiotic variations between ‘pure Karnatik music videos’ and ‘independent/contemporary-collaborative music videos’ posted on social media by young professional Karnatik musicians. The paper analyzes these media texts by comparing their structural sememes, namely the title, artists, music, narrative schemata, visuals, lighting, sound, and costumes. The paper argues that the pure Karnatik music videos are marked by the presence of certain recurring mythological or third-level signifiers, and that these signifiers and codes are conspicuously absent from the independent music videos produced by the same musicians. While the music and the musical instruments used in both sets of music videos remain similar, the meaning abducted by the beholder in each case is entirely different. The paper also studies the identity conflicts projected through these music videos and the extent to which the cultural connotations of Karnatik music govern the production of its music videos.
Keywords: abduction, identity, media semiotics, music video
Procedia PDF Downloads 222
791 A Genetic Algorithm for the Load Balance of Parallel Computational Fluid Dynamics Computation with Multi-Block Structured Mesh
Authors: Chunye Gong, Ming Tie, Jie Liu, Weimin Bao, Xinbiao Gan, Shengguo Li, Bo Yang, Xuguang Chen, Tiaojie Xiao, Yang Sun
Abstract:
Large-scale CFD simulation relies on high-performance parallel computing, and load balance is the key factor affecting parallel efficiency. This paper focuses on the load-balancing problem of parallel CFD simulation with structured mesh. A mathematical model for this load-balancing problem is presented. The genetic algorithm, fitness computation, and two-level coding are designed, along with an optimal selector, a robust operator, and a local optimization operator. The properties of the presented genetic algorithm are discussed in depth, and the effects of the optimal selector, robust operator, and local optimization operator are demonstrated by experiments. The experimental results on different test sets, DLR-F4, and aircraft design applications show that the presented load-balancing algorithm is robust, converges quickly, and is useful in real engineering problems.
Keywords: genetic algorithm, load-balancing algorithm, optimal variation, local optimization
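A toy genetic algorithm for the underlying assignment problem (mesh blocks to processors, minimizing the heaviest load); the paper's two-level coding, fitness function, and custom operators are not reproduced here:

```python
import random

def ga_balance(block_costs, n_procs, pop=30, gens=200, seed=0):
    """Assign mesh blocks to processors so the busiest processor is as
    light as possible. A generic GA sketch only: one-point crossover,
    random mutation, elitist survival of the better half."""
    rng = random.Random(seed)
    n = len(block_costs)

    def fitness(assign):  # lower is better: load of the busiest processor
        loads = [0.0] * n_procs
        for b, p in enumerate(assign):
            loads[p] += block_costs[b]
        return max(loads)

    population = [[rng.randrange(n_procs) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)
        survivors = population[: pop // 2]      # elitism: keep the best half
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < 0.3:              # mutation: move one block
                child[rng.randrange(n)] = rng.randrange(n_procs)
            children.append(child)
        population = survivors + children
    best = min(population, key=fitness)
    return best, fitness(best)
```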
Procedia PDF Downloads 185
790 Efficient Tuning Parameter Selection by Cross-Validated Score in High Dimensional Models
Authors: Yoonsuh Jung
Abstract:
As DNA microarray data contain relatively small sample sizes compared to the number of genes, high dimensional models are often employed. In high dimensional models, the selection of the tuning parameter (or penalty parameter) is often one of the crucial parts of the modeling. Cross-validation is one of the most common methods for tuning parameter selection; it selects the parameter value with the smallest cross-validated score. However, selecting a single value as the "optimal" value for the parameter can be very unstable due to sampling variation, since the sample sizes of microarray data are often small. Our approach is to choose multiple candidates for the tuning parameter first, then average the candidates with weights depending on their performance. The additional step of estimating the weights and averaging the candidates rarely increases the computational cost, while it can considerably improve on traditional cross-validation. We show, on real and simulated data sets, that the value selected by the suggested methods often leads to stable parameter selection as well as improved detection of significant genetic variables compared to traditional cross-validation.
Keywords: cross validation, parameter averaging, parameter selection, regularization parameter search
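One plausible reading of the averaging step, with weights inversely proportional to each candidate's cross-validated score (the paper's actual weighting scheme is not specified in the abstract):

```python
def averaged_tuning_parameter(candidates, cv_scores):
    """Average several candidate tuning parameters, weighting each by the
    inverse of its cross-validated score (lower score = better = heavier
    weight). The inverse-score weighting is an illustrative assumption."""
    weights = [1.0 / s for s in cv_scores]
    total = sum(weights)
    return sum(lam * w for lam, w in zip(candidates, weights)) / total
```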
Procedia PDF Downloads 415
789 Parameter Selection for Computationally Efficient Use of the BFVrns Fully Homomorphic Encryption Scheme
Authors: Cavidan Yakupoglu, Kurt Rohloff
Abstract:
In this study, we aim to provide a novel parameter selection model for the BFVrns scheme, one of the prominent FHE schemes. Parameter selection in lattice-based FHE schemes is a practical challenge for experts and non-experts alike. Towards a solution to this problem, we introduce a hybrid, principles-based approach that combines theoretical and experimental analyses. To begin, we use regression analysis to examine the effect of the parameters on performance and security. The fact that the FHE parameters induce different behaviors in performance, security, and Ciphertext Expansion Factor (CEF) makes the process of parameter selection more challenging. To address this issue, we use a multi-objective optimization algorithm to select the optimum parameter set for performance, CEF, and security simultaneously. As a result of this optimization, we obtain an improved parameter set with better performance at a given security level, ensuring correctness and security against lattice attacks with at least 128-bit security. Our result enables on average ~5x smaller CEF and mostly better performance in comparison to the parameter sets given in [1]. This approach can be considered semi-automated parameter selection. These studies are conducted using the PALISADE homomorphic encryption library, a well-known HE library.
Keywords: lattice cryptography, fully homomorphic encryption, parameter selection, LWE, RLWE
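The multi-objective selection step can be illustrated by a simple Pareto filter over candidate parameter sets scored on (runtime, CEF, negated security bits), all to be minimized; generating the candidates from the scheme's correctness and security constraints is not modeled here, and the tuples below are illustrative values only:

```python
def pareto_front(param_sets):
    """Keep parameter sets that are not dominated, where each set is a
    tuple of objectives with lower-is-better semantics, e.g.
    (runtime, cef, -security_bits). A set is dominated if another set is
    no worse on every objective and strictly better on at least one."""
    def dominated(a, b):  # True if b dominates a
        return (all(y <= x for x, y in zip(a, b))
                and any(y < x for x, y in zip(a, b)))
    return [p for p in param_sets
            if not any(dominated(p, q) for q in param_sets if q != p)]
```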
Procedia PDF Downloads 155
788 MapReduce Logistic Regression Algorithms with RHadoop
Authors: Byung Ho Jung, Dong Hoon Lim
Abstract:
Logistic regression is a statistical method for analyzing a dataset in which one or more independent variables determine an outcome. Logistic regression is used extensively in numerous disciplines, including the medical and social science fields. In this paper, we address the problem of estimating the parameters of logistic regression in the MapReduce framework with RHadoop, which integrates R with the Hadoop environment and is applicable to large-scale data. There are three learning algorithms for logistic regression: the gradient descent method, the cost minimization method, and the Newton-Raphson method. The Newton-Raphson method does not require a learning rate, while gradient descent and cost minimization need a manually chosen learning rate. The experimental results demonstrated that our learning algorithms using RHadoop scale well and efficiently process large data sets on commodity hardware. We also compared the performance of our Newton-Raphson method with the gradient descent and cost minimization methods. The results showed that the Newton-Raphson method was the most robust across all data tested.
Keywords: big data, logistic regression, MapReduce, RHadoop
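A single-machine sketch of the Newton-Raphson iteration for logistic regression with one predictor; the MapReduce/RHadoop distribution of the sufficient statistics is omitted, and the data below are toy values:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def logistic_newton(xs, ys, iters=25):
    """Fit y ~ sigmoid(b0 + b1*x) by Newton-Raphson. Each step solves the
    2x2 system H * delta = g by hand, where g is the gradient of the
    log-likelihood and H the (negated) Hessian. No learning rate needed,
    which is the point made in the abstract."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            p = sigmoid(b0 + b1 * x)
            w = p * (1 - p)
            g0 += y - p           # gradient components
            g1 += (y - p) * x
            h00 += w              # negated-Hessian components
            h01 += w * x
            h11 += w * x * x
        det = h00 * h11 - h01 * h01
        if abs(det) < 1e-12:
            break
        b0 += (h11 * g0 - h01 * g1) / det   # Newton step: beta += H^-1 g
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1
```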
Procedia PDF Downloads 284
787 A Model of Foam Density Prediction for Expanded Perlite Composites
Authors: M. Arifuzzaman, H. S. Kim
Abstract:
Multiple sets of variables associated with expanded perlite particle consolidation in foam manufacturing were analyzed to develop a model for predicting perlite foam density. The consolidation of perlite particles based on the flotation method and compaction involves numerous variables that determine the final perlite foam density. The variables include binder content, compaction ratio, perlite particle size, various perlite particle densities and porosities, and the volumes of perlite at different stages of the process. The developed model was found to be useful not only for the prediction of foam density but also for optimization between compaction ratio and binder content to achieve a desired density. Experimental verification was conducted using a range of foam densities (0.15–0.5 g/cm3) produced with a range of compaction ratios (1.5-3.5), a range of sodium silicate contents (0.05–0.35 g/ml) in dilution, a range of expanded perlite particle sizes (1-4 mm), and various perlite densities (skeletal, material, bulk, and envelope). A close agreement between predictions and experimental results was found.
Keywords: expanded perlite, flotation method, foam density, model, prediction, sodium silicate
Procedia PDF Downloads 408
786 Preparation of Silver and Silver-Gold, Universal and Repeatable, Surface Enhanced Raman Spectroscopy Platforms from SERSitive
Authors: Pawel Albrycht, Monika Ksiezopolska-Gocalska, Robert Holyst
Abstract:
Surface Enhanced Raman Spectroscopy (SERS) is a technique of growing importance, not only in purely scientific research related to analytical chemistry. It finds more and more applications in broadly understood testing - medical, forensic, pharmaceutical, food - and works well everywhere, on the one condition that the SERS substrates used give adequate enhancement, repeatability, and homogeneity of the SERS signal. This is a problem that has existed since the invention of the technique. Some laboratories use colloids with silver or gold nanoparticles as SERS amplifiers, others form rough silver or gold surfaces, but the results are generally either weak or unrepeatable. Furthermore, these structures are very often highly specific: they amplify the signal of only a small group of compounds, which means they work with the kinds of analytes used at the developer's laboratory, but when it comes to research on different compounds, completely new SERS substrates are required. That underlay our decision to develop universal substrates for SERS spectroscopy. Generally, each compound has a different affinity for silver and for gold, which have the best SERS properties, and this affinity determines the signal we get in the SERS spectrum. Our task was to create a platform that gives a characteristic 'fingerprint' of the largest possible number of compounds with very high repeatability, even at the expense of the enhancement factor (EF), since the possibility to reproduce research results is of the utmost importance. The SERS substrates specified above are offered by the SERSitive company. The applied method is based on cyclic potentiodynamic electrodeposition of silver or silver-gold nanoparticles on the conductive surface of ITO-coated glass at a controlled temperature of the reaction solution.
Silver nanoparticles are supplied in the form of silver nitrate (AgNO₃, 10 mM) and gold nanoparticles are derived from tetrachloroauric acid (10 mM), while sodium sulfite (Na₂SO₃, 5 mM) is used as the reductant. To limit and standardize the size of the SERS surface on which the nanoparticles are deposited, photolithography is used: we mask the desired ITO-coated glass surface and then etch away the unprotected ITO layer, which prevents nanoparticles from settling at those sites. On the prepared surface, we carry out the process described above, obtaining a SERS surface with nanoparticles of 50-400 nm. The SERSitive platforms present high sensitivity (EF = 10⁵-10⁶), homogeneity, and repeatability (70-80%).
Keywords: electrodeposition, nanoparticles, Raman spectroscopy, SERS, SERSitive, SERS platforms, SERS substrates
Procedia PDF Downloads 155
785 Implementation of a Web-Based Clinical Outcomes Monitoring and Reporting Platform across the Fortis Network
Authors: Narottam Puri, Bishnu Panigrahi, Narayan Pendse
Abstract:
Background: Clinical outcomes are the globally agreed upon, evidence-based, measurable changes in health or quality of life resulting from patient care. Reporting of outcomes and their continuous monitoring provides an opportunity for both assessing and improving the quality of patient care. In 2012, the International Consortium for Health Outcomes Measurement (ICHOM) was founded, which has defined global Standard Sets for measuring the outcomes of various treatments. Method: Monitoring of clinical outcomes was identified as a pillar of Fortis' core value of Patient Centricity. The project was started as an in-house Clinical Outcomes Reporting Portal developed by the Fortis Medical IT team, using the Standard Sets of outcome measurement developed by ICHOM. A pilot was run at Fortis Escorts Heart Institute from Aug '13 to Dec '13. Starting Jan '14, it was implemented across 11 hospitals of the group. The scope was hospital-wide and covered the major clinical specialties: cardiac sciences, and orthopedics and joint replacement. The internally developed portal had limitations in report generation, and the capture of patient-reported outcomes was restricted. A year later, the company provisioned an ICHOM-certified software product that could provide a platform for data capture and reporting to ensure compliance with all ICHOM requirements. A year after the launch of the software, Fortis Healthcare became the first healthcare provider in Asia to publish clinical outcomes data for the Coronary Artery Disease Standard Set (comprising Coronary Artery Bypass Graft and Percutaneous Coronary Intervention) in the public domain (Jan 2016). Results: This project has helped firmly establish a culture of monitoring and reporting clinical outcomes across Fortis hospitals.
Given the diverse nature of the healthcare delivery model of the Fortis network, which comprises hospitals of varying size and specialty mix and practically covers the entire span of the country, standardization of the data collection and reporting methodology is a huge achievement in itself. 95% case reporting was achieved, with more than 90% data completion, at the end of Phase 1 (March 2016). Post implementation, the group now has one year of data from its own hospitals. This has helped identify gaps, plan ways to bridge them, and establish internal benchmarks for continual improvement. The value created for the group also includes: 1. The entire Fortis community has been sensitized to the importance of clinical outcomes monitoring for patient-centric care; initial skepticism and cynicism were countered by effective stakeholder engagement and automation of processes. 2. Measuring quality is the first step in improving quality; data analysis has helped compare clinical results with best-in-class hospitals and identify improvement opportunities. 3. The clinical fraternity is extremely pleased to be part of this initiative and has taken ownership of the project. Conclusion: Fortis Healthcare is a pioneer in the monitoring of clinical outcomes. Implementation of the ICHOM standards has helped the Fortis Clinical Excellence Program improve patient engagement and strengthen its commitment to the core value of Patient Centricity. Validation and certification of the clinical outcomes data by an ICHOM-certified supplier adds confidence to its claim of being a leader in this space.
Keywords: clinical outcomes, healthcare delivery, patient centricity, ICHOM
Procedia PDF Downloads 236
784 Application of Support Vector Machines in Forecasting Non-Residential
Authors: Wiwat Kittinaraporn, Napat Harnpornchai, Sutja Boonyachut
Abstract:
This paper deals with the application of a novel machine learning technique, the so-called Support Vector Machine (SVM). The objective of this study is to explore the variables and parameters of forecasting factors in the construction industry in order to build a forecasting model for construction quantity in Thailand. The scope of the research is the non-residential construction quantity in Thailand. There are 44 sets of yearly data available, ranging from 1965 to 2009. The correlation between economic indicators and construction demand with a lag of one year was developed by Apichat Buakla. The selected variables are used to develop SVM models to forecast the non-residential construction quantity in Thailand. The parameters are selected using the ten-fold cross-validation method. The results are reported in terms of Mean Absolute Percentage Error (MAPE). The MAPE value for the non-residential construction quantity predicted by Epsilon-SVR with a Radial Basis Function (RBF) kernel is 5.90. Analysis of the experimental results shows that the support vector machine modelling technique can be applied to forecast construction-quantity time series, which is useful for decision planning and management purposes.
Keywords: forecasting, non-residential, construction, support vector machines
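The accuracy measure quoted above can be computed as follows (toy values below, not the study's forecasts):

```python
def mape(actual, predicted):
    """Mean Absolute Percentage Error, the accuracy measure used in the
    abstract: mean of |actual - predicted| / |actual|, expressed in %."""
    errors = [abs(a - p) / abs(a) for a, p in zip(actual, predicted)]
    return 100.0 * sum(errors) / len(errors)
```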
Procedia PDF Downloads 434
783 Comparison of Statistical Methods for Estimating Missing Precipitation Data in the River Subbasin Lenguazaque, Colombia
Authors: Miguel Cañon, Darwin Mena, Ivan Cabeza
Abstract:
In this work, we compared and evaluated the applicability of statistical methods for the estimation of missing precipitation data in the basin of the river Lenguazaque, located in the departments of Cundinamarca and Boyacá, Colombia. The methods used were simple linear regression, the distance rate method, local averages, mean rates, correlation with nearby stations, and the multiple regression method. The effectiveness of the methods was assessed using three statistical tools: the correlation coefficient (r²), the standard error of estimation, and the Bland-Altman test of agreement. The analysis was performed by removing real rainfall values at random in each of the seasons and then estimating them using the methodologies mentioned, to complete the missing data values. It was determined that the methods with the highest performance and accuracy, under the conditions considered, are the multiple regression method with three nearby stations and a random application scheme supported by the precipitation behavior of related data sets.
Keywords: statistical comparison, precipitation data, river subbasin, Bland and Altman
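The simplest of the listed methods, estimation by simple linear regression against a nearby reference station, can be sketched as follows (toy values, not Lenguazaque data):

```python
def fit_simple_regression(xs, ys):
    """Least-squares fit y = a + b*x, e.g. rainfall at a target station (ys)
    against simultaneous rainfall at a nearby reference station (xs)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def estimate_missing(a, b, x_reference):
    """Fill a gap at the target station from the reference observation."""
    return a + b * x_reference
```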
Procedia PDF Downloads 467
782 Wind Wave Modeling Using MIKE 21 SW Spectral Model
Authors: Pouya Molana, Zeinab Alimohammadi
Abstract:
Determining wind wave characteristics is essential for coastal and marine engineering projects, such as designing coastal and marine structures and estimating sediment transport and coastal erosion rates. In order to predict significant wave height (H_s), this study applies the third-generation spectral wave model MIKE 21 SW, along with the CEM model. For SW model calibration and verification, two data sets of meteorological and wave measurements are used. The model was forced with time-varying wind, and the results showed that the mean difference ratio, the standard deviation of the difference ratio, and the correlation coefficient of the SW model for the H_s parameter are 1.102, 0.279, and 0.983, respectively, whereas for the CEM method the same statistics are 0.869, 1.317, and 0.8359, respectively. Comparing these results reveals that the CEM method has larger errors than the MIKE 21 SW third-generation spectral wave model, and that a higher correlation coefficient does not necessarily mean higher accuracy.
Keywords: MIKE 21 SW, CEM method, significant wave height, difference ratio
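Under one plausible reading of "difference ratio" (modeled over observed, an assumption since the abstract does not define it), the three validation statistics used above can be computed as:

```python
def validation_stats(observed, modeled):
    """Return (mean difference ratio, its standard deviation, Pearson
    correlation) for paired observed/modeled values such as significant
    wave heights. The ratio definition modeled/observed is an assumption."""
    n = len(observed)
    ratios = [m / o for o, m in zip(observed, modeled)]
    mean_ratio = sum(ratios) / n
    std_ratio = (sum((r - mean_ratio) ** 2 for r in ratios) / n) ** 0.5
    mo, mm = sum(observed) / n, sum(modeled) / n
    cov = sum((o - mo) * (m - mm) for o, m in zip(observed, modeled))
    so = sum((o - mo) ** 2 for o in observed) ** 0.5
    sm = sum((m - mm) ** 2 for m in modeled) ** 0.5
    return mean_ratio, std_ratio, cov / (so * sm)
```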
Procedia PDF Downloads 402
781 Including All Citizens Pathway (IACP): Transforming Post-Secondary Education Using Inclusion and Accessibility as Foundation
Authors: Fiona Whittington-Walsh
Abstract:
Including All Citizens Pathway (IACP) is addressing the systems wide discrimination that students with disabilities experience throughout the education system. IACP offers a wide, institutional support structure so that all students, including students with intellectual/developmental disabilities, are included and can succeed. The entire process from admissions, course selection, course instruction, graduation is designed to address systemic discrimination while supporting learners and faculty. The inclusive and accessible pedagogical model that is the foundation of IACP opens the doors of post-secondary education by making existing academic courses environments where all students can participate and succeed. IACP is about transforming teaching, not modifying, or adapting the curriculum or essential knowledge and skill sets that are required learning outcomes. Universal Design for Learning (UDL) principles are applied to instructional teaching strategies such as lectures, presentations, and assessment tools. Created in 2016 as a research pilot, IACP is one of the first fully inclusive for credit post-secondary options available. The pilot received numerous external and internal grants to support its initiative to investigate and assess the teaching strategies and techniques that support student learning of essential knowledge and skill sets. IACP pilot goals included: (1) provide a successful pilot as a model of inclusive and accessible pedagogy; (2) create a teacher’s guide to assist other instructors in transforming their teaching to reach a wide range of learners; (3) identify policy barriers located within the educational system; and (4) provide leadership and encouraging innovative and inclusive pedagogical practices. The pilot was a success and in 2020 the first cohort of students graduated with an exit credential that pre-exists IACP and consists of ten academic courses. The University has committed to continue IACP and has developed a sustainable model. 
Each new academic year a new cohort of IACP students starts their post-secondary educational journey, while two additional instructors are mentored in the pedagogy. The pedagogical foundation of IACP has far-reaching potential, including, but not limited to, programs that offer services for international students whose first language is not English, as well as influencing pedagogical reform in secondary and post-secondary education. IACP also supports universities in satisfying educational standards that are or will be included in accessibility/disability legislation. This session will present information about IACP, share examples of systems transformation, hear from students and instructors, and provide participatory experiential activities that demonstrate the transformative techniques. We will draw from the experiences of a recent course that explored research documenting the lived experiences of students with disabilities in post-secondary institutes in B.C. (Whittington-Walsh). Students created theatrical scenes out of the data and presented them using the Forum Theatre method. Forum Theatre was used to create conversations, challenge stereotypes, and build connections between ableism, disability justice, Indigeneity, and social policy.
Keywords: disability justice, inclusive education, pedagogical transformation, systems transformation
Procedia PDF Downloads 87
780 Extraction of Urban Land Features from TM Landsat Image Using the Land Features Index and Tasseled Cap Transformation
Authors: R. Bouhennache, T. Bouden, A. A. Taleb, A. Chaddad
Abstract:
In this paper, we propose a method to map urban areas. The method uses an arithmetic calculation processed from land-feature indexes and the Tasseled Cap transformation (TC) of a multispectral Thematic Mapper (TM) Landsat image. For this purpose, the index images derived from the original image, namely SAVI (the soil-adjusted vegetation index), UI (the urban index), and EBBI (the enhanced built-up and bareness index), were stacked to form a new image whose bands were decorrelated. The Spectral Angle Mapper (SAM) and Spectral Information Divergence (SID) supervised classification approaches were first applied to the new TM image data using reference spectra from the spectral library, and the four land cover categories (urban, vegetation, water, and soil) were subsequently extracted, along with their accuracy assessment. The urban features were represented using a logic calculation applied to the brightness, UI-SAVI, NDBI-greenness, and EBBI-brightness data sets. The method was applied to Blida and showed that the urban features can be mapped with an accuracy ranging from 92% to 95%.
Keywords: EBBI, SAVI, Tasseled Cap Transformation, UI
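The abstract names the spectral indexes but not their formulas. As a hedged per-pixel sketch using the commonly published definitions (the TM band assignments, the SAVI soil factor L, and the UI/EBBI formulations are standard conventions assumed here, not taken from this study):

```python
import math

# Assumed TM band conventions: red = band 3, nir = band 4,
# swir1 = band 5, swir2 = band 7, tir = thermal band 6.

def savi(nir, red, L=0.5):
    """Soil-adjusted vegetation index; L = 0.5 is the usual soil factor."""
    return (nir - red) * (1.0 + L) / (nir + red + L)

def ui(swir2, nir):
    """Urban index (commonly cited formulation, assumed here)."""
    return (swir2 - nir) / (swir2 + nir)

def ndbi(swir1, nir):
    """Normalized difference built-up index."""
    return (swir1 - nir) / (swir1 + nir)

def ebbi(swir1, nir, tir):
    """Enhanced built-up and bareness index (As'syakur-style form, assumed)."""
    return (swir1 - nir) / (10.0 * math.sqrt(swir1 + tir))
```

In practice these would be evaluated over whole band arrays (e.g. with NumPy) before stacking the index images for the SAM/SID classification described above.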
Procedia PDF Downloads 482
779 Noticing Nature: Benefits for Connectedness to Nature and Wellbeing
Authors: Dawn Watling, Lorraine Lecourtois, Adnan Levent, Ryan Jeffries, Aysha Bellamy
Abstract:
Mental health diagnoses are on the rise for adolescents worldwide, with many unable to access support, and social prescribing of time in nature is increasingly used. There is a growing need to better understand the preventive benefits of spending time in nature. In this paper, we report findings from 599 seven- to 12-year-olds who completed two sets of questionnaires (before and after a walk in nature). Participants spent time in one of three different biodiverse habitats. The findings explore predictors (including age, sex, and mental health) of increases in connection to nature and well-being. Secondly, findings from 313 eighteen- to 87-year-olds who completed questionnaires and had their heart rate monitored, followed by a self-guided walk, will be discussed. These findings explore predictors (including age, sex, connectedness to nature, well-being, and heart rate as a proxy measure of stress) of increases in mood and feelings of restoration. The discussion will focus on the converging evidence for taking time to notice nature and the role of different environments in enhancing connection to nature, well-being, and positive mental health.
Keywords: nature, connectedness to nature, social prescribing, wellbeing
Procedia PDF Downloads 31
778 Genesis of Entrepreneur Business Models in New Ventures
Authors: Arash Najmaei, Jo Rhodes, Peter Lok, Zahra Sadeghinejad
Abstract:
In this article, we explore how a new business model comes into existence in the Australian cloud-computing eco-system. Findings from a multiple-case-study methodology reveal that, to develop a business model, new ventures adopt a three-phase approach. In the first phase, labelled business model ideation (BMID), various ideas for a viable business model are generated from both internal and external networks of the entrepreneurial team, and the most viable one is chosen. Strategic consensus and commitment are generated in the second phase, a business-modelling strategic action phase. We labelled this phase business model strategic commitment (BMSC) because, through the commitment and subsequent actions of executives, resources are pooled, coordinated, and allocated to the business model. Three complementary sets of resources shape the business model: managerial resources (MnRs), marketing resources (MRs), and technological resources (TRs). The third phase is the market-test phase, in which the business model is reified through the delivery of the intended value to customers and the conversion of revenue into profit. We labelled this phase business model actualization (BMAC). Theoretical and managerial implications of these findings will be discussed, and several directions for future research will be illuminated.
Keywords: entrepreneur business model, high-tech venture, resources, conversion of revenue
Procedia PDF Downloads 445
777 Using Maximization Entropy in Developing a Filipino Phonetically Balanced Wordlist for a Phoneme-Level Speech Recognition System
Authors: John Lorenzo Bautista, Yoon-Joong Kim
Abstract:
In this paper, a Filipino phonetically balanced word list consisting of 250 words (PBW250) was constructed for a phoneme-level ASR system for the Filipino language. Entropy maximization is used to obtain phonological balance in the list: the entropy of the phonemes in a word is maximized, providing an optimal balance in each word’s phonological distribution, using the Add-Delete Method (PBW algorithm), which is compared to a modified PBW algorithm implemented with a dynamic algorithm approach to obtain optimization. The PBW and modified algorithms attained entropy scores of 4.2791 and 4.2902, respectively. The PBW250 was recorded by 40 respondents, each contributing two sets of data. Recordings from 30 respondents were used to train an acoustic model that was tested on recordings from the remaining 10 respondents using the HMM Toolkit (HTK). The tests gave a maximum accuracy rate of 97.77% for a speaker-dependent test and 89.36% for a speaker-independent test.
Keywords: entropy maximization, Filipino language, Hidden Markov Model, phonetically balanced words, speech recognition
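The entropy criterion at the core of the add-delete selection can be sketched as follows (a minimal illustration, not the authors' exact algorithm: the phoneme representation of a word and the greedy add step shown here are assumptions):

```python
import math
from collections import Counter

def phoneme_entropy(words):
    """Shannon entropy (in bits) of the pooled phoneme distribution of a
    word list; each word is given as a sequence of phoneme symbols."""
    counts = Counter(p for w in words for p in w)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def greedy_add(candidates, selected, target_size):
    """One half of an add-delete style search: repeatedly add whichever
    candidate word most increases the pooled phoneme entropy."""
    selected = list(selected)
    pool = [w for w in candidates if w not in selected]
    while len(selected) < target_size and pool:
        best = max(pool, key=lambda w: phoneme_entropy(selected + [w]))
        selected.append(best)
        pool.remove(best)
    return selected
```

A full add-delete run would alternate this add step with a delete step that drops the word whose removal least reduces (or most increases) the pooled entropy, iterating until the 250-word list stabilizes.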
Procedia PDF Downloads 457
776 Uses and Manufacturing of Beech Corrugated Plywood
Authors: Prochazka Jiri, Beranek Tomas, Podlena Milan, Zeidler Ales
Abstract:
The poster deals with the issue of sheathing ISO shipping containers with corrugated plywood instead of traditional corrugated metal sheets. It was found that corrugated plywood is a suitable material for the sheathing due to its great flexural strength perpendicular to the course of the wave, sufficient impact resistance, surface compressive strength, and low weight. Three sample sets of different thicknesses (5, 8, and 10 mm) were tested in the experiments. The tests showed that 5 mm corrugated plywood is the most suitable thickness for sheathing. Experiments showed that, to increase the bending strength to the needed value, it was necessary to increase the weight of the timber by only 1.6%. The flat crush test showed that 5 mm corrugated plywood is a sufficient material for sheathing from a mechanical point of view. The angle of corrugation was found to be a very important factor that strongly affects the mechanical properties. The impact strength test showed that the plywood is a relatively tough material in the direction of corrugation. It was calculated that the use of corrugated plywood sheathing for the containers can reduce the weight of the walls by up to 75%. Corrugated plywood is also a suitable material for the webs of I-joists and for a wide range of interior design applications.
Keywords: corrugated plywood, veneer, beech plywood, ISO shipping container, I-joist
Procedia PDF Downloads 338
775 Delineation of the Geoelectric and Geovelocity Parameters in the Basement Complex of Northwestern Nigeria
Authors: M. D. Dogara, G. C. Afuwai, O. O. Esther, A. M. Dawai
Abstract:
The geology of Northern Nigeria is under intense investigation, particularly that of the northwest, which is believed to be part of the basement complex. The lithology is highly variable, hence the need for a close-range study. In view of the above, two geophysical techniques, vertical electrical sounding (VES) employing the Schlumberger array and the seismic refraction method, were used to delineate the geoelectric and geovelocity parameters of the basement complex of northwestern Nigeria. A total area of 400,000 m² was covered, with sixty geoelectric stations established and sixty sets of seismic refraction data collected using the forward and reverse method. The interpretation of the resistivity data suggests that the area is underlain by not more than five geoelectric layers of varying thicknesses and resistivities when a maximum half-electrode spread of 100 m was used. The interpreted seismic data revealed two geovelocity layers, with velocities ranging from 478 m/s to 1666 m/s for the first layer and from 1166 m/s to 7141 m/s for the second layer. The results of the two techniques suggest that the study area has an undulating bedrock topography, with geoelectric and geovelocity layers composed of weathered rock materials.
Keywords: basement complex, delineation, geoelectric, geovelocity, Nigeria
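The two data reductions behind these interpretations can be sketched with the textbook formulas (a hedged illustration only; the survey's actual inversion software and field parameters are not given in the abstract):

```python
import math

def schlumberger_rho_a(L, l, dV, I):
    """Apparent resistivity (ohm-m) for a Schlumberger array:
    L = AB/2 current half-spacing (m), l = MN/2 potential half-spacing (m),
    dV = measured potential difference (V), I = injected current (A)."""
    k = math.pi * (L**2 - l**2) / (2.0 * l)  # geometric factor
    return k * dV / I

def refractor_depth(v1, v2, t_i):
    """Depth (m) to a horizontal refractor in the two-layer case:
    v1, v2 = first- and second-layer velocities (m/s), t_i = intercept
    time (s) read from the refracted branch of the travel-time plot."""
    return t_i * v1 * v2 / (2.0 * math.sqrt(v2**2 - v1**2))
```

For example, with v1 = 478 m/s and v2 = 1666 m/s (the extremes reported above), each 10 ms of intercept time corresponds to roughly 2.5 m of overburden.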
Procedia PDF Downloads 151
774 Open Source, Open Hardware Ground Truth for Visual Odometry and Simultaneous Localization and Mapping Applications
Authors: Janusz Bedkowski, Grzegorz Kisala, Michal Wlasiuk, Piotr Pokorski
Abstract:
Ground-truth data is essential for the quantitative evaluation of VO (Visual Odometry) and SLAM (Simultaneous Localization and Mapping) using, e.g., ATE (Absolute Trajectory Error) and RPE (Relative Pose Error). Many open-access data sets provide raw and ground-truth data for benchmark purposes. The issue appears when one would like to validate Visual Odometry and/or SLAM approaches on data captured using the device the algorithm is targeted for, for example a mobile phone, and disseminate the data to other researchers. For this reason, we propose an open source, open hardware ground-truth system that provides an accurate and precise trajectory with a 3D point cloud. It is based on the Livox Mid-360 LiDAR with a non-repetitive scanning pattern, an on-board Raspberry Pi 4B computer, a battery, and software for off-line calculations (camera-to-LiDAR calibration, LiDAR odometry, SLAM, georeferencing). We show how this system can be used for the evaluation of various state-of-the-art algorithms (Stella SLAM, ORB SLAM3, DSO) in typical indoor monocular VO/SLAM.
Keywords: SLAM, ground truth, navigation, LiDAR, visual odometry, mapping
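The ATE and RPE metrics named above reduce to simple error statistics once the estimated trajectory has been time-associated with the ground truth. A hedged sketch on translation only (full definitions operate on SE(3) poses and apply a rigid alignment, e.g. Umeyama, first):

```python
import math

def ate_rmse(est, gt):
    """Absolute Trajectory Error (RMSE): est and gt are equal-length lists
    of (x, y, z) positions, assumed already time-associated and rigidly
    aligned with the ground truth."""
    assert len(est) == len(gt) and est
    sq = [sum((a - b) ** 2 for a, b in zip(p, q)) for p, q in zip(est, gt)]
    return math.sqrt(sum(sq) / len(sq))

def rpe_trans(est, gt, delta=1):
    """Translational Relative Pose Error (RMSE) over a fixed frame offset
    delta, using position increments as a simplified stand-in for full
    relative SE(3) poses."""
    errs = []
    for i in range(len(est) - delta):
        d_est = [est[i + delta][k] - est[i][k] for k in range(3)]
        d_gt = [gt[i + delta][k] - gt[i][k] for k in range(3)]
        errs.append(math.sqrt(sum((a - b) ** 2 for a, b in zip(d_est, d_gt))))
    return math.sqrt(sum(e * e for e in errs) / len(errs))
```

ATE measures global trajectory consistency, while RPE isolates local drift, which is why benchmarks report both.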
Procedia PDF Downloads 69
773 The Use of Artificial Intelligence to Curb Corruption in Brazil
Authors: Camila Penido Gomes
Abstract:
Over the past decade, an emerging body of research has been pointing to artificial intelligence's great potential to improve the use of open data, increase transparency, and curb corruption in the public sector. Nonetheless, studies on this subject are scant and usually lack evidence to validate the effectiveness of AI-based technologies in addressing corruption, especially in developing countries. Aiming to fill this void in the literature, this paper sets out to examine how AI has been deployed by civil society to improve the use of open data and prevent congresspeople from misusing public resources in Brazil. Building on current debates and carrying out a systematic literature review and extensive document analyses, this research reveals that AI should not be deployed as a silver bullet to fight corruption. Instead, this technology is more powerful when adopted by a multidisciplinary team as a civic tool in conjunction with other strategies. This study makes considerable contributions, bringing to the forefront of the discussion a more accurate understanding of the factors that play a decisive role in the successful implementation of AI-based technologies in anti-corruption efforts.
Keywords: artificial intelligence, civil society organization, corruption, open data, transparency
Procedia PDF Downloads 205
772 A Model of Knowledge Management Culture Change
Authors: Reza Davoodi, Hamid Abbasi, Heidar Norouzi, Gholamabbas Alipourian
Abstract:
A dynamic model shaping a process of knowledge management (KM) culture change is suggested. It is aimed at providing effective KM of employees for obtaining desired results in an organization. The essential requirements for obtaining KM culture change are determined, and the proposed model realizes these requirements. The dynamics of the model are expressed by a change of its parameters, adjusted to the dynamic process of KM culture change. Building the model includes the elaboration and integration of interconnected components. The “Result” component is central to the model: it determines a desired organizational goal and possible directions of its attainment. The “Confront” component engenders constructive confrontation in an organization; through it, employees are prompted toward KM culture change with the purpose of attaining the desired result. The “Assess” component performs complex assessments of employee proposals by management and peers; the proposals are directed toward attaining the desired result in the organization. The “Reward” component sets the order of assigning rewards to employees based on the assessments of their proposals.
Keywords: knowledge management, organizational culture change, employee, result
Procedia PDF Downloads 407
771 Evaluation of Random Forest and Support Vector Machine Classification Performance for the Prediction of Early Multiple Sclerosis from Resting State FMRI Connectivity Data
Authors: V. Saccà, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone
Abstract:
The aim of this work was to evaluate how well the Random Forest (RF) and Support Vector Machine (SVM) algorithms could support the early diagnosis of Multiple Sclerosis (MS) from resting-state functional connectivity data. In particular, we wanted to explore the ability to distinguish between controls and patients using the mean signals extracted from ICA components corresponding to 15 well-known networks. Eighteen patients with early MS (mean age 37.42±8.11, 9 females) were recruited according to the McDonald and Polman criteria and matched for demographic variables with 19 healthy controls (mean age 37.55±14.76, 10 females). MRI was acquired on a 3T scanner with an 8-channel head coil: (a) whole-brain T1-weighted; (b) conventional T2-weighted; (c) resting-state functional MRI (rsFMRI), 200 volumes. The estimated total lesion load (ml) and number of lesions were calculated using the LST toolbox from the corrected T1 and FLAIR images. All rsFMRI data were pre-processed using tools from the FMRIB Software Library as follows: (1) discarding of the first 5 volumes to remove T1 equilibrium effects, (2) skull-stripping of images, (3) motion and slice-time correction, (4) denoising with a high-pass temporal filter (128 s), (5) spatial smoothing with a Gaussian kernel of FWHM 8 mm. No statistically significant differences (t-test, p < 0.05) were found between the two groups in the mean Euclidean distance and the mean Euler angle. The WM and CSF signals, together with 6 motion parameters, were regressed out from the time series. We applied an independent component analysis (ICA) with the GIFT toolbox, using the Infomax approach with the number of components set to 21. Fifteen mean components were visually identified by two experts. The resulting z-score maps were thresholded and binarized to extract the mean signal of the 15 networks for each subject. Statistical and machine learning analyses were then conducted on this dataset, composed of 37 rows (subjects) and 15 features (mean signal in each network), with the R language.
The dataset was randomly split into training (75%) and test sets, and two different classifiers were trained: RF and RBF-SVM. We used the intrinsic feature selection of RF, based on the Gini index, and recursive feature elimination (rfe) for the SVM, to obtain a ranking of the most predictive variables. We then built two new classifiers using only the most important features and evaluated the accuracies (with and without feature selection) on the test set. The classifiers trained on all the features showed very poor accuracies on the training (RF: 58.62%, SVM: 65.52%) and test sets (RF: 62.5%, SVM: 50%). Interestingly, when feature selection by RF and rfe-SVM was performed, the most important variable was the sensori-motor network I in both cases. Indeed, with only this network, the RF and SVM classifiers reached an accuracy of 87.5% on the test set. More interestingly, the only misclassified patient turned out to have the lowest lesion volume. We showed that, with two different classification algorithms and feature selection approaches, the best discriminant network between controls and early MS was the sensori-motor I. Similar importance values were obtained for the sensori-motor II, cerebellum, and working memory networks. These findings, in accordance with the early manifestation of motor/sensorial deficits in MS, could represent an encouraging step toward translation to clinical diagnosis and prognosis.
Keywords: feature selection, machine learning, multiple sclerosis, random forest, support vector machine
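The Gini criterion behind RF's intrinsic feature ranking can be illustrated with a minimal single-split sketch (an illustration only; the study itself used R, where e.g. randomForest's importance measures and caret-style rfe for the SVM would be the standard tools, and an actual forest accumulates this decrease over many trees and bootstrap samples):

```python
def gini(labels):
    """Gini impurity 1 - sum(p_c^2) of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_gini_decrease(feature, labels):
    """Largest impurity decrease over all threshold splits on one feature;
    Random Forests accumulate this quantity across trees to rank features."""
    n = len(labels)
    base = gini(labels)
    best = 0.0
    for t in sorted(set(feature))[:-1]:  # last value gives an empty right side
        left = [y for x, y in zip(feature, labels) if x <= t]
        right = [y for x, y in zip(feature, labels) if x > t]
        child = (len(left) * gini(left) + len(right) * gini(right)) / n
        best = max(best, base - child)
    return best
```

Ranking the 15 network features by this decrease and retraining on the top ones mirrors the "feature selection, then re-evaluate on the test set" protocol described above.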
Procedia PDF Downloads 240770 Seismic Behavior of Steel Moment-Resisting Frames for Uplift Permitted in Near-Fault Regions
Authors: M. Tehranizadeh, E. Shoushtari Rezvani
Abstract:
The seismic performance of steel moment-resisting frame structures is investigated considering nonlinear soil-structure interaction (SSI) effects. 10-, 15-, and 20-story planar building frames with an aspect ratio of 3 are designed in accordance with current building codes. Inelastic seismic demands of the superstructure are modelled using a concentrated plasticity model. The raft foundation system is designed for different soil types. A beam-on-nonlinear-Winkler-foundation (BNWF) model is used to represent the dynamic impedance of the underlying soil. Two sets of near-fault earthquakes, pulse-like and non-pulse, are used as input ground motions. The results show that the reduction in drift demands due to nonlinear SSI is characterized by a more uniform distribution pattern along the height when compared to the fixed-base and linear SSI conditions. It is also concluded that the beneficial effects of nonlinear SSI on displacement demands are more significant in the case of pulse-like ground motions, and the performance level of the steel moment-resisting frames can be enhanced.
Keywords: soil-structure interaction, uplifting, soil plasticity, near-fault earthquake, tall building
Procedia PDF Downloads 549