Search results for: fundamental models
6392 Characterising Stable Model by Extended Labelled Dependency Graph
Authors: Asraful Islam
Abstract:
Extended dependency graph (EDG) is a state-of-the-art isomorphic graph to represent normal logic programs (NLPs) that can characterize the consistency of NLPs by graph analysis. To construct the vertices and arcs of an EDG, additional renaming atoms and rules besides those the given program provides are used, resulting in higher space complexity compared to the corresponding traditional dependency graph (TDG). In this article, we propose an extended labelled dependency graph (ELDG) to represent an NLP that shares an equal number of nodes and arcs with the TDG, and prove that it is isomorphic to the domain program. The numbers of nodes and arcs used in the underlying dependency graphs are formulated to compare the space complexity. Results show that ELDG uses less memory to store nodes, arcs, and cycles compared to EDG. To exhibit the desirability of ELDG, firstly, the stable models of the kernel form of an NLP are characterized by the admissible coloring of the ELDG; secondly, a relation of the stable models of a kernel program with the handles of the minimal, odd cycles appearing in the corresponding ELDG has been established; thirdly, to the best of our knowledge, for the first time an inverse transformation from a dependency graph to the represented NLP w.r.t. ELDG has been defined, which enables transferring analytical results from the graph to the program straightforwardly.
Keywords: normal logic program, isomorphism of graph, extended labelled dependency graph, inverse graph transformation, graph colouring
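To make the construction concrete, the sketch below builds a small labelled dependency graph for a normal logic program, with one node per atom and arcs labelled '+' or '-' for positive and negative dependencies. It is an illustrative reading of a labelled dependency graph in general, not the authors' ELDG definition, and all names are hypothetical.

```python
# Hypothetical sketch: build a labelled dependency graph for a normal logic program.
# A rule is (head, positive_body, negative_body); arcs are labelled '+' or '-'.
from collections import defaultdict

def labelled_dependency_graph(rules):
    """Return the atom set and {(body_atom, head): set(labels)} with one node per atom."""
    arcs = defaultdict(set)
    atoms = set()
    for head, pos_body, neg_body in rules:
        atoms.add(head)
        for a in pos_body:
            atoms.add(a)
            arcs[(a, head)].add('+')   # positive dependency
        for a in neg_body:
            atoms.add(a)
            arcs[(a, head)].add('-')   # negative dependency (through "not a")
    return atoms, dict(arcs)

# Example program: p :- not q.   q :- not p.   (two stable models: {p} and {q})
rules = [("p", [], ["q"]), ("q", [], ["p"])]
atoms, arcs = labelled_dependency_graph(rules)
print(len(atoms), "nodes,", len(arcs), "labelled arcs")  # 2 nodes, 2 arcs
```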
Procedia PDF Downloads 212
6391 Studying the Impact of Soil Characteristics in Displacement of Retaining Walls Using Finite Element
Authors: Mojtaba Ahmadabadi, Akbar Masoudi, Morteza Rezai
Abstract:
In this paper, using the finite element method, the effect of soil and wall characteristics was investigated. Thirty-two different models were studied with different parameters. These studies could calculate the displacement at any height of the wall for frictional-cohesive soils. The main purpose of this research is to determine the most effective soil characteristics in reducing the wall displacement. Comparing different models showed that an overall increase in the internal friction angle, the angle of friction between soil and wall, and the modulus of elasticity reduces the displacement of the wall. In addition, an increase in the specific weight of the soil will increase the wall displacement. Based on the results, it can be said that all wall displacements were of the overturning type, and the backfill soil was bulging. Results show that the internal friction angle and the angle of friction between soil and wall have the highest impact on reducing the wall displacement. One of the advantages of this study is taking into account all the parameters of the soil and wall as well as the displacement distribution in the wall and backfill soil. In this paper, using the finite element method and considering all parameters of the soil, we investigated the impact of the soil parameters on wall displacement. The aim of this study is to identify the conditions that best reduce the wall displacement and the displacement distribution in the wall and soil.
Keywords: retaining wall, FEM, soil and wall interaction, angle of internal friction of the soil, wall displacement
Procedia PDF Downloads 387
6390 Predictive Analysis of the Stock Price Market Trends with Deep Learning
Authors: Suraj Mehrotra
Abstract:
The stock market is a volatile, bustling marketplace that is a cornerstone of economics. It defines whether companies are successful or in a downward spiral. A thorough understanding of it is important - many companies have whole divisions dedicated to analysis of both their stock and of rivaling companies. Linking the world of finance and artificial intelligence (AI), especially the stock market, has been a relatively recent development. Predicting how stocks will do considering all external factors and previous data has always been a human task. With the help of AI, however, machine learning models can help us make more complete predictions of financial trends. Looking at the stock market specifically, predicting the open, closing, high, and low prices for the next day is very hard to do. Machine learning makes this task a lot easier. A model that builds upon itself and takes in external factors as weights can predict trends far into the future. When used effectively, new doors can be opened up in the business and finance world, and companies can make better and more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural network-based approaches, among other methods. It provides a detailed analysis of the techniques and also explores the challenges in predictive analysis. Regarding accuracy on the testing set, looking at four different models - linear regression, neural network, decision tree, and naïve Bayes - on different stocks (Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, J.P. Morgan Chase, and Johnson & Johnson), the naïve Bayes and linear regression models worked best. For the testing set, the naïve Bayes model had the highest accuracy along with the linear regression model, followed by the neural network model and then the decision tree model. The training set had similar results, except that the decision tree model was perfect, with complete accuracy in its predictions, which makes sense. This means that the decision tree model likely overfitted the training set, which hurt its performance on the testing set.
Keywords: machine learning, testing set, artificial intelligence, stock analysis
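A minimal sketch of the kind of model comparison described above, using scikit-learn. The data here are synthetic stand-ins (the study's actual stock data and features are not reproduced), linear regression is represented by its classification counterpart, logistic regression, and the next-day up/down label is an assumed target.

```python
# Hypothetical sketch (not the paper's code): compare the four model families
# from the abstract on a next-day "up/down" direction label built from OHLC-like data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                                 # stand-in open/high/low/close features
y = (X[:, 3] + 0.1 * rng.normal(size=500) > 0).astype(int)    # stand-in direction label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
models = {
    "linear (logistic) regression": LogisticRegression(max_iter=1000),
    "neural network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "naive Bayes": GaussianNB(),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {accuracy_score(y_te, model.predict(X_te)):.3f}")
```

An unconstrained decision tree fits the training set perfectly by construction, which is consistent with the overfitting behaviour the abstract reports.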
Procedia PDF Downloads 95
6389 Creating Energy Sustainability in an Enterprise
Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala
Abstract:
As we enter the new era of Artificial Intelligence (AI) and Cloud Computing, we mostly rely on the Machine Learning and Natural Language Processing capabilities of AI, and on energy-efficient hardware and software devices, in almost every industry sector. In these industry sectors, much emphasis is on developing new and innovative methods for producing and conserving energy and curbing the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, which are also informally referred to as the 3 P's (People, Planet and Profits). The 3 P's play a vital role in creating a core Sustainability Model in the Enterprise. Natural resources are continually being depleted, so there is more focus and growing demand for renewable energy. With this growing demand, there is also a growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In our paper, we discuss driving forces such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models, and emerging social media and AI platforms that influence the three main pillars of sustainability (3 P's). Through this paper, we bring an overall perspective on enterprise strategies, with a primary focus on the cultural shifts involved in adopting energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing, and application modernization), with the vision for a sustainable environment.
Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure
Procedia PDF Downloads 111
6388 End-to-End Spanish-English Sequence Learning Translation Model
Authors: Vidhu Mitha Goutham, Ruma Mukherjee
Abstract:
The low availability of well-trained, unlimited, dynamic-access models for specific languages makes it hard for corporate users to adopt quick translation techniques and incorporate them into product solutions. As translation tasks increasingly require a dynamic sequence learning curve, stable, cost-free open-source models are scarce. We survey and compare current translation techniques and propose a modified sequence-to-sequence model repurposed with attention techniques. Sequence learning using an encoder-decoder model is now paving the path for higher precision levels in translation. Using a Convolutional Neural Network (CNN) encoder and a Recurrent Neural Network (RNN) decoder background, we use Fairseq tools to produce an end-to-end, bilingually trained Spanish-English machine translation model, including source language detection. We acquire competitive results using a duo-lingo-corpus-trained model, providing prospective, ready-made plug-in use for compound sentences and document translations. Our model serves as a decent system for large, organizational data translation needs. While acknowledging its shortcomings and future scope, it also identifies itself as a well-optimized deep neural network model and solution.
Keywords: attention, encoder-decoder, Fairseq, Seq2Seq, Spanish, translation
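For orientation, the sketch below shows the general shape of the architecture named above (a CNN encoder feeding an RNN decoder with dot-product attention) in plain PyTorch. It is an illustrative toy, not the authors' Fairseq configuration; vocabulary sizes, dimensions and layer choices are assumptions.

```python
# Minimal PyTorch sketch of a CNN encoder + GRU decoder with dot-product attention.
# All sizes and names are illustrative only.
import torch
import torch.nn as nn

class CNNEncoder(nn.Module):
    def __init__(self, vocab, emb=128, hid=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.conv = nn.Conv1d(emb, hid, kernel_size=3, padding=1)

    def forward(self, src):                                   # src: (batch, src_len)
        x = self.embed(src).transpose(1, 2)                   # (batch, emb, src_len)
        return torch.relu(self.conv(x)).transpose(1, 2)       # (batch, src_len, hid)

class AttnGRUDecoder(nn.Module):
    def __init__(self, vocab, emb=128, hid=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.gru = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(2 * hid, vocab)

    def forward(self, tgt, enc_out):                          # tgt: (batch, tgt_len)
        dec_out, _ = self.gru(self.embed(tgt))                # (batch, tgt_len, hid)
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))  # dot-product attention
        context = torch.bmm(torch.softmax(scores, dim=-1), enc_out)
        return self.out(torch.cat([dec_out, context], dim=-1))  # (batch, tgt_len, vocab)

src = torch.randint(0, 1000, (2, 12))   # toy Spanish token ids
tgt = torch.randint(0, 1000, (2, 10))   # toy English token ids
enc, dec = CNNEncoder(1000), AttnGRUDecoder(1000)
print(dec(tgt, enc(src)).shape)          # torch.Size([2, 10, 1000])
```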
Procedia PDF Downloads 175
6387 Synthesis and Evaluation of Photovoltaic Properties of an Organic Dye for Dye-Sensitized Solar Cells
Authors: M. Hosseinnejad, K. Gharanjig
Abstract:
In the present study, metal-free organic dyes were prepared and used as photo-sensitizers in dye-sensitized solar cells. Double rhodanine was utilized as the fundamental electron acceptor group, to which an electron-donor aldehyde with varying substituents was attached to produce a new organic dye. This dye was first purified and then characterized by analytical techniques. Spectrophotometric evaluations of the prepared dye in solution and on a nano anatase TiO2 substrate were carried out in order to assess possible changes in the status of the dye in different environments. The results show that the dye forms J-type aggregates on the nano TiO2. Additionally, oxidation potential measurements were also carried out. Finally, a dye-sensitized solar cell based on the synthesized dye was fabricated in order to determine the photovoltaic behavior and conversion efficiency of the individual dye.
Keywords: conversion efficiency, dye-sensitized solar cell, photovoltaic behavior, sensitizer
Procedia PDF Downloads 183
6386 Corporate Governance of State-Owned Enterprises: A Comparative Analysis
Authors: Adeyemi Adebayo, Barry Ackers
Abstract:
This paper comparatively analyses the corporate governance of SOEs in South Africa and Singapore in the context of the World Bank’s framework for corporate governance of SOEs. This framework ensured that the analysis holistically covered key aspects of corporate governance of SOEs in these states. In order to ground our understanding of the paths taken by SOEs in the two states, the paper presents the evolution and reforms of SOEs in the states before analyzing key aspects of their corporate governance. The analysis shows that even though SOEs in South Africa and Singapore are comparable in a number of ways, there are notable differences. In this context, this paper finds that the main difference between corporate governance of SOEs in South Africa and Singapore is their organizing model. Further, the analysis, among other findings, shows that SOE boards in Singapore are better remunerated. A further finding reveals that, even though some board members are politically connected, Singaporean SOE boards are better constituted based on skills and experience compared to SOE boards in South Africa. Overall, the analysis opens up new debates and as such concludes by providing avenues for further research.
Keywords: corporate governance, comparative corporate governance, corporate governance framework, government business enterprises, government linked companies, organizing models, ownership models, state-owned companies, state-owned enterprises
Procedia PDF Downloads 219
6385 Evaluating Radiative Feedback Mechanisms in Coastal West Africa Using Regional Climate Models
Authors: Akinnubi Rufus Temidayo
Abstract:
Coastal West Africa is highly sensitive to climate variability, driven by complex ocean-atmosphere interactions that shape temperature, precipitation, and extreme weather. Radiative feedback mechanisms—such as water vapor feedback, cloud-radiation interactions, and surface albedo—play a critical role in modulating these patterns. Yet, limited research addresses these feedbacks in climate models specific to West Africa’s coastal zones, creating challenges for accurate climate projections and adaptive planning. This study aims to evaluate the influence of radiative feedbacks on the coastal climate of West Africa by quantifying the effects of water vapor, cloud cover, and sea surface temperature (SST) on the region’s radiative balance. The study uses a regional climate model (RCM) to simulate feedbacks over a 20-year period (2005-2025) with high-resolution data from CORDEX and satellite observations. Key mechanisms investigated include (1) Water Vapor Feedback—the amplifying effect of humidity on warming, (2) Cloud-Radiation Interactions—the impact of cloud cover on the radiation balance, especially during the West African Monsoon, and (3) Surface Albedo and Land-Use Changes—effects of urbanization and vegetation on the radiation budget. Preliminary results indicate that radiative feedbacks strongly influence seasonal climate variability in coastal West Africa. Water vapor feedback amplifies dry-season warming, cloud-radiation interactions moderate surface temperatures during monsoon seasons, and SST variations in the Atlantic affect the frequency and intensity of extreme rainfall events. The findings suggest that incorporating these feedbacks into climate planning can strengthen resilience to climate impacts in West African coastal communities. Further research should refine regional models to capture anthropogenic influences like greenhouse gas emissions, guiding sustainable urban and resource planning to mitigate climate risks.
Keywords: West Africa, radiative, climate, resilience, anthropogenic
Procedia PDF Downloads 9
6384 Error Amount in Viscoelasticity Analysis Depending on Time Step Size and Method used in ANSYS
Authors: A. Fettahoglu
Abstract:
The theory of viscoelasticity is used by many researchers to represent the behavior of many materials, such as pavements on roads or bridges. Several studies used analytical methods and rheology to predict the material behavior of simple models. Today, more complex engineering structures are analyzed using the Finite Element Method, in which material behavior is embedded by means of three-dimensional viscoelastic material laws. As a result, structures of irregular geometry and domain, like the pavements of bridges, can be analyzed by means of the Finite Element Method and three-dimensional viscoelastic equations. In the scope of this study, the rheological models embedded in ANSYS, namely generalized Maxwell elements and Prony series, which are the two methods used by ANSYS to represent viscoelastic material behavior, are presented explicitly. Subsequently, a practical problem, which has an analytical solution given in the literature, is used to verify the applicability of the viscoelasticity tool embedded in ANSYS. Finally, the amount of error in the ANSYS results is compared with the analytical results to indicate the influence of the method used and the time step size.
Keywords: generalized Maxwell model, finite element method, Prony series, time step size, viscoelasticity
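For reference, a Prony series represents the relaxation modulus of a generalized Maxwell model as a sum of decaying exponentials; the generic form is given below (the exact normalisation and parameter conventions used inside ANSYS may differ slightly).

```latex
% Generalized Maxwell (Prony series) relaxation modulus, generic form
G(t) \;=\; G_\infty + \sum_{i=1}^{n} G_i \, e^{-t/\tau_i},
\qquad
G_0 \;=\; G_\infty + \sum_{i=1}^{n} G_i ,
\qquad
\alpha_i \;=\; \frac{G_i}{G_0}
```

Here G_i is the stiffness and tau_i the relaxation time of the i-th Maxwell branch, and the relative moduli alpha_i together with tau_i form the Prony pairs typically supplied as input.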
Procedia PDF Downloads 369
6383 Virtual Modelling of Turbulent Fibre Flow in a Low Consistency Refiner for a Sustainable and Energy Efficient Process
Authors: Simon Ingelsten, Anton Lundberg, Vijay Shankar, Lars-Olof Landström, Örjan Johansson
Abstract:
The flow in a low consistency disc refiner is simulated with the aim of identifying flow structures that may be of importance for a future study to optimise the energy efficiency of refining processes. A simplified flow geometry is used, where a single groove of a refiner disc is modelled. Two different fibre models are used to simulate turbulent fibre suspension flow in the groove. The first model is a Bingham viscoplastic fluid model, where the fibre suspension is treated as a non-Newtonian fluid with a yield stress. The second model is a new model proposed in a recent study, where the suspended fibres' effect on the flow is accounted for through a modelled orientation distribution function (ODF). Both models yielded similar results with small differences. Certain flow characteristics that were expected and that were found in the literature were identified. Some of these flow characteristics may be of importance in a future process to optimise the refiner geometry to increase the energy efficiency. Further study and a more detailed flow model are, however, needed in order for the simulations to yield results valid for quantitative use in such an optimisation study. An outline of the next steps in such a study is proposed.
Keywords: disc refiner, fibre flow, sustainability, turbulence modelling
Procedia PDF Downloads 406
6382 Shedding Light on the Black Box: Explaining Deep Neural Network Prediction of Clinical Outcome
Authors: Yijun Shao, Yan Cheng, Rashmee U. Shah, Charlene R. Weir, Bruce E. Bray, Qing Zeng-Treitler
Abstract:
Deep neural network (DNN) models are being explored in the clinical domain, following the recent success in other domains such as image recognition. For clinical adoption, outcome prediction models require explanation, but due to the multiple non-linear inner transformations, DNN models are viewed by many as a black box. In this study, we developed a deep neural network model for predicting 1-year mortality of patients who underwent major cardiovascular procedures (MCVPs), using a temporal image representation of past medical history as input. The dataset was obtained from the electronic medical data warehouse administered by the Veterans Affairs Informatics and Computing Infrastructure (VINCI). We identified 21,355 veterans who had their first MCVP in 2014. Features for prediction included demographics, diagnoses, procedures, medication orders, hospitalizations, and frailty measures extracted from clinical notes. Temporal variables were created based on the patient history data in the 2-year window prior to the index MCVP. A temporal image was created based on these variables for each individual patient. To generate the explanation for the DNN model, we defined a new concept called the impact score, based on the impact of the presence/value of clinical conditions on the predicted outcome. Like the (log) odds ratios reported by the logistic regression (LR) model, impact scores are continuous variables intended to shed light on the black box model. For comparison, a logistic regression model was fitted on the same dataset. In our cohort, about 6.8% of patients died within one year. The prediction of the DNN model achieved an area under the curve (AUC) of 78.5%, while the LR model achieved an AUC of 74.6%. A strong but not perfect correlation was found between the aggregated impact scores and the log odds ratios (Spearman’s rho = 0.74), which helped validate our explanation.
Keywords: deep neural network, temporal data, prediction, frailty, logistic regression model
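The abstract does not give the exact impact-score formula, so the sketch below illustrates one plausible reading: score a binary clinical feature by the average change in a fitted network's predicted risk when the feature is toggled on versus off, then compare those scores to logistic-regression coefficients with Spearman's rho. The data, model sizes and the toggling definition are all assumptions for illustration.

```python
# Illustrative sketch only: a toggle-based feature impact score for a fitted
# classifier, compared against logistic-regression log odds with Spearman's rho.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(2000, 8)).astype(float)     # binary clinical indicators (synthetic)
beta = np.array([1.5, -0.8, 0.6, 0.0, 0.3, -1.2, 0.9, 0.1])
y = (rng.random(2000) < 1 / (1 + np.exp(-(X @ beta - 1)))).astype(int)

dnn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
lr = LogisticRegression(max_iter=1000).fit(X, y)

impact = []
for j in range(X.shape[1]):
    X_on, X_off = X.copy(), X.copy()
    X_on[:, j], X_off[:, j] = 1.0, 0.0
    # average change in predicted risk when the condition is present vs absent
    impact.append(np.mean(dnn.predict_proba(X_on)[:, 1] - dnn.predict_proba(X_off)[:, 1]))

rho, _ = spearmanr(impact, lr.coef_[0])   # compare with LR log odds ratios
print(f"Spearman's rho between impact scores and log odds: {rho:.2f}")
```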
Procedia PDF Downloads 153
6381 How Unicode Glyphs Revolutionized the Way We Communicate
Authors: Levi Corallo
Abstract:
Typed language produced by humans on computers and cell phones has become significantly distinct from previous modes of written language exchange. While acronyms remain one of the most predominant markers of typed language, another and perhaps more recent revolution in the way humans communicate has been the use of symbols or glyphs, primarily Emojis—globally introduced on the iPhone keyboard by Apple in 2008. This paper seeks to analyze the use of symbols in typed communication from both a linguistic and a machine learning perspective. The Unicode system will be explored, and methods of encoding will be juxtaposed with current machine and human perception. Topics in how typed symbol usage occurs in conversation will be explored, as well as current research methods dealing with Emojis, like sentiment analysis, predictive text models, and so on. This study proposes that sequential analysis is a significant feature for analyzing Unicode characters in a corpus with machine learning. Current models that are trying to learn or translate the meaning of Emojis should start to learn using bi- and tri-grams of Emoji, as well as observing the relationship between combinations of different Emoji in tandem. The sociolinguistics of an entirely new vernacular of language, referred to here as ‘typed language’, will also be delineated across this analysis of Unicode glyphs from both a semantic and a technical perspective.
Keywords: Unicode, text symbols, emojis, glyphs, communication
Procedia PDF Downloads 194
6380 QSAR Studies of Certain Novel Heterocycles Derived From bis-1, 2, 4 Triazoles as Anti-Tumor Agents
Authors: Madhusudan Purohit, Stephen Philip, Bharathkumar Inturi
Abstract:
In this paper, we report the quantitative structure-activity relationship of novel bis-triazole derivatives for predicting the activity profile. The full model encompassed a dataset of 46 bis-triazoles. The Tripos Sybyl X 2.0 program was used to conduct CoMSIA QSAR modeling. The Partial Least-Squares (PLS) analysis method was used to conduct the statistical analysis and to derive a QSAR model based on the field values of the CoMSIA descriptors. The compounds were divided into test and training sets. The compounds were evaluated by various CoMSIA parameters to predict the best QSAR model. An optimum number of components was first determined separately by cross-validated regression for the CoMSIA model, which was then applied in the final analysis. A series of parameters was used for the study, and the best-fit model was obtained using the donor, partition coefficient and steric parameters. The CoMSIA models demonstrated good statistical results, with a regression coefficient (r2) and a cross-validated coefficient (q2) of 0.575 and 0.830, respectively. The standard error for the predicted model was 0.16322. In the CoMSIA model, the steric descriptors make a marginally larger contribution than the electrostatic descriptors. The finding that the steric descriptor is the largest contributor to the CoMSIA QSAR models is consistent with the observation that more than half of the binding site area is occupied by steric regions.
Keywords: 3D QSAR, CoMSIA, triazoles, novel heterocycles
Procedia PDF Downloads 444
6379 Excitation Modeling for Hidden Markov Model-Based Speech Synthesis Based on Wavelet Analysis
Authors: M. Kiran Reddy, K. Sreenivasa Rao
Abstract:
The conventional Hidden Markov Model (HMM)-based speech synthesis system (HTS) uses only a pulse excitation model, which significantly differs from the natural excitation signal. Hence, buzziness can be perceived in the speech generated using HTS. This paper proposes an efficient excitation modeling method that can significantly reduce the buzziness and improve the quality of HMM-based speech synthesis. The proposed approach models the pitch-synchronous residual frames extracted from the residual excitation signal. Each pitch-synchronous residual frame is parameterized using 30 wavelet coefficients. These 30 wavelet coefficients are found to accurately capture the perceptually important information present in the residual waveform. In the synthesis phase, the residual frames are reconstructed from the generated wavelet coefficients and are pitch-synchronously overlap-added to generate the excitation signal. The proposed excitation modeling method is integrated into the HMM-based speech synthesis system. Evaluation results indicate that the speech synthesized by the proposed excitation model is significantly better than the speech generated using state-of-the-art excitation modeling methods.
Keywords: excitation modeling, hidden Markov models, pitch-synchronous frames, speech synthesis, wavelet coefficients
Procedia PDF Downloads 248
6378 Exploiting Kinetic and Kinematic Data to Plot Cyclograms for Managing the Rehabilitation Process of BKAs by Applying Neural Networks
Authors: L. Parisi
Abstract:
Kinematic data wisely correlate vector quantities in space to scalar parameters in time to assess the degree of symmetry between the intact limb and the amputated limb with respect to a normal model derived from the gait of control group participants. Furthermore, these particular data allow a doctor to preliminarily evaluate the usefulness of a certain rehabilitation therapy. Kinetic curves allow the analysis of ground reaction forces (GRFs) to assess the appropriateness of human motion. Electromyography (EMG) allows the analysis of the fundamental lower limb force contributions to quantify the level of gait asymmetry. However, the use of this technological tool is expensive and requires the patient's hospitalization. This research work suggests overcoming the above limitations by applying artificial neural networks.
Keywords: kinetics, kinematics, cyclograms, neural networks, transtibial amputation
Procedia PDF Downloads 443
6377 Naked Machismo: Uncovered Masculinity in an Israeli Home Design Campaign
Authors: Gilad Padva, Sigal Barak Brandes
Abstract:
This research centers on an unexpected Israeli advertising campaign for Elemento, a local furniture company, which eroticizes male nudity. The discussed campaign includes a series of printed ads that depict naked male models in effeminate positions. This campaign included a series of ads published in Haaretz, a small-scale yet highly prestigious daily newspaper which is typically read by urban upper-middle-class, left-wing Israelis. Apparently, this campaign embodies an alternative masculinity that challenges the prevalent machismo in Israeli society and advertising. Although some of the ads focus on young men in effeminate positions, they never expose their genitals and anuses, and their bodies are never permeable. The 2010s Elemento male models are seemingly contrasted with conventional representations of manhood in contemporary mainstream advertising. They display a somewhat inactive, passive and self-indulgent masculinity which involves 'conspicuous leisure'. In the process of commodity fetishism, the advertised furniture is emptied of the original meaning of its production, and then filled with new meanings in ways that both mystify the product and turn it into a fetish object. Yet, our research critically reconsiders this sensational campaign as sophisticated patriarchal parody that does not subvert but rather reconfirms and even fetishizes patriarchal premises; it parodies effeminacy rather than the prevalent (Israeli) machismo. Following Pierre Bourdieu's politics of cultural taste, our research reconsiders and criticizes the male models' domesticated masculinity in a fantasized and cosmopolitan hedonistic habitus. Notwithstanding, we suggest that the Elemento campaign, despite its conformity, does question some Israeli and global axioms about gender roles, corporeal ideologies, idealized bodies, and domesticated phalluses and anuses. Although the naked truth is uncovered by this campaign, it does erect a vibrant discussion of contemporary masculinities and their exploitation in current mass consumption.
Keywords: male body, campaign, advertising, gender studies, men's studies, Israeli culture, masculinity, parody, effeminacy
Procedia PDF Downloads 211
6376 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models and algorithms for the reliability assessment of complex systems, based on which a web application for the structural analysis and reliability assessment of systems was created. The reliability assessment process included the following stages, which were reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into an orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of the logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying reliability; 6) calculation of the weights of the elements. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created, by means of which a quantitative assessment of the reliability of systems of complex structure is produced. As a result, the structural analysis of systems and the research and design of optimal-structure systems are carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element
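To make stages 3-5 concrete, the toy example below (not taken from the paper) works through the operability function y = x1 AND (x2 OR x3): its DNF is x1x2 v x1x3, one orthogonal DNF with mutually exclusive terms is x1x2 v x1(not x2)x3, and replacing logical variables by element reliabilities gives the reliability polynomial, which is checked here by brute-force enumeration. The element reliabilities are arbitrary placeholder values.

```python
# Toy illustration: operability function y = x1 AND (x2 OR x3).
# DNF: x1x2 v x1x3; orthogonal DNF: x1x2 v x1(not x2)x3, so logical variables
# can be replaced directly by element reliabilities p_i.
from itertools import product

p = {1: 0.9, 2: 0.8, 3: 0.7}   # element reliabilities (hypothetical values)

# Reliability polynomial obtained from the orthogonal DNF
R_polynomial = p[1] * p[2] + p[1] * (1 - p[2]) * p[3]

# Brute-force check by enumerating all element states
R_enum = 0.0
for x1, x2, x3 in product([0, 1], repeat=3):
    if x1 and (x2 or x3):
        R_enum += ((p[1] if x1 else 1 - p[1]) *
                   (p[2] if x2 else 1 - p[2]) *
                   (p[3] if x3 else 1 - p[3]))

print(round(R_polynomial, 6), round(R_enum, 6))   # both 0.846
```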
Procedia PDF Downloads 72
6375 Development of an Information System Based on the Establishment and Evaluation of Performance Rating by Application Part/Type of Remodeling Element Technologies
Authors: Sungwon Jung
Abstract:
The percentage of apartment houses 20 years or older in South Korea is approximately 20% (1.55 million houses), and an explosive increase in aged houses is expected around the first planned new towns. Accordingly, we should prepare for social issues such as difficulties in housing lease and the degradation of housing performance. The improvement of the performance of aged houses is essential for achieving the national energy and carbon reduction goals, and we should develop techniques to respond to the changing construction environment. Furthermore, we should develop a performance evaluation system that is appropriate for the demands of residents, such as the improvement of remodeling floor plans through performance improvement in line with the residence types of housing-vulnerable groups such as the low-income group and elderly people living alone. For this purpose, remodeling techniques and business models optimized for the target complexes must be spread through the development of various business models. In addition, it is necessary to improve the remodeling business by improving the laws and systems related to the improvement of residential performance and to prepare techniques to respond to the increasing business demands. In other words, performance improvement, evaluation, and knowledge systems need to be researched as new issues related to remodeling that have not been addressed in the existing research.
Keywords: remodelling, performance evaluation, web-based system, big data
Procedia PDF Downloads 224
6374 Fitness Action Recognition Based on MediaPipe
Authors: Zixuan Xu, Yichun Lou, Yang Song, Zihuai Lin
Abstract:
MediaPipe is an open-source machine learning computer vision framework that can be ported to a multi-platform environment, which makes it easier to use for recognizing human activity. Based on this framework, many human recognition systems have been created, but the fundamental issue is the recognition of human behavior and posture. In this paper, two methods are proposed to recognize human gestures based on MediaPipe: the first uses the Adaptive Boosting algorithm to recognize a series of fitness gestures, and the second uses the Fast Dynamic Time Warping algorithm to recognize 413 continuous fitness actions. These two methods are also applicable to any human posture movement recognition.
Keywords: computer vision, MediaPipe, adaptive boosting, fast dynamic time warping
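A minimal sketch of the second pipeline described above: extract per-frame pose landmarks with MediaPipe Pose and score a candidate clip against a reference action with Fast Dynamic Time Warping. The file names, the use of only (x, y) coordinates, and the absence of any normalisation are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: MediaPipe Pose landmark sequences compared with FastDTW.
import cv2
import numpy as np
import mediapipe as mp
from fastdtw import fastdtw
from scipy.spatial.distance import euclidean

def landmark_sequence(video_path):
    """Return one flattened (x, y) landmark vector per frame from MediaPipe Pose."""
    seq = []
    cap = cv2.VideoCapture(video_path)
    with mp.solutions.pose.Pose() as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks:
                seq.append(np.array([[lm.x, lm.y]
                                     for lm in result.pose_landmarks.landmark]).ravel())
    cap.release()
    return seq

reference = landmark_sequence("squat_reference.mp4")   # hypothetical file names
candidate = landmark_sequence("squat_attempt.mp4")
distance, _ = fastdtw(reference, candidate, dist=euclidean)
print("DTW distance to reference action:", distance)
```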
Procedia PDF Downloads 118
6373 Formation and Characterization of the Epoxy Resin-Porous Glass Interphases
Authors: Aleksander Ostrowski, Hugh J. Byrne, Roland Sanctuary
Abstract:
The investigation of polymer interphases is an emerging field nowadays. In many cases, interphases determine the functionality of a system. There is a great demand for exploring a fundamental understanding of the interphases and elucidating their formation, their dimensions depending on various influencing factors, changes in functional properties, etc. The epoxy applied on porous glass penetrates its pores to an extent dependent on the pore size, temperature and epoxy component mixing ratio. A challenging sample preparation procedure developed over recent time allowed very smooth epoxy-porous glass cross-sections to be produced. In this study, Raman spectroscopy was used to investigate the epoxy-porous glass interphases. It allowed for chemical differentiation between different regions at the cross-section and determination of the degree of cure of the epoxy system in the porous glass.
Keywords: interphases, Raman spectroscopy, epoxy, porous glass
Procedia PDF Downloads 396
6372 Conceptualizing the Cyber Insecurity Risk in the Ethics of Automated Warfare
Authors: Otto Kakhidze, Hoda Alkhzaimi, Adam Ramey, Nasir Memon
Abstract:
This paper provides an alternative, cyber-security-based conceptual framework for the ethics of automated warfare. The large body of work produced on fully or partially autonomous warfare systems tends to overlook malicious security factors, such as the possibility of technical attacks on these systems, when it comes to moral and legal decision-making. The argument provides a risk-oriented justification for why technical malicious risks cannot be dismissed in legal, ethical and policy considerations when warfare models are being implemented and deployed. The assumptions of the paper are supported by providing a broader model that incorporates the perspective of technological vulnerabilities through the lenses of Game Theory, Just War Theory, as well as standard and non-standard defense ethics. The paper argues that a conventional risk-benefit analysis without considering ethical factors is insufficient for making legal and policy decisions on automated warfare. This approach will provide the substructure for security and defense experts as well as legal scholars, ethicists and decision theorists to work towards common justificatory grounds that will accommodate the technical security concerns that have been overlooked in the current legal and policy models.
Keywords: automated warfare, ethics of automation, inherent hijacking, security vulnerabilities, risk, uncertainty
Procedia PDF Downloads 357
6371 MhAGCN: Multi-Head Attention Graph Convolutional Network for Web Services Classification
Authors: Bing Li, Zhi Li, Yilong Yang
Abstract:
Web service classification can promote the quality of service discovery and management in the service repository. It is widely used to locate the services desired by developers. Although traditional classification methods based on supervised learning models can achieve classification tasks, developers need to manually tag web services, and the quality of these tags may not be enough to establish an accurate classifier for service classification. With the doubling of the number of web services, the manual tagging method has become unrealistic. In recent years, the attention mechanism has made remarkable progress in the field of deep learning, and its huge potential has been fully demonstrated in various fields. This paper designs a multi-head attention graph convolutional network (MHAGCN) service classification method, which can assign different weights to the neighborhood nodes without complicated matrix operations or relying on understanding the entire graph structure. The framework combines the advantages of the attention mechanism and the graph convolutional neural network. It can classify web services through automatic feature extraction. The comprehensive experimental results on a real dataset not only show the superior performance of the proposed model over the existing models but also demonstrate its potentially good interpretability for graph analysis.
Keywords: attention mechanism, graph convolutional network, interpretability, service classification, service discovery
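The sketch below illustrates the core idea of attention-based neighborhood weighting on a graph (one GAT-style head in PyTorch): each node's new representation is a softmax-weighted sum of its neighbours' transformed features, with the weights learned rather than fixed as in a plain GCN. It is a generic illustration with assumed dimensions, not the authors' MHAGCN; a multi-head layer would run several such heads and concatenate their outputs.

```python
# Minimal sketch of one graph-attention head: learned neighbour weights
# replace the fixed normalisation of a plain graph convolution.
import torch
import torch.nn as nn

class AttentionHead(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):                 # x: (N, in_dim), adj: (N, N) with 0/1 entries
        h = self.W(x)                          # (N, out_dim)
        n = h.size(0)
        pairs = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                           h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = torch.nn.functional.leaky_relu(self.a(pairs).squeeze(-1))   # raw attention scores
        e = e.masked_fill(adj == 0, float("-inf"))                      # keep neighbours only
        alpha = torch.softmax(e, dim=-1)                                # attention weights
        return alpha @ h                                                # (N, out_dim)

x = torch.randn(5, 16)                          # 5 web services, 16-dim features (toy)
adj = torch.eye(5) + torch.diag(torch.ones(4), 1) + torch.diag(torch.ones(4), -1)
print(AttentionHead(16, 8)(x, adj).shape)       # torch.Size([5, 8])
```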
Procedia PDF Downloads 135
6370 Statistical Analysis with Prediction Models of User Satisfaction in Software Project Factors
Authors: Katawut Kaewbanjong
Abstract:
We analyzed a volume of data and identified the software project factors significant to user satisfaction. A statistical significance analysis (logistic regression) and a collinearity analysis determined the significant factors from a group of 71 pre-defined factors from 191 software projects in ISBSG Release 12. The eight prediction models used for testing the predictive potential of these factors were the neural network, k-NN, naïve Bayes, random forest, decision tree, gradient boosted tree, linear regression and logistic regression models. Fifteen pre-defined factors were truly significant in predicting user satisfaction, and they provided 82.71% prediction accuracy when used with a neural network prediction model. These factors were client-server, personnel changes, total defects delivered, project inactive time, industry sector, application type, development type, how methodology was acquired, development techniques, decision making process, intended market, size estimate approach, size estimate method, cost recording method, and effort estimate method. These findings may benefit software development managers considerably.
Keywords: prediction model, statistical analysis, software project, user satisfaction factor
Procedia PDF Downloads 124
6369 Evaluating India's Smart Cities against the Sustainable Development Goals
Authors: Suneet Jagdev
Abstract:
Seventeen Sustainable Development Goals were adopted by world leaders in September 2015 at the United Nations Sustainable Development Summit. These goals were adopted by UN member states to promote prosperity, health and human rights while protecting the planet. Around the same time, the Government of India launched the Smart City Initiative to speed up the development of state-of-the-art infrastructure and services in 100 cities, with a focus on sustainable and inclusive development. These cities are meant to become role models for other cities in India and promote sustainable regional development. This paper examines the goals set under the Smart City Initiative and evaluates them in terms of the Sustainable Development Goals, using case studies of selected Smart Cities in India. The study concludes that most Smart City projects at present actually consist of individual solutions to individual problems identified in a community, rather than comprehensive models for complex issues in cities across India. A systematic, logical and comparative analysis of the relevant literature and data has been carried out, drawing on government sources, government papers, research papers by various experts on the topic, and results from some online surveys. Case studies have been used for a graphical analysis highlighting the issues of migration, ecology, economy and social equity in these Smart Cities.
Keywords: housing, migration, smart cities, sustainable development goals, urban infrastructure
Procedia PDF Downloads 410
6368 Development of Time Series Forecasting Model for Dengue Cases in Nakhon Si Thammarat, Southern Thailand
Authors: Manit Pollar
Abstract:
Identifying dengue epidemic periods early would be helpful for taking the necessary actions to prevent dengue outbreaks. Providing an accurate prediction of dengue epidemic seasons will allow local authorities sufficient time to take the necessary decisions and actions to safeguard the situation. This study aimed to develop a forecasting model for the number of dengue incidences in Nakhon Si Thammarat Province, Southern Thailand, using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on the monthly data collected between 2003 and 2011 and validated the models using data collected between January and September 2012. The results of this study revealed that the SARIMA(1,1,0)(1,2,1)12 model closely described the trends and seasons of dengue incidence and confirmed the existence of dengue fever cases in Nakhon Si Thammarat for the years between 2003 and 2011. The study showed that the one-step approach for predicting dengue incidences provided significantly more accurate predictions than the twelve-step approach. The model, even if based purely on statistical data analysis, can provide a useful basis for the allocation of resources for disease prevention.
Keywords: SARIMA, time series model, dengue cases, Thailand
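A minimal sketch of fitting the SARIMA(1,1,0)(1,2,1)12 model named above with statsmodels and producing a one-step-ahead forecast. The monthly case counts below are synthetic placeholders, not the Nakhon Si Thammarat surveillance data used in the study.

```python
# Sketch: SARIMA(1,1,0)(1,2,1)12 on synthetic monthly dengue-like counts.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

months = pd.date_range("2003-01", periods=108, freq="MS")          # 2003-2011, monthly
rng = np.random.default_rng(0)
cases = pd.Series(
    50 + 30 * np.sin(2 * np.pi * months.month / 12) + rng.poisson(10, 108),
    index=months,
)

model = SARIMAX(cases, order=(1, 1, 0), seasonal_order=(1, 2, 1, 12))
result = model.fit(disp=False)
print(result.forecast(steps=1))     # one-step-ahead prediction for the next month
```

A twelve-step evaluation would instead call forecast(steps=12) once, whereas the one-step approach refits or updates the model as each new observation arrives.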
Procedia PDF Downloads 358
6367 Defective Autophagy Disturbs Neural Migration and Network Activity in hiPSC-Derived Cockayne Syndrome B Disease Models
Authors: Julia Kapr, Andrea Rossi, Haribaskar Ramachandran, Marius Pollet, Ilka Egger, Selina Dangeleit, Katharina Koch, Jean Krutmann, Ellen Fritsche
Abstract:
It is widely acknowledged that animal models do not always represent human disease. Human brain development, especially, is difficult to model in animals due to a variety of structural and functional species-specificities. This causes significant discrepancies between predicted and apparent drug efficacies in clinical trials and their subsequent failure. Emerging alternatives based on 3D in vitro approaches, such as human brain spheres or organoids, may in the future reduce and ultimately replace animal models. Here, we present a human induced pluripotent stem cell (hiPSC)-based 3D neural in vitro disease model for Cockayne Syndrome B (CSB). CSB is a rare hereditary disease and is accompanied by severe neurologic defects, such as microcephaly, ataxia and intellectual disability, with currently no treatment options. Therefore, the aim of this study is to investigate the molecular and cellular defects found in neural hiPSC-derived CSB models. Understanding the underlying pathology of CSB enables the development of treatment options. The two CSB models used in this study comprise a patient-derived hiPSC line and its isogenic control, as well as a CSB-deficient cell line based on a healthy hiPSC line (IMR90-4) background, thereby excluding genetic background-related effects. Neurally induced and differentiated brain sphere cultures were characterized via RNA sequencing, western blot (WB), immunocytochemistry (ICC) and multielectrode arrays (MEAs). CSB deficiency leads to altered gene expression of markers for autophagy, focal adhesion and neural network formation. Cell migration was significantly reduced, and electrical activity was significantly increased in the disease cell lines. These data hint that these cellular pathologies possibly underlie CSB. By induction of autophagy, the migration phenotype could be partially rescued, suggesting a crucial role of disturbed autophagy in the defective neural migration of the disease lines. Altered autophagy may also lead to inefficient mitophagy. Accordingly, disease cell lines were shown to have a lower mitochondrial base activity and a higher susceptibility to mitochondrial stress induced by rotenone. Since mitochondria play an important role in neurotransmitter cycling, we suggest that defective mitochondria may lead to altered electrical activity in the disease cell lines. Failure to clear the defective mitochondria by mitophagy, and thus missing initiation cues for new mitochondrial production, could potentiate this problem. With our data, we aim at establishing a disease adverse outcome pathway (AOP), thereby adding to the in-depth understanding of this multi-faceted disorder and subsequently contributing to alternative drug development.
Keywords: autophagy, disease modeling, in vitro, pluripotent stem cells
Procedia PDF Downloads 120
6366 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data
Authors: Ruchika Malhotra, Megha Khanna
Abstract:
The development of change prediction models can help software practitioners in planning testing and inspection resources at the early phases of software development. However, a major challenge faced during the training process of any classification model is the imbalanced nature of software quality data. A dataset with very few instances of the minority outcome categories leads to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change prone, whereas a majority of classes may be non-change prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics while dealing with imbalanced data. In order to empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android data set and evaluates the performance of six different machine learning techniques. The results of the study indicate extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
Keywords: change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics
Procedia PDF Downloads 418
6365 Improved Photo-Active Layer Properties for Efficient Organic Solar Cells
Authors: Chahrazed Bendenia, Souhila Bendenia, Samia Moulebhar, Hanaa Merad-Dib, Sarra Merabet, Sid Ahmed Khantar, Baghdad Hadri
Abstract:
In recent years, organic solar cells (OSCs) have become a fundamental concern of researchers thanks to their advantages in terms of flexibility, manufacturing processes and low cost. The performance of these devices is influenced by various factors, such as the layers introduced in the stack of the fabricated solar cell. In our work, a reverse-structure OSC under AM1.5G illumination is modeled. The photo-active polymer/fullerene layer is analyzed by varying its polymer, using the SCAPS simulator to extract the J-V characteristics: open-circuit voltage (Voc), short-circuit current (Jsc), fill factor (FF) and power conversion efficiency (η). The results obtained indicate that the materials used have a significant impact on improving the photovoltaic parameters of the devices studied.
Keywords: solar, polymer, simulator, characteristics
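For reference, the figures of merit listed above are related by the standard photovoltaic expressions below, where P_max is the maximum output power density taken from the simulated J-V curve and P_in is the incident power density of the AM1.5G spectrum.

```latex
% Standard photovoltaic relations used to extract the quoted figures of merit
\mathrm{FF} = \frac{P_{\max}}{V_{oc}\, J_{sc}},
\qquad
\eta = \frac{V_{oc}\, J_{sc}\, \mathrm{FF}}{P_{in}}
```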
Procedia PDF Downloads 78
6364 Fostering a Sense of Belonging in Hybrid Teams
Authors: Jam Harley
Abstract:
The COVID-19 pandemic accelerated the pace of change in the workplace. Overnight, many individuals shifted from co-location in an office to hybrid or remote work. The pandemic also expedited and intensified the need to address persistent leadership and management concerns, including digital transformation, remote management, leading through fast change, anxiety, and uncertainty. Nonetheless, many leaders have failed to address the problems left behind by the pandemic. In a fundamental work devoted to comprehending what constitutes a human need, Maslow reiterates similar descriptors in his explanation of belongingness as the human need to be accepted, acknowledged, respected, and appreciated by a community of other individuals. This study aims to investigate the lived experiences of dispersed hybrid team members in order to find leadership best practices that improve team performance and retention through an increased individual sense of belonging.
Keywords: organizational change, belonging, diversity, equity
Procedia PDF Downloads 56
6363 Taxonomy of Threats and Vulnerabilities in Smart Grid Networks
Authors: Faisal Al Yahmadi, Muhammad R. Ahmed
Abstract:
Electric power is a fundamental necessity in the 21st century. Consequently, any break in electric power is likely to affect general activity. To make the power supply smooth and efficient, a smart grid network, which uses communication technology, is introduced. In any communication network, security is essential. It has been observed from several recent incidents that adversaries cause interruptions to the operation of networks. In order to resolve the issues, it is vital to understand the threats and vulnerabilities associated with smart grid networks. In this paper, we have investigated the threats and vulnerabilities in Smart Grid Networks (SGN) and a few solutions in the literature. The proposed solutions show developments in electricity theft countermeasures, Denial of Service (DoS) attack and malicious injection attack detection models, as well as malicious node detection using watchdog-like techniques and other solutions.
Keywords: smart grid network, security, threats, vulnerabilities
Procedia PDF Downloads 139