Search results for: digital transformation artificial intelligence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6354


4884 Influence of Genotype, Explant, and Hormone Treatment on Agrobacterium-Transformation Success in Salix Callus Culture

Authors: Lukas J. Evans, Danilo D. Fernando

Abstract:

Shrub willows (Salix spp.) have many characteristics that make them suitable for a variety of applications, such as riparian zone buffers, environmental contaminant sequestration, living snow fences, and biofuel production. In some cases, these functions are limited by physical or financial obstacles associated with the number of individuals needed to reasonably satisfy that purpose. One way to increase the efficiency of willows is to bioengineer them with the genetic improvements suitable for the desired use. To accomplish this goal, an optimized in vitro transformation protocol via Agrobacterium tumefaciens is necessary to reliably express genes of interest. Therefore, the aim of this study is to observe the influence of tissue culture with different willow cultivars, hormones, and explants on the percentage of calli expressing the reporter gene green fluorescent protein (GFP), in order to find ideal transformation conditions. Each callus was produced from 1-month-old open-pollinated seedlings of three Salix miyabeana cultivars (‘SX61’, ‘WT1’, and ‘WT2’) from three different explants (lamina, petiole, and internode). Explants were cultured for 1 month on MS media with different concentrations of 6-benzylaminopurine (BAP) and 1-naphthaleneacetic acid (NAA) (no hormones; 1 mg L⁻¹ BAP only; 3 mg L⁻¹ NAA only; 1 mg L⁻¹ BAP and 3 mg L⁻¹ NAA; and 3 mg L⁻¹ BAP and 1 mg L⁻¹ NAA) to produce a callus. Samples were then treated with Agrobacterium tumefaciens at an OD600 of 0.6-0.8 for 30 minutes to insert the GFP transgene, co-cultivated for 72 hours, and selected on the same media type they were cultured on, with 7.5 mg L⁻¹ hygromycin added, for 1 week before GFP visualization under a UV dissecting scope. The percentage of GFP-expressing calli as well as the average number of fluorescing GFP units per callus were recorded, and results were evaluated by ANOVA (α = 0.05). The WT1 internode-derived calli on media with 3 mg L⁻¹ NAA + 1 mg L⁻¹ BAP and with 1 mg L⁻¹ BAP alone produced a significantly higher percentage of GFP-expressing calli than all other groups (19.1% and 19.4%, respectively). Additionally, the WT1 internode group cultured with 3 mg L⁻¹ NAA + 1 mg L⁻¹ BAP produced an average of 2.89 GFP units per callus, while the group cultivated with 1 mg L⁻¹ BAP produced an average of 0.84 GFP units per callus. In conclusion, genotype, explant choice, and hormones all play a significant role in increasing transformation success in willows. Future studies producing whole-callus GFP expression and subsequent plantlet regeneration are necessary for a complete willow transformation protocol.
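
As a side note, the one-way ANOVA used above (α = 0.05) can be reproduced with standard tools; a minimal sketch follows, assuming GFP-unit counts are collected per treatment group. The group names and values are illustrative placeholders, not the study's data.

```python
# Minimal one-way ANOVA sketch (SciPy), as used to compare treatment groups
# above. Group names and counts are hypothetical placeholders.
from scipy import stats

gfp_units_per_callus = {
    "WT1_internode_NAA3_BAP1": [3, 2, 4, 2, 3],
    "WT1_internode_BAP1":      [1, 0, 1, 2, 0],
    "SX61_lamina_no_hormone":  [0, 0, 1, 0, 0],
}

f_stat, p_value = stats.f_oneway(*gfp_units_per_callus.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:  # alpha = 0.05, as in the abstract
    print("At least one group mean differs significantly.")
```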

Keywords: agrobacterium, callus, Salix, tissue culture

Procedia PDF Downloads 108
4883 Design of an Artificial Oil Body-Cyanogen Bromide Technology Platform for the Expression of Small Bioactive Peptide, Mastoparan B

Authors: Tzyy-Rong Jinn, Sheng-Kuo Hsieh, Yi-Ching Chung, Feng-Chia Hsieh

Abstract:

In this study, we attempted to develop a recombinant oleosin-based fusion expression strategy in Escherichia coli (E. coli), coupled with an artificial oil bodies (AOB)-cyanogen bromide technology platform, to produce bioactive mastoparan B (MP-B). As reported, the oleosin in the AOB system serves as a carrier (fused with the target protein): because oleosin possesses two amphipathic regions (at the N-terminus and C-terminus), both termini of oleosin are arranged on the surface of the AOB. Thus, a target protein fused to the N-terminus or C-terminus of oleosin is also exposed on the surface of the AOB, which greatly facilitates the subsequent separation and purification of the target protein from the AOB. In addition, oleosin, a unique structural protein of seed oil bodies, has the added advantage of directing the fused MP-B into inclusion bodies, which protects it from proteolytic degradation. In this work, MP-B was fused to the C-terminus of oleosin and then expressed in E. coli as an insoluble recombinant protein. As a consequence, we successfully developed a reliable recombinant oleosin-based fusion expression strategy in E. coli, coupled with the AOB-cyanogen bromide technology platform, to produce the small peptide MP-B. Taken together, this platform provides insight into the production of active MP-B, which will facilitate studies and applications of this peptide in the future.

Keywords: artificial oil bodies, Escherichia coli, Oleosin-fusion protein, Mastoparan-B

Procedia PDF Downloads 440
4882 Intrusion Detection Using Dual Artificial Techniques

Authors: Rana I. Abdulghani, Amera I. Melhum

Abstract:

With the rapid growth of computer usage over networks, and given the broad agreement among computer security experts that the goal of building a perfectly secure system is never achieved in practice, intrusion detection systems (IDS) were designed. This research presents a comparison between two techniques for network intrusion detection. The first uses Particle Swarm Optimization (PSO), which falls within the field of Swarm Intelligence (SI); here, the algorithm is enhanced to obtain a minimum error rate by amending the cluster centers whenever a better fitness value is found during the training stages. Results show that this modification gives more efficient exploration than the original algorithm. The second uses a back-propagation neural network (BP NN). Finally, the results of the two methods are compared on the NSL-KDD data sets for the construction and evaluation of intrusion detection systems. This research is only interested in clustering the given connection records into two categories, normal and abnormal. Practical experiments yield an intrusion detection rate of 99.183818% for the enhanced PSO (EPSO) and 69.446416% for the BP neural network.
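
For illustration, a minimal sketch of PSO-based clustering in the spirit of the enhanced algorithm described above: each particle encodes the two cluster centers (normal/abnormal), and a particle's best-known centers are amended whenever a better fitness is found. This is not the authors' exact EPSO; the data, inertia, and acceleration coefficients are placeholder assumptions.

```python
# PSO-clustering sketch: particles encode two cluster centers; centers are
# amended on better fitness. Data is a placeholder for NSL-KDD feature vectors.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
n_particles, n_clusters, dim = 20, 2, X.shape[1]

def fitness(centers):
    # Sum of squared distances of each record to its nearest center.
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return (d.min(axis=1) ** 2).sum()

pos = rng.normal(size=(n_particles, n_clusters, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmin()].copy()

for _ in range(100):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    for i, p in enumerate(pos):
        f = fitness(p)
        if f < pbest_fit[i]:          # amend centers when better fitness found
            pbest[i], pbest_fit[i] = p.copy(), f
    gbest = pbest[pbest_fit.argmin()].copy()

labels = np.linalg.norm(X[:, None, :] - gbest[None], axis=2).argmin(axis=1)
```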

Keywords: IDS, SI, BP, NSL_KDD, PSO

Procedia PDF Downloads 369
4881 New Gas Geothermometers for the Prediction of Subsurface Geothermal Temperatures: An Optimized Application of Artificial Neural Networks and Geochemometric Analysis

Authors: Edgar Santoyo, Daniel Perez-Zarate, Agustin Acevedo, Lorena Diaz-Gonzalez, Mirna Guevara

Abstract:

Four new gas geothermometers have been derived from a multivariate geochemometric analysis of a geothermal fluid chemistry database; two of them use the natural logarithm of the CO₂ and H₂S concentrations (mmol/mol), respectively, and the other two use the natural logarithm of the H₂S/H₂ and CO₂/H₂ ratios. As a strict compilation criterion, the database was created from the gas-phase composition of fluids and bottomhole temperatures (BHTM) measured in producing wells. The calibration of the geothermometers was based on the geochemical relationship existing between the gas-phase composition of well discharges and the equilibrium temperatures measured at bottomhole conditions. Multivariate statistical analysis together with artificial neural networks (ANN) was successfully applied to correlate the gas-phase compositions and the BHTM. The predicted or simulated bottomhole temperatures (BHTANN), defined as output neurons or simulation targets, were statistically compared with the measured temperatures (BHTM). The coefficients of the new geothermometers were obtained from an optimized self-adjusting training algorithm applied to approximately 2,080 ANN architectures with 15,000 simulation iterations each. The self-adjusting training algorithm used the well-known Levenberg-Marquardt model to calculate: (i) the number of neurons in the hidden layer; (ii) the training factor and the training patterns of the ANN; (iii) the linear correlation coefficient, R; (iv) the synaptic weighting coefficients; and (v) the root mean squared error (RMSE), the statistical parameter used to evaluate the prediction performance between the BHTM and the simulated BHTANN. The prediction performance of the new gas geothermometers, together with the predictions inferred from sixteen previously developed, well-known gas geothermometers, was statistically evaluated using an external database to avoid bias. The statistical evaluation was performed through the analysis of the lowest RMSE values computed among the predictions of all the gas geothermometers. The new gas geothermometers developed in this work have been successfully used for predicting subsurface temperatures in high-temperature geothermal systems of Mexico (e.g., Los Azufres, Mich., Los Humeros, Pue., and Cerro Prieto, B.C.) as well as in a blind geothermal system (known as Acoculco, Puebla). The latest results of the gas geothermometers (inferred from gas-phase compositions of soil-gas bubble emissions) compare well with the temperatures measured in two wells of the blind geothermal system of Acoculco, Puebla (Mexico). Details of this new development are outlined in the present work. Acknowledgements: The authors acknowledge the funding received from the CeMIE-Geo P09 project (SENER-CONACyT).
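
A hedged sketch of the calibration loop described above, i.e., sweeping hidden-layer sizes and scoring each architecture by RMSE and R. scikit-learn does not implement the Levenberg-Marquardt optimizer, so L-BFGS stands in here as a quasi-Newton substitute; the ln-gas-ratio inputs and BHT targets are synthetic placeholders.

```python
# ANN architecture sweep scored by RMSE and R, in the spirit of the
# self-adjusting training described above (L-BFGS instead of L-M).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
ln_gas = rng.normal(size=(500, 2))          # e.g., ln(CO2/H2), ln(H2S/H2)
bht_m = 250 + 40 * ln_gas[:, 0] - 25 * ln_gas[:, 1] + rng.normal(0, 5, 500)

best_rmse, best_model = np.inf, None
for n_hidden in (2, 4, 8, 16):              # a tiny architecture sweep
    model = MLPRegressor(hidden_layer_sizes=(n_hidden,), solver="lbfgs",
                         max_iter=15000, random_state=0).fit(ln_gas, bht_m)
    bht_ann = model.predict(ln_gas)
    rmse = mean_squared_error(bht_m, bht_ann) ** 0.5
    r = np.corrcoef(bht_m, bht_ann)[0, 1]   # linear correlation coefficient R
    if rmse < best_rmse:
        best_rmse, best_model = rmse, model
    print(f"hidden={n_hidden:2d}  RMSE={rmse:5.2f} K  R={r:.3f}")
```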

Keywords: artificial intelligence, gas geochemistry, geochemometrics, geothermal energy

Procedia PDF Downloads 330
4880 Thermal Effects of Phase Transitions of Cerium and Neodymium

Authors: M. Khundadze, V. Varazashvili, N. Lejava, R. Jorbenadze

Abstract:

Phase transitions of cerium and neodymium were investigated using a high-temperature scanning calorimeter (HT-1500, Setaram). For cerium, two types of transformation were detected: at 350-372 K, the hexagonal close-packed (hcp) to face-centered cubic (fcc) transition, and at 880-960 K, the transformation of the fcc lattice into the body-centered cubic (bcc) lattice. For neodymium, the change from hcp to bcc was detected at 1093-1113 K. The thermal characteristics of the transitions – enthalpy, entropy, and temperature ranges – are reported.
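
For context, for a first-order transition at the equilibrium temperature T_tr, the entropy change reported alongside the enthalpy follows from the standard thermodynamic relation (a textbook identity, not a result of this paper):

\[
\Delta S_{\mathrm{tr}} = \frac{\Delta H_{\mathrm{tr}}}{T_{\mathrm{tr}}}
\]

since the Gibbs energies of the two phases are equal at the transition point, so that \( \Delta G_{\mathrm{tr}} = \Delta H_{\mathrm{tr}} - T_{\mathrm{tr}}\,\Delta S_{\mathrm{tr}} = 0 \).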

Keywords: cerium, neodymium, calorimetry, enthalpy of phase transitions

Procedia PDF Downloads 349
4879 Definition of a Computing Independent Model and Rules for Transformation Focused on the Model-View-Controller Architecture

Authors: Vanessa Matias Leite, Jandira Guenka Palma, Flávio Henrique de Oliveira

Abstract:

This paper presents a model-oriented development approach to software development in the Model-View-Controller (MVC) architectural standard. The approach exposes a process of extracting information from models, in which the rules and syntax defined in this work assist in the design of the initial model and its future conversions. The paper presents a syntax based on natural language, following the rules of classical Portuguese grammar, together with conversion rules that generate models conforming to the norms of the Object Management Group (OMG) and the Meta-Object Facility (MOF).

Keywords: BNF Syntax, model driven architecture, model-view-controller, transformation, UML

Procedia PDF Downloads 378
4878 Using Photogrammetry to Survey the Côa Valley Iron Age Rock Art Motifs: Vermelhosa Panel 3 Case Study

Authors: Natália Botica, Luís Luís, Paulo Bernardes

Abstract:

The Côa Valley, a World Heritage site since 1998, presents more than 1,300 open-air engraved rock panels. The Archaeological Park of the Côa Valley has recorded the rock art motifs, testing various techniques based on direct tracing on the rock under natural and artificial lighting. In this work, integrated in the "Open Access Rock Art Repository" (RARAA) project, we present the methodology adopted for the vector drawing of rock art motifs based on orthophotos taken from the photogrammetric survey and 3D models of the rocks. We also present the information system designed to integrate the vector drawings and the characterization data of the motifs, as well as their open-access sharing, in order to promote reuse in multiple areas. The 3D models themselves constitute a very detailed record, ensuring the digital preservation of the rock and its iconography. Thus, even if a rock or motif disappears, it can continue to be studied and even recreated.

Keywords: rock art, archaeology, iron age, 3D models

Procedia PDF Downloads 69
4877 An Alternative Credit Scoring System in China’s Consumer Lending Market: A System Based on Digital Footprint Data

Authors: Minjuan Sun

Abstract:

Ever since the late 1990s, China has experienced explosive growth in consumer lending, especially in short-term consumer loans, in which the growth rate of non-bank lending has surpassed that of bank lending due to developments in financial technology. On the other hand, China does not have a universal credit scoring and registration system that can guide lenders during credit evaluation and risk control; for example, an individual's bank credit records are not available to online lenders, and vice versa. Given this context, the purpose of this paper is three-fold. First, we explore if and how alternative digital footprint data can be utilized to assess a borrower's creditworthiness. Then, we perform a comparative analysis of machine learning methods for the canonical problem of credit default prediction. Finally, we analyze, from an institutional point of view, the necessity of establishing a viable and nationally universal credit registration and scoring system utilizing online digital footprints, so that more people in China can have better access to the consumer loan market. Two different types of digital footprint data are matched with a bank's loan default records. Each separately captures distinct dimensions of a person's characteristics, such as shopping patterns and certain aspects of personality or inferred demographics revealed by social media features like profile image and nickname. We find that both datasets can generate acceptable to excellent prediction results and that the different types of data tend to complement each other for better performance. Typically, the traditional types of data banks normally use, like income, occupation, and credit history, update over longer cycles and hence cannot reflect more immediate changes, like financial status changes caused by a business crisis; digital footprints, by contrast, update daily, weekly, or monthly, and are thus capable of providing a more comprehensive profile of the borrower's credit capabilities and risks. From this empirical and quantitative examination, we believe digital footprints can become an alternative information source for creditworthiness assessment because of their near-universal data coverage and because they can by and large resolve the "thin-file" issue, given that digital footprints come in much larger volume and at higher frequency.
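
A minimal sketch of the kind of comparative analysis described above: several standard classifiers scored by AUC on an imbalanced default-prediction task. The features are synthetic placeholders for the digital-footprint variables, and the model list is an assumption, since the abstract does not name the methods used.

```python
# Comparative classifier evaluation for credit default prediction (sketch).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9],
                           random_state=0)   # imbalanced: few defaults
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "logistic":          LogisticRegression(max_iter=1000),
    "random_forest":     RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}
for name, model in models.items():
    proba = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(f"{name:18s} AUC = {roc_auc_score(y_te, proba):.3f}")
```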

Keywords: credit score, digital footprint, Fintech, machine learning

Procedia PDF Downloads 144
4876 Embodied Spiritualities and Emerging Search for Social Transformation: An Embodied Ethnographic Study of Yoga Practices in Medellin, Colombia

Authors: Lina M. Vidal

Abstract:

This paper discusses the involvement of yoga practices in both self-transformation and social transformation, by means of an embodied ethnographic approach to different initiatives for social change in Medellín. In the context of the gradual popularization of embodied spiritualities, yoga practices have opened their way into calls for social change in a performative perspective that involves collective experiences, reflections, and the production of embodied knowledge. Through reflection on the bodily dimension and corporal experience, this ethnographic approach acknowledges inter-corporality and somatic modes of attention during observations and personal experiences. In social change initiatives that include yoga practices, transformations were identified in the common understanding of social issues as produced by institutionalized education, the health system, and other fields of knowledge. This is clearly visible in yoga projects for children in vulnerable conditions, homeless people, prisoners, and young people recovering from drug addiction. These projects are often promoted by organizations and networks that incorporate individual life stories into collective experiences. The dissemination of yoga is leading to a broad institutional and cultural legitimation of yoga and of spirituality that impacts different fields of social work and everyday life in general. In this way, yoga is becoming an embodied activist way of life and a legitimate field for social work.

Keywords: embodied ethnography, Medellin, social transformation, embodied spiritualities, yoga practices

Procedia PDF Downloads 168
4875 Machine Learning Automatic Detection on Twitter Cyberbullying

Authors: Raghad A. Altowairgi

Abstract:

With the widespread use of social media platforms, young people tend to use them extensively as their first means of communication due to their ease and modernity. But these platforms often create fertile ground for bullies to practice aggressive behavior against their victims. Platform usage cannot be reduced, but intelligent mechanisms can be implemented to reduce the abuse. This is where machine learning comes in: understanding and classifying text can help minimize acts of cyberbullying. Artificial intelligence techniques have expanded to form an applied tool for addressing the phenomenon of cyberbullying. In this research, machine learning models are built to classify text into two classes, cyberbullying and non-cyberbullying. The data are preprocessed in four stages: removing characters that provide no meaningful information to the models, tokenization, stop-word removal, and lowercasing. BoW and TF-IDF are used as the main features for five classifiers: logistic regression, Naïve Bayes, random forest, XGBoost, and CatBoost, which score 92%, 90%, 92%, 91%, and 86%, respectively.
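
A minimal sketch of the TF-IDF feature extraction plus one of the five classifiers (logistic regression); the toy messages and labels are invented placeholders, not the study's corpus.

```python
# TF-IDF + logistic regression pipeline for cyberbullying detection (sketch).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts  = ["you are a great friend", "nobody likes you, loser"]  # toy data
labels = [0, 1]                   # 0 = non-cyberbullying, 1 = cyberbullying

clf = make_pipeline(
    TfidfVectorizer(lowercase=True, stop_words="english"),  # preprocessing
    LogisticRegression(),
)
clf.fit(texts, labels)
print(clf.predict(["you are such a loser"]))  # predicted label for a new text
```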

Keywords: cyberbullying, machine learning, Bag-of-Words, term frequency-inverse document frequency, natural language processing, Catboost

Procedia PDF Downloads 113
4874 Investigating the Impact of Job-Related and Organisational Factors on Employee Engagement: An Emotionally Relevant Approach Based on Psychological Climate and Organisational Emotional Intelligence (OEI)

Authors: Nuno Da Camara, Victor Dulewicz, Malcolm Higgs

Abstract:

This study investigates the impact of job-related and organisational factors on employee engagement. In particular, although theorists have described the critical role of emotional cognition of the workplace environment as an antecedent to employee engagement, empirical research on the impact of emotional cognition on engagement is limited; previous researchers have typically provided evidence of the link between emotional cognition of the workplace environment and workplace attitudes such as job satisfaction and organisational commitment. This study therefore aims to investigate the impact of emotional cognition of the job, role, leader, and organisation domains of the work environment – as represented by measures of psychological climate and organisational emotional intelligence (OEI) – on employee engagement. The research is based on a quantitative cross-sectional survey of employees in a UK charity organisation (n=174). The research instruments applied include the psychological climate scale, the organisational emotional intelligence questionnaire (OEIQ), and the Utrecht Work Engagement Scale (UWES). The data were analysed using hierarchical regression and partial least squares (PLS) analytical techniques. The results show that both psychological climate and OEI, which represent emotional cognition of the job, role, leader, and organisation domains in the workplace, are significant drivers of employee engagement. In particular, the study found that a sense of contribution and challenge at work are the strongest drivers of vigour, dedication, and absorption, which highlights the importance of emotionally relevant approaches in furthering our understanding of workplace engagement.
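
A minimal sketch of hierarchical (blockwise) regression as applied above: predictor blocks are entered sequentially and the change in R² is inspected. Variable names and data are placeholders for the psychological climate and OEI measures, not the survey data.

```python
# Hierarchical regression sketch: compare R^2 as predictor blocks are added.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "contribution": rng.normal(size=174),   # psychological climate facets
    "challenge":    rng.normal(size=174),
    "oei":          rng.normal(size=174),   # organisational EI score
})
df["engagement"] = (0.5 * df.contribution + 0.4 * df.challenge
                    + 0.2 * df.oei + rng.normal(0, 1, 174))

blocks = [["contribution", "challenge"], ["contribution", "challenge", "oei"]]
for cols in blocks:
    model = sm.OLS(df["engagement"], sm.add_constant(df[cols])).fit()
    print(f"block {cols}: R^2 = {model.rsquared:.3f}")
```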

Keywords: employee engagement, organisational emotional intelligence, psychological climate, workplace attitudes

Procedia PDF Downloads 483
4873 Ripple Effect Analysis of Government Investment for Research and Development by the Artificial Neural Networks

Authors: Hwayeon Song

Abstract:

The long-term purpose of research and development (R&D) programs is to strengthen national competitiveness by developing new knowledge and technologies. It is therefore important to determine a proper budget for government programs so as to maintain the vigor of R&D when total funding is tight due to the national deficit. In this regard, a ripple effect analysis of budgetary changes in R&D programs is necessary, as well as an investigation of their current status. This study proposes a new approach using Artificial Neural Networks (ANN) for both tasks, focusing on R&D programs related to Construction and Transportation (C&T) technology in Korea. First, key factors in C&T technology are explored to draw impact indicators in three areas: economy, society, and science and technology (S&T). Simultaneously, an ANN is employed to evaluate the relationships between data variables. From this process, four major components of R&D are assessed: research personnel, expenses, management, and equipment. A ripple effect analysis is then performed to examine changes in a hypothetical future by modifying current data. The findings can offer an alternative strategy for R&D programs as well as a new analysis tool.

Keywords: Artificial Neural Networks, construction and transportation technology, Government Research and Development, Ripple Effect

Procedia PDF Downloads 228
4872 Low Power Glitch Free Dual Output Coarse Digitally Controlled Delay Lines

Authors: K. Shaji Mon, P. R. John Sreenidhi

Abstract:

In deep-submicrometer CMOS processes, the time-domain resolution of a digital signal is becoming higher than the voltage resolution of analog signals. This claim is nowadays pushing toward a new circuit design paradigm in which traditional analog signal processing is expected to be progressively substituted by the processing of time in the digital domain. Within this novel paradigm, digitally controlled delay lines (DCDL) play the role that digital-to-analog converters play in traditional, analog-intensive circuits. Digital delay-locked loops are highly prevalent in integrated systems. The proposed paper addresses the glitches present in delay circuits, along with area, power dissipation, and signal integrity. The DCDLs under study have been designed in a 90 nm CMOS technology with six copper metal layers, strained SiGe, and a low-k dielectric. Simulation and synthesis results show that the novel circuits exhibit no glitches for the dual-output coarse DCDL, with less power dissipation and less area than the glitch-free NAND-based DCDL.

Keywords: glitch free, NAND-based DCDL, CMOS, deep-submicrometer

Procedia PDF Downloads 238
4871 Estimating Poverty Levels from Satellite Imagery: A Comparison of Human Readers and an Artificial Intelligence Model

Authors: Ola Hall, Ibrahim Wahab, Thorsteinn Rognvaldsson, Mattias Ohlsson

Abstract:

The subfield of poverty and welfare estimation that applies machine learning tools and methods to satellite imagery is nascent but rapidly growing. This is partly driven by the Sustainable Development Goals, whose overarching principle is that no region is left behind. Among other things, this requires that welfare levels can be accurately and rapidly estimated at different spatial scales and resolutions. Conventional tools of household surveys and interviews do not suffice in this regard: while they are useful for gaining a longitudinal understanding of the welfare levels of populations, they offer neither adequate spatial coverage for the accuracy that is needed, nor sufficiently swift implementation to gain a timely insight into people and places. It is this void that satellite imagery fills. Previously, this was near-impossible to implement due to the sheer volume of data that needed processing. Recent advances in machine learning, especially the deep learning (DL) subtype, such as deep neural networks, have made this a rapidly growing area of scholarship. Despite their unprecedented performance, such models lack transparency and explainability and have thus seen limited downstream application, as humans are generally apprehensive of techniques that are not inherently interpretable and trustworthy. While several studies have demonstrated the superhuman performance of AI models, none has directly compared the performance of such models and human readers in the domain of poverty studies. In the present study, we directly compare the performance of human readers and a DL model using different resolutions of satellite imagery to estimate the welfare levels of Demographic and Health Survey (DHS) clusters in Tanzania, using the wealth quintile ratings from the same survey as the ground truth data. The cluster-level imagery covers all 608 cluster locations, of which 428 were classified as rural. The imagery for the human readers was sourced from the Google Maps Platform at an ultra-high resolution of 0.6 m per pixel at zoom level 18, while that of the machine learning model was sourced from the comparatively lower-resolution Sentinel-2 data at 10 m per pixel for the same cluster locations. Rank correlation coefficients of 0.31-0.32 achieved by the human readers were much lower than those attained by the machine learning model, 0.69-0.79. This superhuman performance by the model is all the more significant given that it was trained on the relatively lower 10-meter resolution satellite data, while the human readers estimated welfare levels from the higher 0.6 m spatial resolution data, in which key markers of poverty and slums – roofing and road quality – are discernible. It is important to note, however, that the human readers did not receive any training before rating, and had this been done, their performance might have improved. The stellar performance of the model also comes with the inevitable shortfall of limited transparency and explainability. The findings have significant implications for attaining the objective of the current frontier of deep learning models in this domain of scholarship – eXplainable Artificial Intelligence – through a collaborative rather than a comparative framework.
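
The rank correlations reported above can be computed with a one-liner; a sketch follows, with invented placeholder ratings standing in for the DHS ground truth, the model estimates, and the human-reader estimates.

```python
# Spearman rank correlation of predicted wealth ratings vs. DHS ground truth.
from scipy.stats import spearmanr

ground_truth   = [1, 2, 3, 3, 4, 5, 2, 1, 4, 5]   # DHS wealth quintiles
model_estimate = [1, 3, 3, 2, 4, 5, 2, 2, 5, 4]   # e.g., DL model on Sentinel-2
human_estimate = [2, 1, 4, 3, 2, 4, 1, 3, 5, 2]   # readers on 0.6 m imagery

rho_model, _ = spearmanr(ground_truth, model_estimate)
rho_human, _ = spearmanr(ground_truth, human_estimate)
print(f"model rho = {rho_model:.2f}, human rho = {rho_human:.2f}")
```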

Keywords: poverty prediction, satellite imagery, human readers, machine learning, Tanzania

Procedia PDF Downloads 84
4870 Digital Twin Smart Hospital: A Guide for Implementation and Improvements

Authors: Enido Fabiano de Ramos, Ieda Kanashiro Makiya, Francisco I. Giocondo Cesar

Abstract:

This study investigates the application of Digital Twins (DT) in Smart Hospital Environments (SHE) through a bibliometric study and literature review, including a comparison with the principles of Industry 4.0. It aims to analyze the current state of digital twin implementation in clinical and non-clinical operations in healthcare settings, identifying trends and challenges and comparing these practices with Industry 4.0 concepts and technologies, in order to present a basic framework of stages and maturity levels. The bibliometric methodology maps the existing scientific production on the theme, while the literature review synthesizes and critically analyzes the relevant studies, highlighting pertinent methodologies and results; additionally, the comparison with Industry 4.0 provides insights into how the principles of automation, interconnectivity, and digitalization can be applied in healthcare environments and operations, aiming at improvements in operational efficiency and quality of care. The results of this study contribute to a deeper understanding of the potential of digital twins in smart hospitals and of the future potential of effectively integrating Industry 4.0 concepts in this specific environment, presented through a practical framework. The urgent need for the changes addressed in this article is undeniable, as is their contribution to human sustainability as envisioned in SDG 3, health and well-being: ensuring that all citizens have a healthy life and well-being, at all ages and in all situations. The validity of these relationships will be constantly discussed, and technology can always change the rules of the game.

Keywords: digital twin, smart hospital, healthcare operations, industry 4.0, SDG3, technology

Procedia PDF Downloads 38
4869 Traditional Drawing, BIM and Erudite Design Process

Authors: Maryam Kalkatechi

Abstract:

Nowadays, parametric design, scientific analysis, and digital fabrication are dominant, and many architectural practices increasingly seek to incorporate advanced digital software and fabrication in their projects. The erudite design process proposed here, which combines digital and practical aspects in a strong frame within the method, resulted from the author's dissertation research. The digital aspects are the progressive advancements in algorithmic design and simulation software; these have helped firms develop more holistic concepts at the early stage and maintain collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to implement construction and architectural knowledge within the algorithm to make the design process successful. It also incorporates the ongoing improvements in applying 3D printing to construction, achieved through 'data-sketches'. The term 'data-sketch' was developed by the author in the recently completed dissertation; a data-sketch accommodates the architect's decisions on the algorithm. This paper introduces the erudite design process and its components and summarizes the application of this process in the development of a '3D printed construction unit'. The paper contributes to bridging academia and practice with advanced technology by presenting a design process that transfers the dominance of the tool to the learned architect and encourages innovation in design processes.

Keywords: erudite, data-sketch, algorithm design in architecture, design process

Procedia PDF Downloads 260
4868 Net Regularity and Its Ethical Implications on Internet Stakeholders

Authors: Nourhan Elshenawi

Abstract:

Net Neutrality (NN) is the principle of treating all online data the same, without prioritizing some over others. A research gap in current scholarship about violations of NN and the subsequent ethical concerns paves the way for the following research question: to what extent do violations of NN entail ethical concerns and implications for Internet stakeholders? To answer this question, such violations, here termed Net Regularity (NR), are examined using the two major action-based ethical theories, Kantian and utilitarian, across the relevant Internet stakeholders. First, the necessary IT background is provided on how the Internet works and who the key stakeholders are. Following the IT background, the relationships between the stakeholders, namely users, Internet Service Providers (ISPs), and content providers, are discussed and illustrated. Then, violations of NN that are currently occurring without attracting the attention of the general public are covered from an ethical perspective. Afterwards, the current scholarship on NN and its violations is discussed; it is written mainly from economic and sociopolitical perspectives, which highlights the lack of ethical discussion of the issue. Before moving on to the ethical analysis, however, websites are presented as digital entities that are affected by NR, and their happiness is measured using functionalism. The analysis concludes that NR is prone to unethical treatment of Internet stakeholders from the perspective of both theories. Finally, the current digital divide in the world is presented to better illustrate the implications of NR: a new Internet divide that will take place between individuals within society. By answering the research question through ethical analysis, the paper attempts to shed some light on the issue of NR and the kind of society it would lead to. NR would lead not just to a divided society, but to divided individuals, separated by something greater than distance: the Internet.

Keywords: digital divide, digital entities, digital ontology, internet ethics, internet law, net neutrality, internet service providers, websites as beings

Procedia PDF Downloads 257
4867 Smart Airport: Application of Internet of Things for Confronting Airport Challenges

Authors: Ali Safaeianpour, Nima Shamandi

Abstract:

As air traffic expands, many airports have evolved into transit centers for people, information, and commerce, and technology implementation is an integral part of airport development. Several challenges stand in the way of implementing technology in an airport. Airport 4.0 proposes the "Smart Airport" concept, which focuses on using modern technologies such as Big Data, the Internet of Things (IoT), advanced biometric systems, blockchain, and cloud computing to alter and enhance passengers' journeys. Several common concrete IoT topics that serve as partial keys to smart airports are discussed and introduced, ranging from automated check-in systems to exterior tracking processes, with the goal of eliciting more insightful ideas and proposals about smart airport solutions. IoT will dramatically alter people's lives by infusing intelligence, boosting quality of life, and making it smarter. This paper reviews the approaches to transforming an airport into a smart airport and describes several enabling components of IoT, as well as the challenges that can hinder the implementation of smart airport functions and need to be addressed.

Keywords: airport 4.0, digital airport, smart airport, IoT

Procedia PDF Downloads 100
4866 Density Measurement of Underexpanded Jet Using Stripe Patterned Background Oriented Schlieren Method

Authors: Shinsuke Udagawa, Masato Yamagishi, Masanori Ota

Abstract:

The Schlieren method, which has been conventionally used to visualize high-speed flows, has disadvantages such as the complexity of the experimental setup and the inability to quantitatively analyze the amount of refraction of light. The Background Oriented Schlieren (BOS) method proposed by Meier is one measurement method that solves the problems mentioned above. Like the Schlieren method, the BOS method exploits the refraction of light; it is characterized by using a digital camera to capture images of a background behind the observation area, which are later analyzed by a computer to quantitatively detect the shift of the background image. The experimental setup for BOS does not require the concave mirrors, pinholes, or color filters that are necessary in the conventional Schlieren method, thus simplifying the experimental setup. However, the BOS method suffers from defocus: since the camera is focused on the background image, the observed object is out of focus, and this defocus grows with the distance between the background and the object. On the other hand, a larger distance yields higher sensitivity. It is therefore necessary to set the distance between the background and the object appropriately for the experiment, considering the trade-off between defocus and sensitivity. The purpose of this study is to experimentally clarify the effect of defocus on density field reconstruction. In this study, visualization experiments of an underexpanded jet were performed using the BOS measurement system we constructed, with a Ronchi ruling as the background. The reservoir pressure of the jet and the distance between the camera and the jet axis were fixed, and the distance between the background and the jet axis was varied as the parameter. The images were later analyzed on a personal computer to quantitatively detect the shift of the background image by comparing the background pattern with the captured image of the underexpanded jet. The measured shifts were reconstructed into a density field using the Abel transformation and the Gladstone-Dale equation. From the experimental results, it is found that the reconstructed density image becomes more blurred, and the noise decreases, as the distance between the background and the axis of the underexpanded jet increases. Consequently, it is clarified that the sensitivity constant should be greater than 20 and the circle of confusion diameter should be less than 2.7 mm, at least in this experimental setup.
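
For reference, the two reconstruction relations named above, in their standard textbook forms (not reproduced from the paper): the inverse Abel transform recovers the radial refractive-index disturbance f(r) of the axisymmetric jet from the line-of-sight deflection data F(y) obtained from the background image shift, and the Gladstone-Dale relation converts refractive index n to density ρ:

\[
f(r) = -\frac{1}{\pi} \int_{r}^{\infty} \frac{\mathrm{d}F/\mathrm{d}y}{\sqrt{y^{2} - r^{2}}}\,\mathrm{d}y,
\qquad
n(r) - 1 = K\,\rho(r),
\]

where K is the Gladstone-Dale constant of the gas.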

Keywords: BOS method, underexpanded jet, Abel transformation, density field visualization

Procedia PDF Downloads 49
4865 Mental Health Representation in Video Games

Authors: Leonid Rybakovski

Abstract:

Contemporary media offer a variety of themes for the diverse tastes of their audiences. The digital games medium has mostly been perceived as an instrument of entertainment, but being part of global trends, constantly pushing the boundaries of storytelling in virtual reality, and standing at the edge of technology also brings huge responsibility for game designers around the globe. A topic that has emerged over recent years is the individual's mental state, and there has been a shift in the representation of mental problems in commercial game releases such as Hellblade: Senua's Sacrifice and Sea of Solitude. The aim of this study is to research the approaches to representing mental illness in media and digital games over the years and to suggest alternatives that put characters who suffer from mental illness at the forefront of the storyline. This study traces dominant representations of characters with mental illness in digital games, reflecting the major shift of the game industry toward inclusiveness. At the same time, the research embraces a hybrid approach to the academic study of digital games and includes the development of a game that follows a post-traumatic young girl, forcing users to live her life through her eyes. The game prototype was developed as part of the MDes Game Design and Development program and consisted of academic research and game development practices.

Keywords: framing analysis, mental condition, up keying, game mechanics

Procedia PDF Downloads 162
4864 Performance Comparison of ADTree and Naive Bayes Algorithms for Spam Filtering

Authors: Thanh Nguyen, Andrei Doncescu, Pierre Siegel

Abstract:

Classification is an important data mining technique and can be used for data filtering in artificial intelligence. The broad applicability of classification to all kinds of data has led to its use in nearly every field of modern life. Classification helps us group different items according to the features judged interesting and useful. In this paper, we compare two classification methods, Naïve Bayes and ADTree, used to detect spam e-mail. This choice is motivated by the fact that the Naïve Bayes algorithm is based on probability calculus, while the ADTree algorithm is based on decision trees. The parameter settings of the above classifiers maximize the true positive rate and minimize the false positive rate. The experimental results present classification accuracy and cost analysis in view of the optimal classifier choice for spam detection. We also point out the number of attributes needed to obtain a trade-off between the number of attributes and the classification accuracy.
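
A minimal sketch of the Naïve Bayes side of the comparison (ADTree has no standard scikit-learn implementation, so it is omitted here); the e-mails and labels are toy placeholders.

```python
# Multinomial Naive Bayes spam filter over word counts (sketch).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

emails = ["win a free prize now", "meeting agenda for monday",
          "cheap pills online", "lunch at noon?"]
labels = [1, 0, 1, 0]                       # 1 = spam, 0 = ham

spam_filter = make_pipeline(CountVectorizer(), MultinomialNB())
spam_filter.fit(emails, labels)
print(spam_filter.predict(["free pills, win now"]))   # expected: [1]
```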

Keywords: classification, data mining, spam filtering, naive bayes, decision tree

Procedia PDF Downloads 397
4863 Research and Application of the Three-Dimensional Visualization Geological Modeling of Mine

Authors: Bin Wang, Yong Xu, Honggang Qu, Rongmei Liu, Zhenji Gao

Abstract:

Today's mining industry is advancing gradually in a digital and visual direction. Three-dimensional visualization geological modeling of mines is the digital characterization of a mineral deposit and one of the key technologies of the digital mine. Three-dimensional geological modeling is a technology that combines geological spatial information management, geological interpretation, geological spatial analysis and prediction, geostatistical analysis, entity content analysis, and graphic visualization in a three-dimensional computer environment, and it is used in geological analysis. In this paper, a three-dimensional geological model of an iron mine is constructed using Surpac; the weighting differences between the two estimation methods, inverse distance weighting (the distance power inverse ratio method) and ordinary kriging, are studied; and the ore body volume and reserves are simulated and calculated by both methods. Compared with the actual mine reserves, the results are relatively accurate, providing a scientific basis for mine resource assessment, reserve calculation, mining design, and so on.
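
A minimal sketch of the inverse distance weighting estimator, the simpler of the two grade interpolation methods compared above; the sample coordinates and grades are invented. Ordinary kriging differs in that the weights come from a fitted variogram model rather than from 1/d^p.

```python
# Inverse distance weighting (IDW) grade estimation at a query point (sketch).
import numpy as np

samples = np.array([[0.0, 0.0, 10.0],   # x, y, known ore grade
                    [5.0, 0.0, 14.0],
                    [0.0, 5.0, 12.0]])

def idw(x, y, power=2.0):
    d = np.hypot(samples[:, 0] - x, samples[:, 1] - y)
    if d.min() < 1e-12:                  # query coincides with a sample
        return samples[d.argmin(), 2]
    w = 1.0 / d ** power                 # closer samples weigh more
    return (w * samples[:, 2]).sum() / w.sum()

print(f"estimated grade at (2, 2): {idw(2.0, 2.0):.2f}")
```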

Keywords: three-dimensional geological modeling, geological database, geostatistics, block model

Procedia PDF Downloads 57
4862 A Review on Water Models of Surface Water Environment

Authors: Shahbaz G. Hassan

Abstract:

Water quality models are very important for predicting changes in surface water quality for environmental management. The aim of this paper is to give an overview of water quality models and to provide directions for selecting models in specific situations. Water quality models include one kind based on a mechanistic approach, while other models simulate water quality without considering a mechanism. Mechanistic models can be widely applied and are capable of long-term simulation, at the cost of high complexity; therefore, more space is devoted here to explaining the principles and application experience of mechanistic models. Mechanistic models make certain assumptions about rivers, lakes, and estuaries, which limits their range of application, and this paper introduces their principles and applications under these three scenarios. Empirical models, on the other hand, are easier to compute and are not limited by geographical conditions, but they cannot be used with confidence to simulate long-term changes. This paper divides the empirical models into two broad categories according to their mathematical algorithms: models based on artificial intelligence and models based on statistical methods.

Keywords: empirical models, mathematical, statistical, water quality

Procedia PDF Downloads 245
4861 An Examination of the Relationship between the Five Stages of the Yogacara Path to Enlightenment and the Ten Ox-Herding Pictures

Authors: Kyungbong Kim

Abstract:

This study compares and analyses the five stages of cultivating the Yogâcāra path and the spiritual journey depicted in the Ten Ox-Herding Pictures. To achieve this, the study investigated the core concepts and practice methods of the two approaches and analysed their relations based on the literature reviewed. The results show that the end goal of the two approaches is the same, the attainment of Buddhahood, and that they share common characteristics, including the practice of awareness of impermanence and non-self and the fulfilment of benefiting sentient beings. The results suggest that our Buddhist practice system needs to sincerely consider realistic ways to help people in agony in contemporary society, not by emphasizing enlightenment through one specific practice for all people, but through practice methods tailored to each person's faculties in understanding Buddhism.

Keywords: transformation of consciousness to wisdom, enlightenment, the five stages of cultivating the Yogacāra path, the Ten Ox-Herding Pictures, transformation of the basis

Procedia PDF Downloads 251
4860 Neural Synchronization - The Brain’s Transfer of Sensory Data

Authors: David Edgar

Abstract:

To understand how the brain's subconscious and conscious functions work, we must conquer the physics of Unity, which leads to duality's algorithm, where the subconscious (bottom-up) and conscious (top-down) processes function together to produce and consume intelligence. We use terms like 'time is relative,' but do we really understand the meaning? In the brain, there are different processes and, therefore, different observers, and these different processes experience time at different rates. A sensory system such as the eyes cycles its measurements around every 33 milliseconds, the conscious process of the frontal lobe cycles at 300 milliseconds, and the subconscious process of the thalamus cycles at 5 milliseconds: three different observers experiencing time differently. To bridge the observers, the thalamus, the fastest of the processes, maintains a synchronous state and entangles the different components of the brain's physical process. The entanglements form a synchronous cohesion between the brain components, allowing them to share the same state and execute in the same measurement cycle. The thalamus uses the shared state to control the firing sequence of the brain's linear subconscious process. Sharing state also allows the brain to cheat on the amount of sensory data that must be exchanged between components: only unpredictable motion is transferred through the synchronous state, because predictable motion already exists in the shared framework. The brain's synchronous subconscious process is entirely based on energy conservation, where prediction regulates energy usage. So, every 33 milliseconds the eyes dump their sensory data into the thalamus, and the thalamus performs a motion measurement to identify the unpredictable motion in the sensory data. Here is the trick: the thalamus conducts its measurement based on the original observation time of the sensory system (33 ms), not its own process time (5 ms). This creates a data payload of synchronous motion that preserves the original sensory observation – basically, a frozen moment in time (Flat 4D). The single moment in time can then be processed through the single state maintained by the synchronous process. Other processes, such as consciousness (300 ms), can interface with the synchronous state to generate awareness of that moment. Synchronous data traveling through a separate, faster synchronous process creates a theoretical time tunnel, where observation time is tunneled through the synchronous process and reproduced on the other side in the original time-relativity. The synchronous process eliminates time dilation by simply removing itself from the equation, so that its own process time does not alter the experience. To the original observer, the measurement appears to be instantaneous, but in the thalamus a linear subconscious process generating sensory perception and thought production is being executed. It all occurs in the time available, because other observation times are slower than the thalamic measurement time. For life to exist in the physical universe requires a linear measurement process; it just hides by operating at a faster time relativity. What's interesting is that time dilation is not the problem; it's the solution. Einstein said there was no universal time.

Keywords: neural synchronization, natural intelligence, 99.95% IoT data transmission savings, artificial subconscious intelligence (ASI)

Procedia PDF Downloads 111
4859 FISCEAPP: FIsh Skin Color Evaluation APPlication

Authors: J. Urban, Á. S. Botella, L. E. Robaina, A. Bárta, P. Souček, P. Císař, Š. Papáček, L. M. Domínguez

Abstract:

Skin coloration in fish is of great physiological, behavioral, and ecological importance and can be considered an index of animal welfare in aquaculture as well as an important quality factor in retail value. Currently, in order to compare color in animals fed on different diets, biochemical analysis and colorimetry of caught, mildly anesthetized, or dead fish provide very accurate and meaningful measurements. Here, a noninvasive method using digital images of the fish body was developed as a standalone application. The application deals with the computational burden and memory consumption of large input files, optimizing piecewise processing and analysis with respect to the memory/computation-time ratio. For the comparison of color distributions across experiments and different color spaces (RGB, CIE L*a*b*), a comparable semi-equidistant binning of the multi-channel representation is introduced, derived from the known quantization levels and the Freedman-Diaconis rule. Color calibration and the camera responsivity function were a necessary part of the measurement process.
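
A minimal sketch of the Freedman-Diaconis rule mentioned above, which sets the histogram bin width for a channel from its interquartile range, h = 2·IQR·n^(-1/3); the pixel values are random placeholders for one color channel.

```python
# Freedman-Diaconis binning of one color channel (sketch).
import numpy as np

channel = np.random.default_rng(3).integers(0, 256, size=10_000)  # e.g., R

q75, q25 = np.percentile(channel, [75, 25])
bin_width = 2 * (q75 - q25) * len(channel) ** (-1 / 3)   # Freedman-Diaconis
n_bins = int(np.ceil((channel.max() - channel.min()) / bin_width))
hist, edges = np.histogram(channel, bins=n_bins)
print(f"bin width = {bin_width:.2f}, bins = {n_bins}")
```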

Keywords: color distribution, fish skin color, piecewise transformation, object to background segmentation

Procedia PDF Downloads 246
4858 A Tool for Facilitating an Institutional Risk Profile Definition

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the easy creation of an institutional risk profile for endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to set up risk factors with just the values most important to a particular organisation. The risk profile then employs fuzzy models and associated configurations for the file format metadata aggregator to support digital preservation experts with a semi-automatic estimation of the endangerment level of file formats. Our goal is to make use of a domain expert knowledge base, aggregated from a digital preservation survey, to detect preservation risks for a particular institution. Another contribution is support for the visualisation and analysis of risk factors along a required dimension. The proposed methods improve the visibility of risk factor information and the quality of the digital preservation process. The presented approach is meant to facilitate decision-making for the preservation of digital content in libraries and archives, using domain expert knowledge and file format metadata aggregated automatically from linked open data sources. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector, and the goal is to visualise particular dimensions of this vector for analysis by an expert. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.

Keywords: digital information management, file format, endangerment analysis, fuzzy models

Procedia PDF Downloads 391
4857 Songwriting in the Postdigital Age: Using TikTok and Instagram as Online Informal Learning Technologies

Authors: Matthias Haenisch, Marc Godau, Julia Barreiro, Dominik Maxelon

Abstract:

In times of ubiquitous digitalization and the increasing entanglement of humans and technologies in the musical practices of the 21st century, the question arises of how popular musicians learn in the (post)digital age. Against the backdrop of increasing interest in transferring informal learning practices into formal settings of music education, the interdisciplinary research association »MusCoDA – Musical Communities in the (Post)Digital Age« (University of Erfurt/University of Applied Sciences Clara Hoffbauer Potsdam, funded by the German Ministry of Education and Research) pursues the goal of deriving an empirical model of collective songwriting practices from the study of the informal learning of songwriters and bands that can be translated into pedagogical concepts for music education in schools. Drawing on concepts from Community of Musical Practice and Actor-Network Theory, learning is considered not only as social practice and as participation in online and offline communities, but also as an effect of heterogeneous networks composed of human and non-human actors. Learning is not seen as an individual, cognitive process, but as the formation and transformation of actor networks, i.e., as a practice of assembling and mediating humans and technologies. Based on video-stimulated recall interviews and videography of online and offline activities, songwriting practices are followed from the initial idea to different forms of performance and distribution. The data evaluation combines coding and mapping methods from Grounded Theory Methodology and Situational Analysis. This results in network maps in which both the temporality of creative practices and the material and spatial relations of human and technological actors are reconstructed. In addition, positional analyses document the power relations between the participants that structure the learning process of the field. In the area of online informal learning, initial key research findings reveal a transformation of the learning subject through the specific technological affordances of TikTok and Instagram and the accompanying changes in the learning practices of the corresponding online communities. Learning is explicitly shaped by the material agency of online tools and features and by the social practices entangled with these technologies. Thus, any human online community member can be invited to intervene directly in creative decisions that contribute to the further compositional and structural development of songs. At the same time, participants can give each other intimate insights into songwriting processes in progress and have the opportunity to perform together with strangers and idols. Online learning is characterized by an increase in social proximity, a distribution of creative agency, and informational exchange between participants. While it seems obvious that traditional notions not only of learning but also of the learning subject cannot be maintained, the question arises of how exactly the observed informal learning practices, and the subject that emerges from the use of social media as online learning technologies, can be transferred into contexts of formal learning.

Keywords: informal learning, postdigitality, songwriting, actor-network theory, community of musical practice, social media, TikTok, Instagram, apps

Procedia PDF Downloads 109
4856 Democracy in Gaming: An Artificial Neural Network Based Approach towards Rule Evolution

Authors: Nelvin Joseph, K. Krishna Milan Rao, Praveen Dwarakanath

Abstract:

The explosive growth of smartphones around the world has shifted the primary engagement tool for entertainment from traditional consoles and music players to a single integrated device. Augmented reality is the next big shift, bringing a new dimension to play. The paper explores the construct and working of the community engine in Delta T, an augmented reality game that allows users to evolve the rules of the game through collective bargaining, mirroring democracy even in a gaming world.

Keywords: augmented reality, artificial neural networks, mobile application, human computer interaction, community engine

Procedia PDF Downloads 308
4855 Curvature-Based Methods for Automatic Coarse and Fine Registration in Dimensional Metrology

Authors: Rindra Rantoson, Hichem Nouira, Nabil Anwer, Charyar Mehdi-Souzani

Abstract:

Multiple measurements with various data acquisition systems are generally required to measure the shape of freeform workpieces for accuracy, reliability, and holisticity. The obtained data are aligned and fused into a common coordinate system by a registration technique involving coarse and fine registration. Standardized iterative methods such as Iterative Closest Point (ICP) and its variants have been established for fine registration. For coarse registration, no conventional method has been adopted yet, despite a significant number of techniques developed in the literature to supply automatic rough matching between data sets. Two main issues are addressed in this paper: coarse registration and fine registration. For coarse registration, two novel automated methods based on the exploitation of discrete curvatures are presented: an enhanced Hough Transformation (HT) and an improved RANSAC transformation. The use of curvature features in both methods aims to reduce computational cost. For fine registration, a new variant of the ICP method is proposed to reduce registration error using curvature parameters. A specific distance considering curvature similarity has been combined with the Euclidean distance to define the distance criterion used in the correspondence search. Additionally, the objective function has been improved by combining point-to-point (P-P) minimization and point-to-plane (P-Pl) minimization with automatic weights, determined from the curvature features calculated beforehand at each point of the workpiece surface. The algorithms are applied to simulated data and to real data acquired by a computed tomography (CT) system. The obtained results reveal the benefit of the proposed novel curvature-based registration methods.
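
A minimal sketch of the classical point-to-point ICP baseline that the curvature-weighted variant above builds on; the point sets are synthetic, and the curvature-similarity distance term and point-to-plane weighting are omitted.

```python
# Point-to-point ICP iteration with SVD-based rigid alignment (sketch).
import numpy as np
from scipy.spatial import cKDTree

def icp_point_to_point(src, dst, n_iter=20):
    tree = cKDTree(dst)
    R, t = np.eye(3), np.zeros(3)
    for _ in range(n_iter):
        moved = src @ R.T + t
        _, idx = tree.query(moved)           # nearest-neighbor correspondences
        p, q = moved, dst[idx]
        p0, q0 = p - p.mean(0), q - q.mean(0)
        U, _, Vt = np.linalg.svd(p0.T @ q0)  # optimal rotation via SVD (Kabsch)
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:        # guard against reflections
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        t_step = q.mean(0) - p.mean(0) @ R_step.T
        R, t = R_step @ R, R_step @ t + t_step   # compose transforms
    return R, t

rng = np.random.default_rng(4)
theta = 0.2                                  # small initial misalignment
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
dst = rng.normal(size=(300, 3))
src = dst @ Rz.T + np.array([0.3, -0.1, 0.2])
R, t = icp_point_to_point(src, dst)          # should roughly undo (Rz, offset)
```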

Keywords: discrete curvature, RANSAC transformation, Hough transformation, coarse registration, ICP variant, point-to-point and point-to-plane minimization combination, computed tomography

Procedia PDF Downloads 411