Search results for: digital transformation artificial intelligence
5082 Performance Comparison of ADTree and Naive Bayes Algorithms for Spam Filtering
Authors: Thanh Nguyen, Andrei Doncescu, Pierre Siegel
Abstract:
Classification is an important data mining technique and can be used for data filtering in artificial intelligence. Because classification applies to almost every kind of data, it is used in nearly every field of modern life. Classification groups items according to the features deemed interesting and useful. In this paper, we compare two classification methods, Naïve Bayes and ADTree, used to detect spam e-mail. This choice is motivated by the fact that the Naive Bayes algorithm is based on probability calculus, while the ADTree algorithm is based on a decision tree. The parameters of both classifiers are set to maximize the true positive rate and minimize the false positive rate. The experimental results present classification accuracy and cost analysis with a view to choosing the optimal classifier for spam detection. The number of attributes required to reach a trade-off between attribute count and classification accuracy is also pointed out.
Keywords: classification, data mining, spam filtering, naive bayes, decision tree
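For illustration only, here is a minimal sketch of the probabilistic half of the comparison: a multinomial Naive Bayes spam filter built with scikit-learn on a toy corpus. The messages, labels and library choice are assumptions made for the example and do not reproduce the authors' dataset, attribute selection or ADTree experiments.

```python
# Illustrative sketch only: a multinomial Naive Bayes spam filter on a toy corpus.
# The messages below are invented for demonstration; the paper's own dataset,
# attribute selection and ADTree comparison are not reproduced here.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import confusion_matrix

messages = [
    "win a free prize now", "cheap meds limited offer", "claim your reward today",
    "meeting rescheduled to monday", "please review the attached report", "lunch at noon?",
]
labels = [1, 1, 1, 0, 0, 0]  # 1 = spam, 0 = ham

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)

model = MultinomialNB()
model.fit(X, labels)

test = vectorizer.transform(["free reward offer", "report for the monday meeting"])
print(model.predict(test))  # expected: [1 0]

# True/false positive rates (the tuning criteria named in the abstract) on the toy set:
tn, fp, fn, tp = confusion_matrix(labels, model.predict(X)).ravel()
print("TPR:", tp / (tp + fn), "FPR:", fp / (fp + tn))
```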
Procedia PDF Downloads 413
5081 Thermal Effects of Phase Transitions of Cerium and Neodymium
Authors: M. Khundadze, V. Varazashvili, N. Lejava, R. Jorbenadze
Abstract:
Phase transitions of cerium and neodymium are investigated using a high-temperature scanning calorimeter (HT-1500, Setaram). For cerium, two transformations are detected: at 350-372 K, the transition from hexagonal close packing (hcp) to the face-centered cubic lattice (fcc), and at 880-960 K, the transformation of the face-centered cubic lattice (fcc) into the body-centered cubic lattice (bcc). For neodymium, the change from hexagonal close packing (hcp) into the body-centered cubic lattice (bcc) is detected at 1093-1113 K. The thermal characteristics of the transitions – enthalpy, entropy, and temperature domains – are reported.
Keywords: cerium, calorimetry, neodymium, enthalpy of phase transitions
Procedia PDF Downloads 369
5080 A Review on Water Models of Surface Water Environment
Authors: Shahbaz G. Hassan
Abstract:
Water quality models are very important for predicting changes in surface water quality for environmental management. The aim of this paper is to give an overview of water quality models and to provide directions for selecting a model in a specific situation. Water quality models include those based on a mechanistic approach, while other models simulate water quality without considering a mechanism. Mechanistic models can be widely applied and are capable of long-term simulation, although they are highly complex. Therefore, more space is devoted to explaining the principles and application experience of mechanistic models. Mechanistic models make certain assumptions about rivers, lakes, and estuaries, which limits their application range; this paper introduces the principles and applications of water quality models for these three scenarios. On the other hand, empirical models are easier to compute and are not limited by geographical conditions, but they cannot be used with confidence to simulate long-term changes. This paper divides the empirical models into two broad categories according to their mathematical algorithms: models based on artificial intelligence and models based on statistical methods.
Keywords: empirical models, mathematical, statistical, water quality
Procedia PDF Downloads 265
5079 Definition of a Computing Independent Model and Rules for Transformation Focused on the Model-View-Controller Architecture
Authors: Vanessa Matias Leite, Jandira Guenka Palma, Flávio Henrique de Oliveira
Abstract:
This paper presents a model-driven approach to software development based on the Model-View-Controller (MVC) architectural standard. The approach defines a process for extracting information from models, in which the rules and syntax defined in this work assist in the design of the initial model and its subsequent conversions. The paper presents a syntax based on natural language, following the rules of classical Portuguese grammar, together with conversion rules that generate models conforming to the norms of the Object Management Group (OMG) and the Meta-Object Facility (MOF).
Keywords: BNF syntax, model driven architecture, model-view-controller, transformation, UML
Procedia PDF Downloads 395
5078 Multivariate Analysis of the Relationship between Professional Burnout, Emotional Intelligence and Health Level in Teachers University of Guayaquil
Authors: Viloria Marin Hermes, Paredes Santiago Maritza, Viloria Paredes Jonathan
Abstract:
The aim of this study is to assess the prevalence of burnout syndrome in a sample of 600 professors at the University of Guayaquil (Ecuador) using the Maslach Burnout Inventory (M.B.I.). In addition, the effects of professional burnout on health were assessed using the General Health Questionnaire (G.H.Q.-28), and the influence of emotional intelligence on the prevention of its symptoms was assessed using the Spanish version of the Trait Meta-Mood Scale (T.M.M.S.-24). After confirmation of the underlying factor structure, the three measurement tools showed high levels of internal consistency, and specific cut-off points were proposed for the group of Latin American academics in the M.B.I. Statistical analysis showed the syndrome is extensively present, particularly at medium levels, with notably low scores for Professional Self-Esteem. The application of Canonical Correspondence Analysis revealed that low levels of self-esteem are related to depression, and a lack of personal resources is related to anxiety and insomnia, whereas the ability to perceive and control emotions and feelings improves perceptions of professional effectiveness and performance.
Keywords: burnout, academics, emotional intelligence, general health, canonical correspondence analysis
Procedia PDF Downloads 370
5077 A Novel Probabilistic Spatial Locality of Reference Technique for Automatic Cleansing of Digital Maps
Authors: A. Abdullah, S. Abushalmat, A. Bakshwain, A. Basuhail, A. Aslam
Abstract:
GIS (Geographic Information System) applications require geo-referenced data; this data may be available as databases or in the form of digital or hard-copy agro-meteorological maps. These parameter maps are color-coded, with different regions corresponding to different parameter values, and converting such maps into a database is not very difficult. However, the text and various planimetric elements overlaid on these maps make an accurate image-to-database conversion a challenging problem, because it is almost impossible to exactly restore what was underneath the text or icons; hence the need for inpainting. In this paper, we propose a probabilistic inpainting approach that uses the probability of spatial locality of colors in the map to replace overlaid elements with the underlying color. We tested the limits of the proposed technique using non-textual simulated data and compared the text-removal results with a popular image editing tool on public domain data, with promising results.
Keywords: noise, image, GIS, digital map, inpainting
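As a rough illustration of the spatial-locality idea (not the authors' algorithm), the sketch below fills each masked pixel of a label image with the most frequent colour in its local window; the toy map, window size and NumPy implementation are assumptions made for demonstration.

```python
# Minimal sketch of the idea: fill each masked pixel with the most frequent colour
# found in its local neighbourhood, assuming the underlying region colour dominates
# nearby unmasked pixels. This is an illustration, not the paper's method.
import numpy as np

def fill_masked(img, mask, window=5):
    """img: (H, W) array of colour labels; mask: True where text/icons overlay the map."""
    out = img.copy()
    half = window // 2
    h, w = img.shape
    for y, x in zip(*np.nonzero(mask)):
        y0, y1 = max(0, y - half), min(h, y + half + 1)
        x0, x1 = max(0, x - half), min(w, x + half + 1)
        neighbours = img[y0:y1, x0:x1][~mask[y0:y1, x0:x1]]
        if neighbours.size:                          # most probable local colour
            values, counts = np.unique(neighbours, return_counts=True)
            out[y, x] = values[np.argmax(counts)]
    return out

# toy map: two colour regions with a small "text" blob overlaid
img = np.zeros((10, 10), dtype=int); img[:, 5:] = 1
mask = np.zeros_like(img, dtype=bool); mask[4:6, 4:7] = True
print(fill_masked(img, mask))
```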
Procedia PDF Downloads 353
5076 Embodied Spiritualities and Emerging Search for Social Transformation: An Embodied Ethnographic Study of Yoga Practices in Medellin, Colombia
Authors: Lina M. Vidal
Abstract:
This paper discusses the involvement of yoga practices in both self-transformation and social transformation by means of an embodied ethnographic approach to different initiatives for social change in Medellín. In the context of the gradual popularization of embodied spiritualities, yoga practices have opened their way into calls for social change in a performative perspective that involves collective experiences, reflections, and the production of embodied knowledge. Through reflection on the bodily dimension and corporal experience, this ethnographic approach acknowledges inter-corporality and somatic modes of attention during observations and personal experiences. In social change initiatives that include yoga practices, transformations were identified in the common understanding of social issues as produced by institutionalized education, the health system, and other fields of knowledge. This is clearly visible in yoga projects for children in vulnerable conditions, homeless people, prisoners, and young people recovering from drug addiction. These projects are often promoted by organizations and networks, which incorporate individual life stories into collective experiences. The dissemination of yoga is heading toward a broad institutional and cultural legitimation of yoga and of spirituality, which impacts different fields of social work and everyday life in general. In this way, yoga is becoming an embodied activist way of life and a legitimate field for social work.
Keywords: embodied ethnography, Medellin, social transformation, embodied spiritualities, yoga practices
Procedia PDF Downloads 191
5075 A Research Agenda for Learner Models for Adaptive Educational Digital Learning Environments
Authors: Felix Böck
Abstract:
Nowadays, data about learners and their digital activities are collected, which could help educational institutions to better understand learning processes, improve them, and provide better learning assistance. In this research project, custom knowledge- and data-driven recommendation algorithms will be used to offer students in higher education integrated learning assistance. The prerequisite for this is a learner model that is as comprehensive as possible, which should first be created and then kept up to date largely automatically in order to individualize and personalize the learning experience. In order to create such a learner model, a roadmap is presented that describes the individual phases up to the creation and evaluation of the finished model. The methodological process for the research project is disclosed, and the research question of how learners can be supported in their learning with personalized, customized learning recommendations is explored.
Keywords: research agenda, user model, learner model, higher education, adaptive educational digital learning environments, personalized learning paths, recommendation system, adaptation, personalization
Procedia PDF Downloads 20
5074 Artificial Neural Network in Predicting the Soil Response in the Discrete Element Method Simulation
Authors: Zhaofeng Li, Jun Kang Chow, Yu-Hsing Wang
Abstract:
This paper attempts to bridge soil properties and the mechanical response of soil in discrete element method (DEM) simulations. An artificial neural network (ANN) was therefore adopted, aiming to reproduce the stress-strain-volumetric response when the soil properties are given. 31 biaxial shearing tests with varying soil parameters (e.g., initial void ratio and interparticle friction coefficient) were generated using DEM simulations. Based on these 45 sets of training data, a three-layer neural network was established that can output the entire stress-strain-volumetric curve during the shearing process from the input soil parameters. Beyond the training data, 2 additional sets of data were generated to examine the validity of the network, and the stress-strain-volumetric curves for both cases were well reproduced by this network. Overall, the ANN was found promising in predicting soil behavior and reducing repetitive simulation work.
Keywords: artificial neural network, discrete element method, soil properties, stress-strain-volumetric response
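A minimal sketch of the input-output mapping described above, assuming a scikit-learn multilayer perceptron and synthetic stand-in data; the true DEM results, network size and training settings of the paper are not reproduced.

```python
# Sketch of the mapping the paper describes: soil parameters in, a discretised
# stress-strain curve out. The "training data" here are a synthetic stand-in for
# the DEM biaxial-test results, which are not reproduced.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples, n_points = 40, 50

# inputs: [initial void ratio, interparticle friction coefficient]
X = np.column_stack([rng.uniform(0.6, 0.9, n_samples),
                     rng.uniform(0.3, 0.7, n_samples)])
strain = np.linspace(0, 0.1, n_points)
# toy hyperbolic stress-strain response whose shape depends on the inputs
y = np.array([(x[1] * strain) / (0.01 * x[0] + strain) for x in X])

model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
model.fit(X, y)

curve = model.predict([[0.75, 0.5]])[0]   # predicted stress-strain curve for new parameters
print(curve[:5])
```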
Procedia PDF Downloads 397
5073 Photo-Fenton Decolorization of Methylene Blue Adsolubilized on Co2+ -Embedded Alumina Surface: Comparison of Process Modeling through Response Surface Methodology and Artificial Neural Network
Authors: Prateeksha Mahamallik, Anjali Pal
Abstract:
In the present study, Co(II)-adsolubilized surfactant-modified alumina (SMA) was prepared, and methylene blue (MB) degradation was carried out on the Co-SMA surface by a visible-light photo-Fenton process. The entire reaction proceeded on the solid surface, as MB was embedded on the Co-SMA surface. The reaction followed zero-order kinetics. Response surface methodology (RSM) and an artificial neural network (ANN) were used to model the decolorization of MB by the photo-Fenton process as a function of the dose of Co-SMA (10, 20 and 30 g/L), initial concentration of MB (10, 20 and 30 mg/L), concentration of H2O2 (174.4, 348.8 and 523.2 mM), and reaction time (30, 45 and 60 min). The prediction capabilities of the two methodologies (RSM and ANN) were compared on the basis of the correlation coefficient (R2), root mean square error (RMSE), standard error of prediction (SEP), and relative percent deviation (RPD). Owing to its lower RMSE (1.27), SEP (2.06) and RPD (1.17) and higher R2 (0.9966), ANN proved to be more accurate than RSM in predicting decolorization efficiency.
Keywords: adsolubilization, artificial neural network, methylene blue, photo-Fenton process, response surface methodology
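The sketch below shows one way the four comparison metrics could be computed; R² and RMSE follow their standard definitions, while the SEP and RPD formulas are common textbook forms assumed here, since the abstract does not state them explicitly.

```python
# Sketch of the model-comparison metrics named in the abstract. R² and RMSE are
# standard; the SEP and RPD formulas below are common textbook forms and are an
# assumption here, since the abstract does not spell them out.
import numpy as np

def compare(y_obs, y_pred):
    y_obs, y_pred = np.asarray(y_obs, float), np.asarray(y_pred, float)
    resid = y_obs - y_pred
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean(resid ** 2))
    sep = 100.0 * rmse / y_obs.mean()                          # assumed: RMSE relative to the mean, in %
    rpd = 100.0 / len(y_obs) * np.sum(np.abs(resid) / y_obs)   # assumed: mean absolute % deviation
    return {"R2": r2, "RMSE": rmse, "SEP": sep, "RPD": rpd}

# e.g. decolourisation efficiencies (%) observed vs predicted by a fitted model
print(compare([62.1, 70.4, 81.0, 90.2], [63.0, 69.5, 80.2, 91.1]))
```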
Procedia PDF Downloads 255
5072 Visibility of the Borders of the Mandibular Canal: A Comparative in Vitro Study Using Digital Panoramic Radiography, Reformatted Panoramic Radiography and Cross Sectional Cone Beam Computed Tomography
Authors: Keerthilatha Pai, Sakshi Kamra
Abstract:
Objectives: Determining the position of the mandibular canal prior to implant placement and surgeries of the posterior mandible is important to avoid nerve injury. The visibility of the mandibular canal varies according to the imaging modality. Although panoramic radiography is the most common, cone beam computed tomography is slowly replacing it. This study was conducted with the aim of determining and comparing the visibility of the superior and inferior borders of the mandibular canal in digital panoramic radiographs, reformatted panoramic radiographs and cross-sectional images of cone beam computed tomography. Study design: Digital panoramic, reformatted panoramic and cross-sectional CBCT images of 25 human mandibles were evaluated for the visibility of the superior and inferior borders of the mandibular canal according to a 5-point scoring criterion. The canal was also classified as completely visible, partially visible or not visible. The mean scores and visibility percentages of all the imaging modalities were determined and compared. The interobserver and intraobserver agreement in the visualization of the superior and inferior borders of the mandibular canal were determined. Results: The superior and inferior borders of the mandibular canal were completely visible in 47% of the samples in digital panoramic, 63% in reformatted panoramic and 75.6% in CBCT cross-sectional images. The mandibular canal was invisible in 24% of samples in digital panoramic, 19% in reformatted panoramic and 2% in cross-sectional CBCT images. Maximum visibility was seen in Zone 5 and least visibility in Zone 1. On comparison of all the imaging modalities, CBCT cross-sectional images showed better visibility of the superior border in Zones 2, 3, 4 and 6 and of the inferior border in Zones 2, 3, 4 and 6. The difference was statistically significant. Conclusion: CBCT cross-sectional images were much superior in the visualization of the mandibular canal compared with reformatted and digital panoramic radiographs. The inferior border was better visualized than the superior border in digital panoramic imaging. The mandibular canal was most visible in the posterior one-third region of the mandible, and the visibility decreased towards the mental foramen.
Keywords: cone beam computed tomography, mandibular canal, reformatted panoramic radiograph, visualization
Procedia PDF Downloads 128
5071 Intrusion Detection and Prevention System (IDPS) in Cloud Computing Using Anomaly-Based and Signature-Based Detection Techniques
Authors: John Onyima, Ikechukwu Ezepue
Abstract:
Virtualization and cloud computing are among the fast-growing computing innovations in recent times. Organisations all over the world are moving their computing services towards the cloud because of its rapid transformation of the organization's infrastructure, more efficient resource utilization, and cost reduction. However, this technology brings new security threats and challenges concerning safety, reliability and data confidentiality. Evidently, no single security technique can guarantee security or protection against malicious attacks on a cloud computing network; hence an integrated model of an intrusion detection and prevention system has been proposed. Anomaly-based and signature-based detection techniques are integrated to enable the network and its hosts to defend themselves with some level of intelligence. The anomaly-based detection was implemented using the local deviation factor graph-based (LDFGB) algorithm, while the signature-based detection was implemented using the Snort algorithm. Results from these collaborative intrusion detection and prevention techniques show a robust and efficient security architecture for cloud computing networks.
Keywords: anomaly-based detection, cloud computing, intrusion detection, intrusion prevention, signature-based detection
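A toy sketch of how the two detection styles can be combined in one pipeline. Note the substitutions: scikit-learn's LocalOutlierFactor stands in for the LDFGB anomaly detector, and a plain substring lookup stands in for Snort's rule engine; the flow features and signatures are invented for the example and are not the authors' implementation.

```python
# Toy sketch of combining the two detection styles. LocalOutlierFactor stands in
# for the LDFGB anomaly detector and a substring lookup stands in for Snort's
# signature engine; neither is the authors' actual implementation.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

SIGNATURES = [b"/etc/passwd", b"' OR 1=1", b"<script>"]  # illustrative patterns only

def signature_alerts(payload: bytes):
    return [sig for sig in SIGNATURES if sig in payload]

# anomaly detection on simple per-flow features (e.g. bytes sent, packet count)
normal_flows = np.random.default_rng(1).normal(loc=[500, 20], scale=[50, 3], size=(200, 2))
lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(normal_flows)

new_flow = np.array([[5000, 400]])            # an unusually large flow
print("anomalous:", lof.predict(new_flow)[0] == -1)
print("signature hits:", signature_alerts(b"GET /index.php?q=' OR 1=1"))
```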
Procedia PDF Downloads 308
5070 Design of an Artificial Oil Body-Cyanogen Bromide Technology Platform for the Expression of Small Bioactive Peptide, Mastoparan B
Authors: Tzyy-Rong Jinn, Sheng-Kuo Hsieh, Yi-Ching Chung, Feng-Chia Hsieh
Abstract:
In this study, we attempted to develop a recombinant oleosin-based fusion expression strategy in Escherichia coli (E. coli), coupled with the artificial oil bodies (AOB)-cyanogen bromide technology platform, to produce bioactive mastoparan B (MP-B). As reported, the oleosin in the AOB system acts as a carrier (fused with the target protein): oleosin possesses two amphipathic regions (at the N-terminus and C-terminus), so both termini of oleosin can be arranged on the surface of the AOB. Thus, a target protein fused to the N-terminus or C-terminus of oleosin is also exposed on the AOB surface, which greatly facilitates the subsequent separation and purification of the target protein from the AOB. In addition, oleosin, a unique structural protein of seed oil bodies, has the added advantage of helping the fused MP-B to be expressed in inclusion bodies, which protects it from proteolytic degradation. In this work, MP-B was fused to the C-terminus of oleosin and expressed in E. coli as an insoluble recombinant protein. As a consequence, we successfully developed a reliable recombinant oleosin-based fusion expression strategy in Escherichia coli coupled with the artificial oil bodies (AOB)-cyanogen bromide technology platform to produce the small peptide MP-B. Taken together, this platform provides insight into the production of active MP-B, which will facilitate studies and applications of this peptide in the future.
Keywords: artificial oil bodies, Escherichia coli, oleosin-fusion protein, mastoparan-B
Procedia PDF Downloads 453
5069 The Impact of Artificial Intelligence on Construction Engineering
Authors: Mina Fawzy Ishak Gad Elsaid
Abstract:
There is a strong link between technology and development. Architecture as a profession is a call to service and society, perhaps second only to that of soldiers and patriots. However, unlike soldiers, engineers always remain employees of society under all circumstances. Despite the construction profession's role in society, there appears to be a lack of respect, as some projects fail. This paper focuses on the need to improve development engineering performance in developing countries, using engineering education in Nigerian universities as a basis for discussion. A purposive survey, interviews and focus group discussions were conducted with one hundred and twenty (120) prominent companies in Nigeria. The subject is approached through a large number of projects that the companies have been involved in from the planning stage, some of which have been completed and have even reached the maintenance and monitoring stage. It was found that certain factors beyond the control of engineers are hindering the full development and success of the construction sector in developing countries. The main culprit is corruption, and its eradication will put the country on a stable path to developing construction and combating poverty.
Keywords: decision analysis, industrial engineering, direct vs. indirect values, engineering management
Procedia PDF Downloads 46
5068 The Impact of Artificial Intelligence on Construction Engineering
Authors: Haneen Joseph Habib Yeldoka
Abstract:
There is a strong link between technology and development. Architecture as a profession is a call to service and society, perhaps second only to that of soldiers and patriots. However, unlike soldiers, engineers always remain employees of society under all circumstances. Despite the construction profession's role in society, there appears to be a lack of respect, as some projects fail. This paper focuses on the need to improve development engineering performance in developing countries, using engineering education in Nigerian universities as a basis for discussion. A purposive survey, interviews and focus group discussions were conducted with one hundred and twenty (120) prominent companies in Nigeria. The subject is approached through a large number of projects that the companies have been involved in from the planning stage, some of which have been completed and have even reached the maintenance and monitoring stage. It was found that certain factors beyond the control of engineers are hindering the full development and success of the construction sector in developing countries. The main culprit is corruption, and its eradication will put the country on a stable path to developing construction and combating poverty.
Keywords: decision analysis, industrial engineering, direct vs. indirect values, engineering management
Procedia PDF Downloads 42
5067 Using Photogrammetry to Survey the Côa Valley Iron Age Rock Art Motifs: Vermelhosa Panel 3 Case Study
Authors: Natália Botica, Luís Luís, Paulo Bernardes
Abstract:
The Côa Valley, listed as a World Heritage site since 1998, contains more than 1,300 open-air engraved rock panels. The Archaeological Park of the Côa Valley has recorded the rock art motifs, testing various techniques based on direct tracing processes on the rock, using natural and artificial lighting. In this work, part of the "Open Access Rock Art Repository" (RARAA) project, we present the methodology adopted for the vector drawing of the rock art motifs based on orthophotos taken from the photogrammetric survey and 3D models of the rocks. We also present the information system designed to integrate the vector drawings and the characterization data of the motifs, as well as their open access sharing, in order to promote reuse in multiple areas. The 3D models themselves constitute a very detailed record, ensuring the digital preservation of the rock and its iconography. Thus, even if a rock or motif disappears, it can continue to be studied and even recreated.
Keywords: rock art, archaeology, iron age, 3D models
Procedia PDF Downloads 84
5066 An Alternative Credit Scoring System in China’s Consumer Lending Market: A System Based on Digital Footprint Data
Authors: Minjuan Sun
Abstract:
Ever since the late 1990s, China has experienced explosive growth in consumer lending, especially in short-term consumer loans, within which the growth rate of non-bank lending has surpassed that of bank lending due to developments in financial technology. On the other hand, China does not have a universal credit scoring and registration system that can guide lenders during credit evaluation and risk control; for example, an individual's bank credit records are not available to online lenders, and vice versa. Given this context, the purpose of this paper is three-fold. First, we explore if and how alternative digital footprint data can be utilized to assess a borrower's creditworthiness. Then, we perform a comparative analysis of machine learning methods for the canonical problem of credit default prediction. Finally, we analyze, from an institutional point of view, the necessity of establishing a viable and nationally universal credit registration and scoring system utilizing online digital footprints, so that more people in China can have better access to the consumer loan market. Two different types of digital footprint data are utilized and matched with a bank's loan default records. Each separately captures distinct dimensions of a person's characteristics, such as shopping patterns and certain aspects of personality or inferred demographics revealed by social media features like profile image and nickname. We find that both datasets can generate acceptable or excellent prediction results, and that different types of data tend to complement each other to achieve better performance. Typically, the traditional types of data that banks normally use, like income, occupation, and credit history, update over longer cycles and hence cannot reflect more immediate changes, such as financial status changes caused by a business crisis; whereas digital footprints can update daily, weekly, or monthly, and are thus capable of providing a more comprehensive profile of the borrower's credit capabilities and risks. From the empirical and quantitative examination, we believe digital footprints can become an alternative information source for creditworthiness assessment because of their near-universal data coverage and because they can by and large resolve the "thin-file" issue, given that digital footprints come in much larger volume and at higher frequency.
Keywords: credit score, digital footprint, fintech, machine learning
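Purely as an illustration of the benchmarking step, the sketch below compares two common default-prediction models by AUC on synthetic features that stand in for digital-footprint variables; the models, metric and data are assumptions made for the example and not the paper's actual experiments.

```python
# Illustrative sketch of the comparative-analysis step: two common default-prediction
# models compared by AUC on synthetic features standing in for digital-footprint
# variables (the real shopping and social-media data are not reproduced).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=2000, n_features=12, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

for model in (LogisticRegression(max_iter=1000), GradientBoostingClassifier(random_state=0)):
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(type(model).__name__, "AUC:", round(auc, 3))
```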
Procedia PDF Downloads 165
5065 Ripple Effect Analysis of Government Investment for Research and Development by the Artificial Neural Networks
Authors: Hwayeon Song
Abstract:
The long-term purpose of research and development (R&D) programs is to strengthen national competitiveness by developing new knowledge and technologies. Thus, it is important to determine a proper budget for government programs to maintain the vigor of R&D when total funding is tight due to the national deficit. In this regard, a ripple effect analysis of budgetary changes in R&D programs is necessary, as well as an investigation of their current status. This study proposes a new approach using Artificial Neural Networks (ANN) for both tasks. It particularly focuses on R&D programs related to Construction and Transportation (C&T) technology in Korea. First, key factors in C&T technology are explored to derive impact indicators in three areas: economy, society, and science and technology (S&T). Simultaneously, an ANN is employed to evaluate the relationships between data variables. From this process, four major components of R&D, including research personnel, expenses, management, and equipment, are assessed. Then the ripple effect analysis is performed to see the changes in a hypothetical future by modifying the current data. The research findings can offer an alternative strategy for R&D programs as well as a new analysis tool.
Keywords: artificial neural networks, construction and transportation technology, government research and development, ripple effect
Procedia PDF Downloads 249
5064 Web-Based Decision Support Systems and Intelligent Decision-Making: A Systematic Analysis
Authors: Serhat Tüzün, Tufan Demirel
Abstract:
Decision Support Systems (DSS) have been investigated by researchers and technologists for more than 35 years. This paper analyses the developments in the architecture and software of these systems, provides a systematic analysis of different Web-based DSS approaches and Intelligent Decision-making Technologies (IDT), and offers suggestions for future studies. The Decision Support Systems literature begins with the building of model-oriented DSS in the late 1960s, theory developments in the 1970s, and the implementation of financial planning systems and Group DSS in the early and mid-1980s. It then documents the origins of Executive Information Systems, online analytical processing (OLAP) and Business Intelligence. The implementation of Web-based DSS occurred in the mid-1990s. Since the beginning of the new millennium, intelligence has been the main focus of DSS studies. Web-based technologies are having a major impact on the design, development and implementation processes for all types of DSS. Web technologies are being utilized for the development of DSS tools by leading developers of decision support technologies. Major companies are encouraging their customers to port their DSS applications, such as data mining, customer relationship management (CRM) and OLAP systems, to a web-based environment. Similarly, real-time data fed from manufacturing plants are now helping floor managers make decisions regarding production adjustments to ensure that high-quality products are produced and delivered. Web-based DSS are being employed by organizations as decision aids for employees as well as customers. A common use of Web-based DSS has been to assist customers in configuring products and services according to their needs. These systems allow individual customers to design their own products by choosing from a menu of attributes, components, prices and delivery options. The Intelligent Decision-making Technologies (IDT) domain is a fast-growing area of research that integrates various aspects of computer science and information systems. This includes intelligent systems, intelligent technology, intelligent agents, artificial intelligence, fuzzy logic, neural networks, machine learning, knowledge discovery, computational intelligence, data science, big data analytics, inference engines, recommender systems or engines, and a variety of related disciplines. Innovative applications that emerge using IDT often have a significant impact on decision-making processes in government, industry, business, and academia in general. This is particularly pronounced in finance, accounting, healthcare, computer networks, real-time safety monitoring and crisis response systems. Similarly, IDT is commonly used in military decision-making systems, security, marketing, stock market prediction, and robotics. Even though many research studies have been conducted on Decision Support Systems, a systematic analysis of the subject is still missing. Because of this necessity, this paper reviews recent articles on DSS. The literature has been reviewed in depth, and by classifying previous studies according to their preferences, a taxonomy for DSS has been prepared. With the aid of the taxonomic review and recent developments in the field, this study aims to analyze future trends in decision support systems.
Keywords: decision support systems, intelligent decision-making, systematic analysis, taxonomic review
Procedia PDF Downloads 281
5063 Low Power Glitch Free Dual Output Coarse Digitally Controlled Delay Lines
Authors: K. Shaji Mon, P. R. John Sreenidhi
Abstract:
In deep-submicrometer CMOS processes, the time-domain resolution of a digital signal is becoming higher than the voltage resolution of analog signals. This observation is nowadays pushing toward a new circuit design paradigm in which traditional analog signal processing is expected to be progressively substituted by the processing of times in the digital domain. Within this novel paradigm, digitally controlled delay lines (DCDL) should play the role of digital-to-analog converters in traditional, analog-intensive circuits. Digital delay-locked loops are highly prevalent in integrated systems. The proposed paper addresses the glitches present in delay circuits along with area, power dissipation and signal integrity. The digitally controlled delay lines (DCDL) under study have been designed in a 90 nm CMOS technology with six metal layers, copper interconnect, strained SiGe and a low-k dielectric. Simulation and synthesis results show that the novel dual-output coarse DCDL exhibits no glitches, dissipates less power and occupies less area compared to the glitch-free NAND-based DCDL.
Keywords: glitch free, NAND-based DCDL, CMOS, deep-submicrometer
Procedia PDF Downloads 245
5062 The Quantum Theory of Music and Human Languages
Authors: Mballa Abanda Luc Aurelien Serge, Henda Gnakate Biba, Kuate Guemo Romaric, Akono Rufine Nicole, Zabotom Yaya Fadel Biba, Petfiang Sidonie, Bella Suzane Jenifer
Abstract:
The main hypotheses proposed around the definition of the syllable and of music, and around the common origin of music and language, should lead the reader to reflect on the cross-cutting questions raised by the debate on the notion of universals in linguistics and musicology. These are objects of controversy, and therein lies the interest: the debate raises questions that are at the heart of theories of language. This is an inventive, original, and innovative research thesis. It contributes to the theoretical, musicological, ethnomusicological, and linguistic conceptualization of languages, giving rise to the practice of interlocution between the social and cognitive sciences, the activities of artistic creation, and the question of modeling in the human sciences: mathematics, computer science, automated translation, and artificial intelligence. When this theory is applied to the text of a folk song in any of the world's tone languages, one not only pieces together the exact melody, rhythm, and harmonies of that song as if it were known in advance, but also the exact speech of that language. The author believes that the issue of the disappearance of tonal languages and their preservation has been structurally resolved, as has one of the greatest cultural equations related to the composition and creation of tonal, polytonal, and random music. With the experimentation confirming the theorization, I designed a semi-digital, semi-analog application that translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and atonal music, and deterministic and random music. To test this application, I use music reading and writing software that allows me to collect the data extracted from my mother tongue, which is already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). The translation is done from writing to writing, from writing to speech, and from writing to music. Mode of operation: you type a text on your computer, a structured song (chorus-verse), and you ask the machine for a melody in blues, jazz, world music or variety, etc. The software runs, giving you the option to choose harmonies, and then you select your melody.
Keywords: language, music, sciences, quantum entanglement
Procedia PDF Downloads 79
5061 An Examination of the Relationship between the Five Stages of the Yogacara Path to Enlightenment and the Ten Ox-Herding Pictures
Authors: Kyungbong Kim
Abstract:
This study proposes to compare and analyse the five stages of cultivating the Yogâcāra path and the spiritual journey depicted in the Ten Ox-Herding Pictures. To achieve this, the study investigated the core concepts and practice methods of the two approaches and analysed their relations based on the literature reviewed. The results showed that the end goal of the two approaches is the same, the attainment of Buddhahood, and that the two share common characteristics, including the practice of being aware of the impermanent and of non-self, and the benefiting of sentient beings. The results suggest that our Buddhist practice system needs to sincerely consider realistic ways in which one can help people in agony in contemporary society, not by emphasizing enlightenment through a single specific practice for all people, but through tailored practice methods based on each person's faculties in understanding Buddhism.
Keywords: transformation of consciousness to wisdom, enlightenment, the five stages of cultivating the Yogacāra path, the Ten Ox-Herding Pictures, transformation of the basis
Procedia PDF Downloads 267
5060 Digital Twin Smart Hospital: A Guide for Implementation and Improvements
Authors: Enido Fabiano de Ramos, Ieda Kanashiro Makiya, Francisco I. Giocondo Cesar
Abstract:
This study investigates the application of Digital Twins (DT) in Smart Hospital Environments (SHE) through a bibliometric study and literature review, including a comparison with the principles of Industry 4.0. It aims to analyze the current state of the implementation of digital twins in clinical and non-clinical operations in healthcare settings, identifying trends and challenges and comparing these practices with Industry 4.0 concepts and technologies, in order to present a basic framework including stages and maturity levels. The bibliometric methodology allows the existing scientific production on the theme to be mapped, while the literature review synthesizes and critically analyzes the relevant studies, highlighting pertinent methodologies and results; additionally, the comparison with Industry 4.0 provides insights into how the principles of automation, interconnectivity and digitalization can be applied in healthcare environments and operations, aiming at improvements in operational efficiency and quality of care. The results of this study will contribute to a deeper understanding of the potential of Digital Twins in Smart Hospitals, as well as of the future potential of effectively integrating Industry 4.0 concepts in this specific environment, presented through a practical framework. After all, the urgent need for the changes addressed in this article is undeniable, as is their contribution to human sustainability as envisioned in SDG 3 – Good health and well-being: ensuring that all citizens have a healthy life and well-being, at all ages and in all situations. We know that the validity of these relationships will be constantly discussed, and technology can always change the rules of the game.
Keywords: digital twin, smart hospital, healthcare operations, industry 4.0, SDG3, technology
Procedia PDF Downloads 55
5059 Density Measurement of Underexpanded Jet Using Stripe Patterned Background Oriented Schlieren Method
Authors: Shinsuke Udagawa, Masato Yamagishi, Masanori Ota
Abstract:
The Schlieren method, which has been conventionally used to visualize high-speed flows, has disadvantages such as the complexity of the experimental setup and the inability to quantitatively analyze the amount of refraction of light. The Background Oriented Schlieren (BOS) method proposed by Meier is one of the measurement methods that solves the problems mentioned above. The refraction of light is used in the BOS method, as in the Schlieren method. The BOS method is characterized by the use of a digital camera to capture images of the background behind the observation area. The images are later analyzed by a computer to quantitatively detect the amount of shift of the background image. The experimental setup for BOS does not require the concave mirrors, pinholes, or color filters that are necessary in the conventional Schlieren method, thus simplifying the setup. However, the BOS method causes defocusing of the observation results, since focusing the camera on the background leads to defocusing of the observed object. The defocusing of the object becomes greater with increasing distance between the background and the object; on the other hand, higher sensitivity is obtained. Therefore, it is necessary to adjust the distance between the background and the object appropriately for the experiment, considering the relation between defocus and sensitivity. The purpose of this study is to experimentally clarify the effect of defocus on density field reconstruction. In this study, visualization experiments of an underexpanded jet were performed using the BOS measurement system we constructed, with a Ronchi ruling as the background. The reservoir pressure of the jet and the distance between the camera and the jet axis were fixed, and the distance between the background and the jet axis was varied as the parameter. The images were later analyzed on a personal computer to quantitatively detect the amount of shift of the background image by comparing the background pattern with the captured image of the underexpanded jet. The quantitatively measured shifts were reconstructed into a density field using the Abel transformation and the Gladstone-Dale equation. From the experimental results, it was found that the reconstructed density image becomes more blurred, and the noise decreases, with increasing distance between the background and the axis of the underexpanded jet. Consequently, it is clarified that the sensitivity constant should be greater than 20, and the circle of confusion diameter should be less than 2.7 mm, at least in this experimental setup.
Keywords: BOS method, underexpanded jet, Abel transformation, density field visualization
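For reference, the standard forms of the relations named above are sketched below; the proportionality constants and geometry factors depend on the particular optical setup and are omitted, so the expressions are indicative rather than the authors' exact formulation.

```latex
% Standard forms of the relations used in the reconstruction (setup-dependent
% constants and geometry factors omitted).
\begin{align}
  n - 1 &= K\,\rho
  && \text{Gladstone--Dale relation: refractive index } n \text{ from density } \rho\\
  \varepsilon_y &\propto \int \frac{\partial n}{\partial y}\, \mathrm{d}z
  && \text{line-of-sight deflection recovered from the background image shift}\\
  f(r) &= -\frac{1}{\pi} \int_{r}^{R} \frac{\mathrm{d}F/\mathrm{d}y}{\sqrt{y^{2} - r^{2}}}\, \mathrm{d}y
  && \text{Abel inversion of the projected data } F(y) \text{ for an axisymmetric jet}
\end{align}
```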
Procedia PDF Downloads 80
5058 A Method for Reduction of Association Rules in Data Mining
Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa
Abstract:
The use of association rule algorithms within data mining is recognized as being of great value in knowledge discovery in databases. Very often, the number of rules generated is high, sometimes even in databases of small volume, so the success of the analysis of results can be hampered by this quantity. The purpose of this research is to present a method for reducing the quantity of rules generated by association algorithms. To this end, a computational algorithm was developed using the Weka Application Programming Interface, which allows the method to be executed on different types of databases. After development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, with the worst case presenting a gain of more than 50%, considering the concepts of support, confidence, and lift as measures. This study concludes that the proposed model is feasible and quite interesting, contributing to the analysis of the results of association rules generated by these algorithms.
Keywords: data mining, association rules, rules reduction, artificial intelligence
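For reference, the three interestingness measures named above have the standard definitions below, for a rule X ⇒ Y over a transaction database D; they are the usual textbook forms, not a restatement of the authors' reduction criteria.

```latex
% Standard definitions of the measures used to rank and prune generated rules.
\begin{align}
  \mathrm{supp}(X \Rightarrow Y) &= \frac{\lvert\{\, t \in D : X \cup Y \subseteq t \,\}\rvert}{\lvert D \rvert}\\
  \mathrm{conf}(X \Rightarrow Y) &= \frac{\mathrm{supp}(X \cup Y)}{\mathrm{supp}(X)}\\
  \mathrm{lift}(X \Rightarrow Y) &= \frac{\mathrm{conf}(X \Rightarrow Y)}{\mathrm{supp}(Y)}
\end{align}
```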
Procedia PDF Downloads 162
5057 Curvature Based-Methods for Automatic Coarse and Fine Registration in Dimensional Metrology
Authors: Rindra Rantoson, Hichem Nouira, Nabil Anwer, Charyar Mehdi-Souzani
Abstract:
Multiple measurements by means of various data acquisition systems are generally required to measure the shape of freeform workpieces for accuracy, reliability and holisticity. The obtained data are aligned and fused into a common coordinate system by a registration technique involving coarse and fine registration. Standardized iterative methods such as Iterative Closest Point (ICP) and its variants have been established for fine registration. For coarse registration, no conventional method has yet been adopted, despite the significant number of techniques developed in the literature to supply an automatic rough matching between data sets. Two main issues are addressed in this paper: coarse registration and fine registration. For coarse registration, two novel automated methods based on the exploitation of discrete curvatures are presented: an enhanced Hough Transformation (HT) and an improved RANSAC Transformation. The use of curvature features in both methods aims to reduce computational cost. For fine registration, a new variant of the ICP method is proposed in order to reduce registration error using curvature parameters. A specific distance considering curvature similarity has been combined with the Euclidean distance to define the distance criterion used for correspondence searching. Additionally, the objective function has been improved by combining point-to-point (P-P) minimization and point-to-plane (P-Pl) minimization with automatic weights. These weights are determined from the curvature features previously calculated at each point of the workpiece surface. The algorithms are applied to simulated data and to real data acquired by a computed tomography (CT) system. The obtained results reveal the benefit of the proposed novel curvature-based registration methods.
Keywords: discrete curvature, RANSAC transformation, Hough transformation, coarse registration, ICP variant, point-to-point and point-to-plane minimization combination, computed tomography
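One plausible form of the combined objective described above is sketched below; the exact curvature-based weighting scheme is the authors' and is not spelled out in the abstract, so the weights w_i here are only indicative.

```latex
% One plausible form of the combined objective: each correspondence (p_i, q_i),
% with target normal n_i, mixes the point-to-point and point-to-plane terms through
% a weight w_i derived from curvature similarity (the exact weighting is the
% authors'; this form is an assumption for illustration).
\begin{equation}
  E(R, t) = \sum_{i} \Big[\, w_i \,\lVert R\,p_i + t - q_i \rVert^{2}
          + (1 - w_i)\,\big( (R\,p_i + t - q_i) \cdot n_i \big)^{2} \Big],
  \qquad 0 \le w_i \le 1
\end{equation}
```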
Procedia PDF Downloads 425
5056 Traditional Drawing, BIM and Erudite Design Process
Authors: Maryam Kalkatechi
Abstract:
Nowadays, parametric design, scientific analysis, and digital fabrication are dominant. Many architectural practices are increasingly seeking to incorporate advanced digital software and fabrication in their projects. The erudite design process proposed here, which combines digital and practical aspects within a strong methodological frame, resulted from the dissertation research. The digital aspects are the progressive advancements in algorithmic design and simulation software. These aspects have assisted firms in developing more holistic concepts at the early stage and in maintaining collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to implement construction and architecture knowledge within the algorithm to make design processes successful. The erudite design process also involves the ongoing improvement of applying the new method of 3D printing in construction. This is achieved through the 'data-sketches'. The term 'data-sketch' was developed by the author in the recently completed dissertation; it accommodates the decisions of the architect within the algorithm. This paper introduces the erudite design process and its components and summarizes the application of this process in the development of the '3D printed construction unit'. This paper contributes to overlaying academia and practice with advanced technology by presenting a design process that transfers the dominance of the tool to the learned architect and encourages innovation in design processes.
Keywords: erudite, data-sketch, algorithm design in architecture, design process
Procedia PDF Downloads 277
5055 Net Regularity and Its Ethical Implications on Internet Stake Holders
Authors: Nourhan Elshenawi
Abstract:
Net Neutrality (NN) is the principle of treating all online data the same, without any prioritization of some over others. A research gap in current scholarship about violations of NN and the subsequent ethical concerns paves the way for the following research question: To what extent do violations of NN entail ethical concerns and implications for Internet stakeholders? To answer this question, such violations – termed here Net Regularity (NR) – are examined using the two major action-based ethical theories, Kantian and Utilitarian, across the relevant Internet stakeholders. First, some necessary IT background is provided on how the Internet works and who the key stakeholders are. Following the IT background, the relationship between the stakeholders – users, Internet Service Providers (ISPs) and content providers – is discussed and illustrated. Then some violations of NN that are currently occurring, without attracting any attention from the general public from an ethical perspective, are covered under the new term Net Regularity (NR). Afterwards, the current scholarship on NN and its violations is discussed; it is written mainly from economic and sociopolitical perspectives, which highlights the lack of ethical discussion on the issue. Before moving on to the ethical analysis, however, websites are presented as digital entities that are affected by NR, and their happiness is measured using functionalism. The analysis concludes that NR is prone to an unethical treatment of Internet stakeholders from the perspective of both theories. Finally, the current digital divide in the world is presented in order to better illustrate the implications of NR. The implications point to a new Internet divide that will take place between individuals within society. By answering the research question through ethical analysis, the paper attempts to shed some light on the issue of NR and the kind of society it would lead to. NR would lead not just to a divided society, but to divided individuals separated by something greater than distance: the Internet.
Keywords: digital divide, digital entities, digital ontology, internet ethics, internet law, net neutrality, internet service providers, websites as beings
Procedia PDF Downloads 276
5054 Conservation and Development of Rural Everyday Landscapes in the Context of Modernization and Transformation
Authors: Xie Weifan, Wang Zhongde
Abstract:
The everyday landscape of the countryside has long played an important role as a cultural representation of the countryside and as a link between the countryside and social relations. In the course of modernization, the daily landscape of the countryside needs to change along with the transformation of daily life there; therefore, interpreting the daily landscape and understanding its basic characteristics and value perception from the villagers' perspective can help in understanding the daily landscape of the countryside and its conservation and development. Taking Lizi Village in Qianjiang District, Chongqing Municipality, China, as a case study, we collected the daily landscapes important in villagers' perceptions through in-depth interviews, categorized them into personal living space, public affairs space, and public activity space, and analyzed the characteristics of their spatial distribution. The villagers' perceptions are divided into four major types, namely physical environment perception, atmosphere and culture perception, emotional feelings, and behavioral preferences, and each is analyzed to understand the important characteristics of villagers' perceptions of daily landscapes. Finally, it is proposed that the protection and development of the daily landscape in villages requires improving the mechanisms for discovering and evaluating the daily landscape, encouraging residents to participate in its construction, protecting high-value daily landscapes, and promoting their innovative development.
Keywords: rural landscape, everyday landscape, landscape perception, conservation and development
Procedia PDF Downloads 30
5053 Transformer Design Optimization Using Artificial Intelligence Techniques
Authors: Zakir Husain
Abstract:
The main objective of a power transformer design optimization problem is to minimize the total overall cost and/or the mass of the winding and core material while satisfying all constraints imposed by the standards and by transformer user requirements. The constraints include appropriate limits on winding fill factor, temperature rise, efficiency, no-load current and voltage regulation. The design optimization task is a constrained minimum-cost and/or minimum-mass solution obtained by optimally setting the parameters, geometry and required magnetic properties of the transformer. In this paper, the above design problems are formulated using a genetic algorithm (GA) and simulated annealing (SA) on the MATLAB platform. The importance of the presented approach stems from two main features. First, the proposed technique provides a reliable and efficient solution to the problem of design optimization with several variables. Second, the obtained solution is guaranteed to be the global optimum. This paper includes a demonstration of the application of the genetic programming (GP) technique to transformer design.
Keywords: optimization, power transformer, genetic algorithm (GA), simulated annealing technique (SA)
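As a hedged illustration of the GA half of the formulation, the sketch below minimizes a toy penalized cost over two placeholder design variables; the variables, cost surface and constraint are invented for the example and do not represent a real transformer model or the authors' MATLAB implementation.

```python
# Minimal GA sketch of the constrained-minimisation set-up described above.
# The two design variables, the cost surface and the "temperature rise" constraint
# are invented placeholders, not a real transformer model.
import numpy as np

rng = np.random.default_rng(0)
LOW, HIGH = np.array([0.1, 0.1]), np.array([2.0, 2.0])   # e.g. core area, winding area (p.u.)

def cost(x):
    material = 3.0 * x[0] + 2.0 * x[1]           # toy material cost
    temp_rise = 1.0 / (x[0] * x[1])              # placeholder constraint quantity
    penalty = 100.0 * max(0.0, temp_rise - 5.0)  # penalize constraint violation
    return material + penalty

pop = rng.uniform(LOW, HIGH, size=(40, 2))
for _ in range(100):
    fitness = np.array([cost(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:20]]               # truncation selection
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(20, size=2)]
        child = np.where(rng.random(2) < 0.5, a, b)       # uniform crossover
        child += rng.normal(0, 0.05, 2)                   # Gaussian mutation
        children.append(np.clip(child, LOW, HIGH))
    pop = np.vstack([parents, children])

best = pop[np.argmin([cost(ind) for ind in pop])]
print("best design:", best, "cost:", round(cost(best), 3))
```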
Procedia PDF Downloads 584