Search results for: hybrid genetic algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4833


1863 Application of a Hybrid QFD-FEA Methodology for Nigerian Garment Designs

Authors: Adepeju A. Opaleye, Adekunle Kolawole, Muyiwa A. Opaleye

Abstract:

Consumers’ perceived quality of imported products has been an impediment to business in the Nigerian garment industry. To improve patronage of made-in-Nigeria designs, the first step is to understand what the consumer expects and then proffer ways to meet this expectation through product redesign or improvement of the garment production process. The purpose of this study is to investigate the drivers of consumers’ value for typical Nigerian garment design (NGD). An integrated quality function deployment (QFD) and functional, expressive, and aesthetic (FEA) consumer needs methodology helps to minimize incorrect understanding of potential consumers’ requirements in mass-customized garments. Six themes emerged as drivers of consumer satisfaction: (1) style variety, (2) dimensions, (3) finishing, (4) fabric quality, (5) garment durability, and (6) aesthetics. Existing designs were found to lead foreign designs in terms of acceptance for informal events, style variety, and fit; the latter may be linked to their mode of acquisition. A conceptual model of NGD acceptance in the context of consumers’ inherent characteristics and the social and business environment is proposed.
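As an illustration of the QFD step described above, the following sketch shows how customer-importance weights and a relationship matrix combine into technical priorities. The six themes come from the abstract, but the weights, technical characteristics, and relationship scores are invented for illustration and are not the study's data.

```python
# Minimal QFD prioritization sketch: illustrative weights only, not the study's data.
import numpy as np

# Six consumer-need themes identified in the abstract, with assumed importance weights (1-5).
themes = ["style variety", "dimensions", "finishing", "fabric quality", "durability", "aesthetics"]
importance = np.array([5, 4, 3, 4, 3, 4])

# Hypothetical technical characteristics of the garment design/production process.
tech = ["pattern library", "size grading", "seam quality control", "fabric sourcing"]

# Relationship matrix (rows = themes, cols = technical characteristics), 0/1/3/9 QFD convention.
R = np.array([
    [9, 1, 0, 1],
    [1, 9, 1, 0],
    [0, 1, 9, 1],
    [0, 0, 1, 9],
    [0, 0, 3, 9],
    [3, 1, 3, 3],
])

# Absolute and relative technical importance: which characteristics to improve first.
absolute = importance @ R
relative = absolute / absolute.sum()
for t, a, r in zip(tech, absolute, relative):
    print(f"{t:22s} {a:4d}  {r:.1%}")
```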

Keywords: perceived quality, garment design, quality function deployment, FEA model, mass customisation

Procedia PDF Downloads 123
1862 A Fresh Approach to Learn Evidence-Based Practice, a Prospective Interventional Study

Authors: Ebtehal Qulisy, Geoffrey Dougherty, Kholoud Hothan, Mylene Dandavino

Abstract:

Background: For more than 200 years, journal clubs (JCs) have been used to teach the fundamentals of critical appraisal and evidence-based practice (EBP). However, JC curricula face important challenges, including poor sustainability, insufficient time to prepare for and conduct the activities, and lack of trainee skills and self-efficacy with critical appraisal. Andragogy principles and modern technology could help EBP be taught in more relevant, modern, and interactive ways. Method: We propose a fresh educational activity to teach EBP. Educational sessions are designed to encourage collaborative and experiential learning and do not require advanced preparation by the participants. Each session lasts 60 minutes and is adaptable to in-person, virtual, or hybrid contexts. Sessions are structured around a worksheet and include three educational objectives: “1. Identify a Clinical Conundrum”, “2. Compare and Contrast Current Guidelines”, and “3. Choose a Recent Journal Article”. Sessions begin with a short presentation by a facilitator of a clinical scenario highlighting a “grey zone” in pediatrics. Trainees are placed in groups of two to four (depending on the number of participants) of varied training levels. The first task requires the identification of a clinical conundrum (a situation where there is no clear answer but only a reasonable solution) related to the scenario. For the second task, trainees must identify two or three clinical guidelines. The last task requires trainees to find a journal article published in the last year that reports an update regarding the scenario’s topic. Participants are allowed to use their electronic devices throughout the session. Our university provides full-text access to major journals, which facilitated this exercise. Results: Participants were a convenience sample of trainees in the inpatient services at the Montréal Children’s Hospital, McGill University. Sessions were conducted as part of an existing weekly academic activity and facilitated by pediatricians with experience in critical appraisal. There were 28 participants in 4 sessions held during Spring 2022. Time was allocated at the end of each session to collect participants’ feedback via a self-administered online survey. There were 22 responses, of which 41% (n=9) were pediatric residents, 22.7% (n=5) family medicine residents, 31.8% (n=7) medical students, and 4.5% (n=1) a nurse practitioner. Four respondents participated in more than one session. The “Satisfied” rates were 94.7% for session format, 100% for topic selection, 89.5% for time allocation, and 84.3% for worksheet structure. 60% of participants felt that including the sessions during the clinical ward rotation was “Feasible.” Regarding self-efficacy, participants reported being “Confident” in the tasks as follows: 89.5% for the ability to identify a relevant conundrum, 94.8% for the compare-and-contrast task, and 84.2% for the identification of a published update. All participants “Agreed” that the sessions were effective for learning EBP, and all would recommend this session for further teaching. Conclusion: We developed a modern approach to teaching EBP, enjoyed by participants of all levels, who also felt it was a useful learning experience. Our approach addresses known JC challenges by being relevant to clinical care, fostering active engagement without requiring any preparation, using available technology, and being adaptable to hybrid contexts.

Keywords: medical education, journal clubs, post-graduate teaching, andragogy, experiential learning, evidence-based practice

Procedia PDF Downloads 104
1861 Performance Evaluation of Various Segmentation Techniques on MRI of Brain Tissue

Authors: U.V. Suryawanshi, S.S. Chowhan, U.V. Kulkarni

Abstract:

Accuracy of segmentation methods is of great importance in brain image analysis. Tissue classification in magnetic resonance imaging (MRI) of the brain is an important issue in the analysis of several brain dementias. This paper portrays the performance of segmentation techniques used on brain MRI. A large variety of algorithms for the segmentation of brain MRI has been developed. The objective of this paper is to perform a segmentation process on MR images of the human brain using Fuzzy c-means (FCM), kernel-based Fuzzy c-means (KFCM), spatial Fuzzy c-means (SFCM), and improved Fuzzy c-means (IFCM). The review covers imaging modalities, MRI, and methods for noise reduction and segmentation approaches. All methods are applied to MRI brain images degraded by salt-and-pepper noise, and the results demonstrate that the IFCM algorithm is more robust to noise than the standard FCM algorithm. We conclude with a discussion of the trend of future research in brain segmentation and of changes to IFCM for better results.
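For readers unfamiliar with the clustering family compared above, the following is a minimal sketch of standard FCM on a synthetic noisy image; the KFCM, SFCM, and IFCM variants evaluated in the paper extend this objective with kernels or spatial information.

```python
# Minimal standard fuzzy c-means (FCM) sketch on a synthetic noisy image;
# the paper's KFCM/SFCM/IFCM variants add kernels or spatial terms on top of this.
import numpy as np

def fcm(pixels, c=3, m=2.0, iters=50, seed=0):
    """Cluster 1-D pixel intensities into c fuzzy classes (e.g. CSF/GM/WM)."""
    rng = np.random.default_rng(seed)
    x = pixels.reshape(-1, 1).astype(float)
    U = rng.dirichlet(np.ones(c), size=len(x))            # fuzzy memberships, rows sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ x) / Um.sum(axis=0)[:, None]    # weighted class centroids
        d = np.abs(x - centers.T) + 1e-9                  # distances to each centroid
        p = 2.0 / (m - 1.0)
        U = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=1, keepdims=True))
    return U.argmax(axis=1).reshape(pixels.shape), centers.ravel()

# Synthetic "brain slice": three intensity classes corrupted by salt-and-pepper noise.
rng = np.random.default_rng(1)
img = np.repeat(np.array([0.2, 0.5, 0.8]), 1000).reshape(30, 100)
noise = rng.random(img.shape)
img[noise < 0.05] = 0.0    # pepper
img[noise > 0.95] = 1.0    # salt

labels, centers = fcm(img)
print("estimated class centers:", np.sort(centers))
```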

Keywords: image segmentation, preprocessing, MRI, FCM, KFCM, SFCM, IFCM

Procedia PDF Downloads 312
1860 The Transformation of Architecture through the Technological Developments in History: Future Architecture Scenario

Authors: Adel Gurel, Ozge Ceylin Yildirim

Abstract:

Nowadays, design and architecture are being affected and transformed by rapid advancements in technology, economics, politics, society, and culture. Architecture has been transforming with the latest developments since the inclusion of computers in design. Integration of design into the computational environment has revolutionized architecture, and new perspectives in architecture have been gained. The history of architecture shows the various technological developments and changes through which architecture has transformed over time. Therefore, the analysis of the integration between technology and the historical architectural process makes it possible to build a consensus on the idea of how architecture is to proceed. In this study, each period that arises with the integration of technology into architecture is addressed within the historical process. At the same time, changes in architecture via technology are identified as important milestones, and predictions with regard to the future of architecture are made. Developments and changes in technology, and the use of technology in architecture over the years, are analyzed comparatively in charts and graphs. The historical process of architecture and its transformation via technology are supported with a detailed literature review and consolidated with the examination of focal points of 20th-century architecture under the titles of parametric design, genetic architecture, simulation, and biomimicry. It is concluded from this historical research between past and present that developments in architecture cannot keep up with the advancements in technology and that recent developments in technology overshadow architecture, to the extent that technology decides the direction of architecture. As a result, a scenario is presented with regard to the reach of technology in the future of architecture and the role of the architect.

Keywords: computer technologies, future architecture, scientific developments, transformation

Procedia PDF Downloads 174
1859 Effect of Aging Time on CeO2 Nanoparticle Size Distribution Synthesized via Sol-Gel Method

Authors: Navid Zanganeh, Hafez Balavi, Farbod Sharif, Mahla Zabet, Marzieh Bakhtiary Noodeh

Abstract:

Cerium oxide (CeO2), also known as cerium dioxide or ceria, is a pale yellow-white powder with various applications in industry, from wood coatings to cosmetics, filtration, fuel cell electrolytes, gas sensors, hybrid solar cells, and catalysts. In this research, attempts were made to synthesize and characterize CeO2 nanoparticles via the sol-gel method. In addition, the effect of aging time on the size of the particles was investigated. For this purpose, the aging times were set to 48, 56, 64, and 72 min. The obtained particles were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), transmission electron microscopy (TEM), and Brunauer–Emmett–Teller (BET) analysis. The XRD patterns confirmed the formation of CeO2 nanoparticles. SEM and TEM images showed cluster-shaped, spherical nanoparticles in a nanometre size range, in agreement with the XRD results. The finest particles (7.3 nm) were obtained at the optimum condition, which was an aging time of 48 min, a calcination temperature of 400 °C, and a cerium concentration of 0.004 mol. The average specific surface area of the particles at the optimum condition was measured by BET analysis and recorded as 47.57 m2/g.

Keywords: aging time, CeO2 nanoparticles, size distribution, sol-gel

Procedia PDF Downloads 440
1858 An Investigation Into an Essential Property of Creativity, Which Is the First-Person Experience

Authors: Ukpaka Paschal

Abstract:

Margaret Boden argues that a creative product is one that is new, surprising, and valuable as a result of the combination, exploration, or transformation involved in producing it. Boden uses examples of artificial intelligence systems that fit all of these criteria and argues that real creativity involves autonomy, intentionality, valuation, emotion, and consciousness. This paper provides an analysis of all these elements in order to understand whether they are sufficient to account for creativity, especially human creativity. The paper focuses on Generative Adversarial Networks (GANs), a class of artificial intelligence algorithms that are said to have disproved the common perception that creativity is something only humans possess. It then argues that Boden’s listed properties of creativity, which capture the creativity exhibited by GANs, are not sufficient to account for human creativity, and it further identifies “first-person phenomenological experience” as an essential property of human creativity. The rationale behind the proposed essential property is that if creativity involves turning our experience of the world around us into a form of self-expression, then our experience of the world really matters with regard to creativity.

Keywords: artificial intelligence, creativity, GANs, first-person experience

Procedia PDF Downloads 110
1857 Design and Performance Evaluation of Hybrid Corrugated-GFRP Infill Panels

Authors: Woo Young Jung, Sung Min Park, Ho Young Son, Viriyavudh Sim

Abstract:

This study presents a way to reduce earthquake damage and to support the emergency rehabilitation of critical structures such as schools, high-tech factories, and hospitals subjected to strong ground motions associated with climate changes. In recent events, a strong earthquake causes serious damage to critical structures, and the damaged structure might then be influenced by an aftershock sequence (or tsunami) due to fault-plane adjustments. Therefore, in order to improve the seismic performance of critical structures, studies on retrofitting or strengthening structures under aftershock sequences, following the emergency rehabilitation of structures subjected to strong earthquakes, are widely carried out. Consequently, this study used composite materials for the emergency rehabilitation of the structure, rather than concrete and steel, because of their high strength and stiffness, light weight, rapid manufacturing, and dynamic performance. The aim of this study was to develop or improve the seismic performance or seismic retrofit of critical structures subjected to strong ground motions and earthquake aftershocks by utilizing GFRP-Corrugated Infill Panels (GCIP).

Keywords: aftershock, composite material, GFRP, infill panel

Procedia PDF Downloads 325
1856 Generalized π-Armendariz Authentication Cryptosystem

Authors: Areej M. Abduldaim, Nadia M. G. Al-Saidi

Abstract:

Algebra is one of the important fields of mathematics. It concerns the study and manipulation of mathematical symbols, as well as the study of abstractions such as groups, rings, and fields. With the development of these abstractions, it has been extended to consider other structures, such as vectors, matrices, and polynomials, which are non-numerical objects. Computer algebra is the implementation of algebraic methods as algorithms and computer programs. Recently, many algebraic cryptosystem protocols based on non-commutative algebraic structures have been adopted for authentication, key exchange, and encryption-decryption processes. Cryptography is the science aimed at sending information through public channels in such a way that only an authorized recipient can read it. Ring theory is the most attractive category of algebra in the area of cryptography. In this paper, we employ the algebraic structure called skew π-Armendariz rings to design a neoteric algorithm for zero-knowledge proof. The proposed protocol is established and illustrated through a numerical example, and its soundness and completeness are proved.
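The skew π-Armendariz construction itself is not reproduced here. For orientation only, the sketch below shows the generic three-move commit-challenge-response structure that zero-knowledge identification protocols share, in the classical discrete-logarithm (Schnorr-style) setting with toy parameters; it is not the ring-theoretic protocol proposed in the paper.

```python
# Generic commit-challenge-response identification sketch (Schnorr-style, discrete log
# in a small prime-order group). It only illustrates the three-move zero-knowledge
# structure; it is NOT the skew pi-Armendariz protocol of the paper, and the toy
# parameters are far too small for real security.
import secrets

# Toy group parameters: p = 2q + 1 with q prime; g = 4 generates the order-q subgroup.
p, q, g = 2039, 1019, 4

# Prover's long-term secret x and public key y = g^x mod p.
x = secrets.randbelow(q - 1) + 1
y = pow(g, x, p)

def prove_once() -> bool:
    # 1. Commit: prover picks an ephemeral r and sends t = g^r.
    r = secrets.randbelow(q - 1) + 1
    t = pow(g, r, p)
    # 2. Challenge: verifier sends a random c.
    c = secrets.randbelow(q)
    # 3. Response: prover sends s = r + c*x mod q.
    s = (r + c * x) % q
    # Verification: g^s == t * y^c (completeness); forging s without knowing x
    # would require solving the discrete logarithm (soundness).
    return pow(g, s, p) == (t * pow(y, c, p)) % p

print(all(prove_once() for _ in range(100)))   # expected: True
```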

Keywords: cryptosystem, identification, skew π-Armendariz rings, skew polynomial rings, zero knowledge protocol

Procedia PDF Downloads 198
1855 Detecting and Disabling Digital Cameras Using D3CIP Algorithm Based on Image Processing

Authors: S. Vignesh, K. S. Rangasamy

Abstract:

The paper deals with a device capable of detecting and disabling digital cameras. The system locates the camera and then neutralizes it. Every digital camera has an image sensor known as a CCD, which is retro-reflective and sends light back directly to its original source at the same angle. The device shines infrared LED light, which is invisible to the human eye, at a distance of about 20 feet. It then collects video of these reflections with a camcorder. The video of the reflections is transferred to a computer connected to the device, where it is passed through image processing algorithms that pick out the infrared light bouncing back. Once the camera is detected, the device projects an invisible infrared laser into the camera's lens, thereby overexposing the photo and rendering it useless. Low levels of infrared laser neutralize digital cameras but pose neither a health danger to humans nor physical damage to cameras. We also discuss a simplified design of the above device that can be used in theatres to prevent piracy. The domains covered here are optics and image processing.
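A minimal sketch of the detection step described above, assuming two synthetic frames captured with the infrared emitter on and off; a real implementation would process camcorder video instead.

```python
# Sketch of the detection step only: subtract an "IR off" frame from an "IR on" frame
# and flag compact bright spots as candidate retro-reflecting lenses. Synthetic frames
# are used here in place of real camcorder frames.
import numpy as np

def detect_retroreflections(frame_ir_on, frame_ir_off, thresh=0.5):
    """Return (row, col) coordinates of pixels that brighten only under IR illumination."""
    diff = frame_ir_on.astype(float) - frame_ir_off.astype(float)
    mask = diff > thresh                      # retro-reflection -> strong localized gain
    return np.argwhere(mask)

rng = np.random.default_rng(0)
h, w = 120, 160
off = rng.normal(0.2, 0.02, (h, w))           # ambient scene, IR emitter off
on = off.copy()
on[60:63, 80:83] += 0.9                       # simulated CCD retro-reflection hot spot

hits = detect_retroreflections(on, off)
print("candidate lens pixels:", len(hits), "centroid:", hits.mean(axis=0))
```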

Keywords: CCD, optics, image processing, D3CIP

Procedia PDF Downloads 342
1854 Study on Optimization Design of Pressure Hull for Underwater Vehicle

Authors: Qasim Idrees, Gao Liangtian, Liu Bo, Miao Yiran

Abstract:

In order to improve the efficiency and accuracy of pressure hull structure optimization for underwater vehicles, a design optimization method based on response surface methodology was studied. Five dimensions of the pressure shell were taken as design variables, and the preliminary design was carried out by applying thin-shell theory and the Chinese Classification Society (CCS) specification. In order to optimize the variables over the feasible region, different methods were studied and implemented: the optimal Latin hypercube design (Opt LHD) method to determine the design sample points in the feasible domain space, parametric ABAQUS solutions for the response at each sample point, and a second-order polynomial response surface model of the ultimate load of the structure. Based on the ultimate load of the structure and the mass of the shell, a second-generation genetic algorithm was used to search the response surface, and the Pareto optimal solution set was obtained. The ultimate load of the final optimized design was 41.68% higher than that of the initial design, and the shell mass was reduced by about 27.26%. The parametric method can ensure the accuracy of the analysis and improve the efficiency of optimization.
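A compact sketch of the response-surface-plus-genetic-algorithm loop described above, with two stand-in design variables, a synthetic load function in place of the parametric ABAQUS runs, and a single-objective GA instead of the multi-objective (Pareto) search used in the study.

```python
# Response-surface sketch: fit a second-order polynomial to sampled responses, then
# search it with a simple single-objective genetic algorithm. Two design variables and a
# synthetic "ultimate load" function stand in for the five hull dimensions and the FEA
# runs; the paper itself uses Opt LHD sampling and a multi-objective (Pareto) GA.
import numpy as np

rng = np.random.default_rng(0)

def simulated_load(x):                        # placeholder for an FEA response
    t, r = x[..., 0], x[..., 1]
    return 100 + 40*t - 3*t**2 + 10*r - r**2 - 0.5*t*r

def quad_features(X):                         # [1, x1, x2, x1^2, x2^2, x1*x2]
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1*x2])

# 1. Sample the design space (stand-in for Opt LHD) and fit the response surface.
X = rng.uniform([1.0, 1.0], [10.0, 8.0], size=(40, 2))
beta, *_ = np.linalg.lstsq(quad_features(X), simulated_load(X), rcond=None)
surrogate = lambda P: quad_features(P) @ beta

# 2. Simple GA on the surrogate: tournament selection, blend crossover, Gaussian mutation.
pop = rng.uniform([1.0, 1.0], [10.0, 8.0], size=(60, 2))
for _ in range(80):
    fit = surrogate(pop)
    idx = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
    children = (parents + parents[rng.permutation(len(parents))]) / 2
    children += rng.normal(0, 0.2, children.shape)
    pop = np.clip(children, [1.0, 1.0], [10.0, 8.0])

best = pop[surrogate(pop).argmax()]
print("surrogate optimum near:", best, "predicted load:", float(surrogate(best[None])[0]))
```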

Keywords: parameterization, response surface, structure optimization, pressure hull

Procedia PDF Downloads 214
1853 Metrology-Inspired Methods to Assess the Biases of Artificial Intelligence Systems

Authors: Belkacem Laimouche

Abstract:

With the field of artificial intelligence (AI) experiencing exponential growth, fueled by technological advancements that pave the way for increasingly innovative and promising applications, there is an escalating need to develop rigorous methods for assessing their performance in pursuit of transparency and equity. This article proposes a metrology-inspired statistical framework for evaluating bias and explainability in AI systems. Drawing from the principles of metrology, we propose a pioneering approach, using a concrete example, to evaluate the accuracy and precision of AI models, as well as to quantify the sources of measurement uncertainty that can lead to bias in their predictions. Furthermore, we explore a statistical approach for evaluating the explainability of AI systems based on their ability to provide interpretable and transparent explanations of their predictions.
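As one concrete reading of the metrology analogy, the sketch below estimates a model's systematic error (bias) per subgroup together with an expanded uncertainty (coverage factor k = 2); the data, groups, and decision rule are illustrative assumptions, not the article's case study.

```python
# Metrology-style accuracy/precision sketch: for each subgroup, estimate the systematic
# error (bias) of a model's predictions, the standard uncertainty of that estimate, and
# a simple expanded uncertainty (k = 2). Data and groups are synthetic illustrations.
import numpy as np

rng = np.random.default_rng(0)
groups = {"group_A": 400, "group_B": 400}
true_value = 10.0

report = {}
for name, n in groups.items():
    # Simulated prediction errors: group_B gets an artificial systematic offset.
    offset = 0.0 if name == "group_A" else 0.8
    preds = true_value + offset + rng.normal(0.0, 1.0, n)
    errors = preds - true_value
    bias = errors.mean()                          # systematic component (trueness)
    u = errors.std(ddof=1) / np.sqrt(n)           # standard uncertainty of the bias estimate
    report[name] = (bias, 2 * u)                  # expanded uncertainty, coverage factor k=2

for name, (bias, U) in report.items():
    flag = "biased" if abs(bias) > U else "no detectable bias"
    print(f"{name}: bias = {bias:+.3f} ± {U:.3f}  ->  {flag}")
```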

Keywords: artificial intelligence, metrology, measurement uncertainty, prediction error, bias, machine learning algorithms, probabilistic models, interlaboratory comparison, data analysis, data reliability, measurement of bias impact on predictions, improvement of model accuracy and reliability

Procedia PDF Downloads 92
1852 Scientific Recommender Systems Based on Neural Topic Model

Authors: Smail Boussaadi, Hassina Aliane

Abstract:

With the rapid growth of scientific literature, it is becoming increasingly challenging for researchers to keep up with the latest findings in their fields. Academic and professional networks play an essential role in connecting researchers and disseminating knowledge. To improve the user experience within these networks, we need effective article recommendation systems that provide personalized content. Current recommendation systems often rely on collaborative filtering or content-based techniques. However, these methods have limitations, such as the cold start problem and difficulty in capturing semantic relationships between articles. To overcome these challenges, we propose a new approach that combines BERTopic, a state-of-the-art topic modeling technique built on Bidirectional Encoder Representations from Transformers (BERT), with community detection algorithms in an academic professional network. Experiments confirm our performance expectations by showing good relevance and objectivity in the results.
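A pipeline sketch under stated assumptions: BERTopic assigns topics to abstracts, a networkx community detection step groups co-authors, and candidate articles are filtered by shared topic and community. The filtering rule and helper names are illustrative; the ranking details of the proposed system are not shown.

```python
# Sketch only: BERTopic for neural topic modeling plus networkx community detection on a
# co-authorship graph. A candidate article is suggested when it shares one of the user's
# dominant topics, is not the user's own, and has an author in the user's community.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities
from bertopic import BERTopic     # pip install bertopic; needs a reasonably large corpus

def recommend(abstracts, authorships, user, top_n=5):
    """abstracts: {paper_id: text}; authorships: {paper_id: [author, ...]}."""
    ids, docs = zip(*abstracts.items())
    topics, _ = BERTopic(min_topic_size=2).fit_transform(list(docs))
    topic_of = dict(zip(ids, topics))

    # Co-authorship graph and its communities.
    g = nx.Graph()
    for paper, authors in authorships.items():
        g.add_nodes_from(authors)
        g.add_edges_from((a, b) for i, a in enumerate(authors) for b in authors[i + 1:])
    community = next((c for c in greedy_modularity_communities(g) if user in c), {user})

    user_papers = {p for p, a in authorships.items() if user in a}
    user_topics = {topic_of[p] for p in user_papers}
    candidates = [p for p in ids
                  if p not in user_papers
                  and topic_of[p] in user_topics
                  and any(a in community for a in authorships[p])]
    return candidates[:top_n]
```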

Keywords: scientific articles, community detection, academic social network, recommender systems, neural topic model

Procedia PDF Downloads 80
1851 Developing Fault Tolerance Metrics of Web and Mobile Applications

Authors: Ahmad Mohsin, Irfan Raza Naqvi, Syda Fatima Usamn

Abstract:

Applications with a higher fault tolerance index are considered more reliable and trustworthy to drive quality. In recent years, application development has shifted from traditional desktop and web software to native and hybrid applications for the web and mobile platforms. With the emergence of the Internet of Things (IoT), cloud, and big data trends, the need to measure fault tolerance for these complex applications has increased in order to evaluate their performance. There is a phenomenal gap between fault tolerance metrics development and measurement. Classic quality metric models focus on metrics for traditional systems, ignoring the software, hardware, and deployment characteristics of today’s applications. In this paper, we propose simple metrics to measure fault tolerance considering the general requirements of web and mobile applications. We have aligned factors and sub-factors using the Goal-Question-Metric (GQM) approach for metrics development, considering the nature of web and mobile apps. A systematic mathematical formulation is given to measure the metrics quantitatively. Three web and mobile applications are selected to measure the fault tolerance factors using the formulated metrics. The applications are then analysed on the basis of results from observations in a controlled environment on different mobile devices. Quantitative results are presented depicting the fault tolerance of the respective applications.
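One way such metrics can be rolled up is sketched below: normalized sub-factor measurements are aggregated into a single fault tolerance index with assumed weights. The factors, weights, and figures are placeholders, not the metrics formulated in the paper.

```python
# GQM-style aggregation sketch: sub-factor measurements normalized to [0, 1] are rolled
# up into one fault tolerance index. All names and numbers below are illustrative.
def fault_tolerance_index(measurements, weights):
    """measurements/weights: {sub_factor: value}; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * measurements[k] for k in weights)

observed = {                       # hypothetical normalized observations for one app
    "error_recovery_rate": 0.82,   # recovered faults / injected faults
    "graceful_degradation": 0.70,  # features usable during a dependency outage
    "failover_success": 0.95,      # successful failovers / attempted failovers
}
weights = {"error_recovery_rate": 0.4, "graceful_degradation": 0.3, "failover_success": 0.3}

print(f"fault tolerance index: {fault_tolerance_index(observed, weights):.2f}")
```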

Keywords: web and mobile applications, reliability, fault tolerance metric, quality metrics, GQM based metrics

Procedia PDF Downloads 327
1850 Hybridized Simulated Annealing with Chemical Reaction Optimization for Solving to Sequence Alignment Problem

Authors: Ernesto Linan, Linda Cruz, Lucero Becerra

Abstract:

In this paper, a new hybridized algorithm based on Chemical Reaction Optimization and Simulated Annealing is proposed to solve the sequence alignment problem. Chemical Reaction Optimization is a population-based meta-heuristic algorithm based on the principles of a chemical reaction. Simulated Annealing is a general-purpose method that has been applied to a large number of combinatorial optimization problems. In this paper, we propose a hybridization of the Chemical Reaction Optimization algorithm and Simulated Annealing in order to solve the sequence alignment problem. An initial population of molecules is defined at the beginning of the proposed algorithm, where each molecule represents a candidate sequence alignment. In order to simulate inter-molecular collisions, the Chemical Reaction process is placed inside the Metropolis cycle at certain values of temperature. Inside this cycle, molecules are changed by collisions, and some molecules are accepted by applying the Boltzmann probability. The results with the hybrid scheme are better than the results obtained with either method separately.
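The simulated annealing core of such a hybrid is sketched below on a toy pairwise alignment; the chemical-reaction collision operators of the full algorithm are not reproduced, and the scoring values are arbitrary.

```python
# Simulated annealing on a toy pairwise alignment. A candidate alignment ("molecule") is
# a pair of equal-length gapped rows; a neighbor move relocates one gap, and worse moves
# are accepted with the Boltzmann probability inside the Metropolis cycle. The chemical
# reaction operators of the full hybrid are not shown.
import math
import random

MATCH, MISMATCH, GAP = 2, -1, -2

def score(a, b):
    s = 0
    for x, y in zip(a, b):
        if x == "-" and y == "-":
            continue
        s += GAP if "-" in (x, y) else (MATCH if x == y else MISMATCH)
    return s

def move_gap(row):
    """Relocate one randomly chosen gap to a new position (residue order preserved)."""
    row = list(row)
    gaps = [i for i, c in enumerate(row) if c == "-"]
    row.pop(random.choice(gaps))
    row.insert(random.randrange(len(row) + 1), "-")
    return "".join(row)

def anneal(s1, s2, extra_gaps=3, t0=5.0, cooling=0.995, steps=4000):
    width = max(len(s1), len(s2)) + extra_gaps
    a, b = s1.ljust(width, "-"), s2.ljust(width, "-")
    best = (score(a, b), a, b)
    t = t0
    for _ in range(steps):
        na, nb = (move_gap(a), b) if random.random() < 0.5 else (a, move_gap(b))
        delta = score(na, nb) - score(a, b)
        if delta >= 0 or random.random() < math.exp(delta / t):   # Boltzmann acceptance
            a, b = na, nb
            best = max(best, (score(a, b), a, b))
        t *= cooling
    return best

random.seed(0)
s, a, b = anneal("GATTACAGT", "GCATGCAGT")
print(s); print(a); print(b)
```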

Keywords: chemical reaction optimization, sequence alignment problem, simulated annealing algorithm, metaheuristics

Procedia PDF Downloads 197
1849 Human Performance Evaluating of Advanced Cardiac Life Support Procedure Using Fault Tree and Bayesian Network

Authors: Shokoufeh Abrisham, Seyed Mahmoud Hossieni, Elham Pishbin

Abstract:

In this paper, a hybrid method based on fault tree analysis (FTA) and Bayesian networks (BNs) is employed to evaluate the team performance quality of advanced cardiac life support (ACLS) procedures in the emergency department. Based on American Heart Association (AHA) guidelines, a categorization of staff actions leading to clinical incidents, and discussions with emergency medicine experts, a fault tree model of the ACLS procedure is obtained based on human performance. The obtained FTA model is converted into BNs, and several different scenarios are defined to demonstrate the efficiency and flexibility of the presented BN model. Also, a sensitivity analysis is conducted to indicate the effects of team leader presence and of uncertainty in expert knowledge on the quality of ACLS. The proposed BN-based model shows how the results of risk analysis in medical procedures can be brought closer to reality compared with results obtained from FTA alone.
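A toy illustration of the FTA-to-BN idea: a two-gate fault tree with assumed (not study-derived) basic-event probabilities is enumerated exactly to obtain the top-event probability and a simple diagnostic query.

```python
# Toy fault-tree-to-Bayesian illustration with hypothetical ACLS basic events and
# assumed probabilities; the paper's actual tree and figures are not reproduced.
from itertools import product

p_basic = {
    "delayed_rhythm_check": 0.10,
    "wrong_drug_dose": 0.05,
    "no_team_leader": 0.20,
}

def top_event(e):
    # OR gate over: a wrong drug dose, or the AND of a delayed rhythm check
    # occurring while no team leader is present.
    return e["wrong_drug_dose"] or (e["delayed_rhythm_check"] and e["no_team_leader"])

def prob(assignment):
    p = 1.0
    for name, happened in assignment.items():
        p *= p_basic[name] if happened else 1.0 - p_basic[name]
    return p

names = list(p_basic)
p_top = p_joint = 0.0
for values in product([True, False], repeat=len(names)):
    e = dict(zip(names, values))
    if top_event(e):
        p_top += prob(e)
        if e["no_team_leader"]:
            p_joint += prob(e)

print(f"P(failed ACLS performance)             = {p_top:.4f}")
print(f"P(no team leader | failed performance) = {p_joint / p_top:.4f}")
```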

Keywords: advanced cardiac life support, fault tree analysis, Bayesian belief networks, human performance, healthcare systems

Procedia PDF Downloads 133
1848 Structured Cross System Planning and Control in Modular Production Systems by Using Agent-Based Control Loops

Authors: Simon Komesker, Achim Wagner, Martin Ruskowski

Abstract:

In times of volatile markets with fluctuating demand and uncertainty in global supply chains, flexible production systems are the key to an efficient implementation of a desired production program. In this publication, the authors present a holistic information concept that takes into account various influencing factors for operating towards the global optimum. To this end, a strategy for implementing multi-level planning for a flexible, reconfigurable production system with an alternative production concept in the automotive industry is developed. The main contribution of this work is a system structure mixing centralized and decentralized planning and control, evaluated in a simulation framework. The information system structure of current production systems in the automotive industry is rigidly and hierarchically organized in monolithic systems. The production program is created in a rule-based manner with the premise of achieving a uniform cycle time. This program then provides the information basis for execution in subsystems at the station and process execution levels. In today's era of mixed-(car-)model factories, complex conditions and conflicts arise in achieving logistics, quality, and production goals. There is no provision for feedback loops that carry results from the process execution level (resources) and the process-supporting (quality and logistics) systems back for reconsideration in the planning systems. To enable a robust production flow, the complexity of production system control is artificially reduced by the line structure, which results, for example, in material-intensive processes (buffers and safety stocks, with the two-container principle applied even for different variants). The limited degrees of freedom of line production have produced the principle of progress-figure control, which results in one-time sequencing, sequential order release, and relatively inflexible capacity control. As a result, modularly structured production systems, such as modular production according to known approaches with more degrees of freedom, are currently difficult to represent in terms of information technology. The remedy is an information concept that supports cross-system and cross-level information processing for centralized and decentralized decision-making. Through an architecture of hierarchically organized but decoupled subsystems, the paradigm of hybrid control is used, and a holonic manufacturing system is offered, which enables flexible information provisioning and processing support. In this way, the influences from quality, logistics, and production processes can be linked holistically with the advantages of mixed centralized and decentralized planning and control. Modular production systems also require modularly networked information systems with semi-autonomous optimization for a robust production flow. Dynamic prioritization of different key figures between subsystems should lead the production system to an overall optimum. The tasks and goals of the quality, logistics, process, resource, and product areas in a cyber-physical production system are designed as an interconnected multi-agent system. The result is an alternative system structure that executes centralized process planning and decentralized processing. An agent-based manufacturing control is used to enable different flexibility and reconfigurability states and manufacturing strategies in order to find optimal partial solutions of subsystems that lead to a near-global optimum for hybrid planning. This allows robust, near-to-plan execution with integrated quality control and intralogistics.
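A minimal sketch of the mixed central/decentral idea: a central planner releases orders while station agents bid on them using local load and quality feedback. The agents, weights, and bidding rule are illustrative assumptions, not the architecture developed in the paper.

```python
# Central order release with decentral, agent-based assignment. All names, weights, and
# the bidding rule are illustrative placeholders.
from dataclasses import dataclass, field

@dataclass
class StationAgent:
    name: str
    queue: list = field(default_factory=list)
    quality_score: float = 1.0        # fed back from the quality subsystem

    def bid(self, order):
        # Lower bid = better: penalize long local queues and poor quality feedback.
        # A real agent would also inspect the order's logistics priority.
        return len(self.queue) + (1.0 - self.quality_score) * 5.0

class CentralPlanner:
    def __init__(self, agents):
        self.agents = agents

    def release(self, orders):
        for order in orders:                       # central sequencing ...
            winner = min(self.agents, key=lambda a: a.bid(order))
            winner.queue.append(order)             # ... decentral execution assignment
        return {a.name: list(a.queue) for a in self.agents}

stations = [StationAgent("cell_1"), StationAgent("cell_2", quality_score=0.8)]
print(CentralPlanner(stations).release(["body_A", "body_B", "body_C"]))
```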

Keywords: holonic manufacturing system, modular production system, planning and control, system structure

Procedia PDF Downloads 160
1847 Daylightophil Approach towards High-Performance Architecture for Hybrid-Optimization of Visual Comfort and Daylight Factor in BSk

Authors: Mohammadjavad Mahdavinejad, Hadi Yazdi

Abstract:

The greatest influence we receive from the world is shaped through visual form; thus, light is an inseparable element of human life. The use of daylight in visual perception and environment readability is an important issue for users. With regard to the hazards of greenhouse gas emissions from fossil fuels, and in line with attitudes towards the reduction of energy consumption, the correct use of daylight results in lower levels of energy consumed by artificial lighting, heating, and cooling systems. Windows are usually the starting points for analyses and simulations to achieve visual comfort and energy optimization; therefore, attention should be paid to the orientation of buildings to minimize electrical energy use and maximize the use of daylight. In this paper, using the DesignBuilder software, the effect of the orientation of an 18 m² (3 m × 6 m) room with a 3 m height in the city of Tehran has been investigated, considering the design constraint limitations. In these simulations, the orientation of the building has been changed in one-degree steps, and the window is located on the smaller face (3 m × 3 m) of the building with an 80% ratio. The results indicate that the orientation of the building has a great deal to do with energy efficiency in meeting high-performance architecture and planning goals and objectives.

Keywords: daylight, window, orientation, energy consumption, design builder

Procedia PDF Downloads 215
1846 Using A Blockchain-Based, End-to-End Encrypted Communication System Between Mobile Terminals to Improve Organizational Privacy

Authors: Andrei Bogdan Stanescu, Robert Stana

Abstract:

Creating private and secure communication channels between employees has become a critical aspect of ensuring organizational integrity and avoiding leaks of sensitive information. With the widespread use of modern methods of disrupting communication between users, real use cases for advanced encryption mechanisms have emerged to defeat cyber-attackers willing to intercept private conversations between critical employees in an organization. This paper aims to present a custom implementation of a messaging application named “Whisper” that uses end-to-end encryption (E2EE) mechanisms and blockchain-related components to protect sensitive conversations and mitigate the risk of information breaches inside organizations. The results of this research aim to expand the areas of applicability of E2EE algorithms and their integration with private blockchains in chat applications as a viable method of enhancing intra-organizational communication privacy.
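A generic E2EE sketch with the Python cryptography package, showing an X25519 key agreement feeding an HKDF-derived AES-GCM session key, plus a hash-chained record of ciphertext digests as a stand-in for the private-blockchain layer; this is not the Whisper implementation itself.

```python
# End-to-end encryption sketch with the "cryptography" package: the relaying server only
# ever sees ciphertext. Generic illustration, not the paper's Whisper application; the
# blockchain integrity layer is only hinted at by the hash chain at the end.
import os, hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def session_key(own_private, peer_public):
    shared = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"whisper-demo-session").derive(shared)

alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()
k_alice = session_key(alice, bob.public_key())
k_bob = session_key(bob, alice.public_key())        # both ends derive the same key

nonce = os.urandom(12)
ciphertext = AESGCM(k_alice).encrypt(nonce, b"quarterly figures are confidential", None)
print(AESGCM(k_bob).decrypt(nonce, ciphertext, None))

# Hash-chained record of ciphertext digests (a stand-in for the private-blockchain log).
chain = [hashlib.sha256(b"genesis").hexdigest()]
chain.append(hashlib.sha256((chain[-1] + ciphertext.hex()).encode()).hexdigest())
print("audit chain head:", chain[-1][:16], "...")
```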

Keywords: end-to-end encryption, mobile communication, cryptography, communication security, data privacy

Procedia PDF Downloads 68
1845 The Role of Artificial Intelligence Algorithms in Decision-Making Policies

Authors: Marisa Almeida Araújo

Abstract:

Artificial intelligence (AI) tools are being used (including in the criminal justice system) and are becoming increasingly popular. The neuralgic center of the many questions that these (future) super-beings pose is rooted in the (old) problematic relation between rationality and morality. For instance, if we follow a Kantian perspective in which morality derives from rationality, an AI that surpasses man in rationality will also surpass him in ethical and moral standards, questioning the nature of mind, the consciousness of self and others, and morality. The recognition of superior intelligence in a non-human being puts us in the contingency of having to recognize a peer in a new form of coexistence and social relationship. Just think of the humanoid robot Sophia, capable of reasoning and conversation (and who has been granted Saudi citizenship, a fact that symbolically demonstrates our empathy with the being). Machines may come to have a more intelligent mind and even, eventually, higher ethical standards, to which, under the alluded categorical imperative, we would have to subject ourselves under penalty of contradiction with the universal Kantian law. Recognizing the complex ethical and legal issues and the significant impact on human rights and democratic functioning itself is the goal of our work.

Keywords: ethics, artificial intelligence, legal rules, principles, philosophy

Procedia PDF Downloads 181
1844 Designing Sustainable Building Based on Iranian's Windmills

Authors: Negar Sartipzadeh

Abstract:

Energy-conscious design, which coordinates with the Earth's ecological systems during its life cycle, has the least negative impact on the environment and the least waste of resources. Due to the increase in world population as well as the consumption of fossil fuels, which causes the production of greenhouse gases and environmental pollution, mankind is looking for renewable and sustainable energies. Iranian native construction is clear evidence of energy-aware design. Our predecessors were forced to rely on natural resources and sustainable energies, and in doing so addressed environmental issues that the world has only recently begun to consider. One of these endless energies is wind energy. The foundations of traditional Iranian architecture are an appropriate model for solving the contemporary environmental and energy crises. What follows in this paper is an effort to recognize and introduce the unique characteristics of Iranian architecture in the application of aerodynamic and hydraulic energies derived from the wind, which are the most common and major types of sustainable energy use in the traditional architecture of Iran. Therefore, this research reviews windmills, as a model of Iranian native construction, and how they deal with sustainable energy sources, and attempts to offer hybrid system suggestions for application in the design of new buildings in a region with such potential as Nashtifan.

Keywords: renewable energy, sustainable building, windmill, Iranian architecture

Procedia PDF Downloads 404
1843 The Influence of Polymorphisms of NER System Genes on the Risk of Colorectal Cancer in the Polish Population

Authors: Ireneusz Majsterek, Karolina Przybylowska, Lukasz Dziki, Adam Dziki, Jacek Kabzinski

Abstract:

Colorectal cancer (CRC) is one of the deadliest cancers. Every year we see an increase in the number of cases, and in spite of intensive research, the etiology of the disease remains unknown. For many years, researchers have been seeking to associate genetic factors with an increased risk of CRC; so far, a compelling link has been established between the MMR DNA repair system and hereditary nonpolyposis colorectal cancer (HNPCC). Currently, research is focused on finding relationships between the remaining DNA repair systems and an increased risk of developing colorectal cancer. The aim of this study was to determine the relationship between the Ser835Ser polymorphism of the XPF gene and the Gly23Ala polymorphism of the XPA gene, both elements of the NER DNA repair system, and modulation of the risk of colorectal cancer in the Polish population. Determining the molecular basis of the carcinogenesis process and predicting increased risk will make it possible to qualify patients for an increased-risk group and include them in a preventive program. We used blood collected from 110 patients diagnosed with colorectal cancer. The control group consisted of an equal number of healthy people. Genotyping was performed by the TaqMan method. The obtained results indicate that the 23Gly/Ala genotype of the XPA gene is associated with an increased risk of colorectal cancer, while the 23Ala/Ala genotype, as well as the TCT allele of the XPF Ser835Ser polymorphism, may reduce the risk of CRC.
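The standard case-control arithmetic behind such genotype-risk associations is sketched below (odds ratio with a 95% confidence interval); the genotype counts are hypothetical placeholders used only to show the calculation, not the study's data.

```python
# Odds ratio and 95% CI for carrying a genotype in a case-control design. The 2x2 counts
# below are hypothetical and illustrative only.
import math

def odds_ratio(case_exposed, case_unexposed, ctrl_exposed, ctrl_unexposed):
    or_ = (case_exposed * ctrl_unexposed) / (case_unexposed * ctrl_exposed)
    se_log = math.sqrt(1/case_exposed + 1/case_unexposed + 1/ctrl_exposed + 1/ctrl_unexposed)
    lo, hi = (math.exp(math.log(or_) + z * se_log) for z in (-1.96, 1.96))
    return or_, lo, hi

# Hypothetical table: XPA 23Gly/Ala carriers vs non-carriers in 110 cases / 110 controls.
or_, lo, hi = odds_ratio(case_exposed=55, case_unexposed=55,
                         ctrl_exposed=35, ctrl_unexposed=75)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```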

Keywords: NER, colorectal cancer, XPA, XPF, polymorphisms

Procedia PDF Downloads 551
1842 Exhaustive Study of Essential Constraint Satisfaction Problem Techniques Based on N-Queens Problem

Authors: Md. Ahsan Ayub, Kazi A. Kalpoma, Humaira Tasnim Proma, Syed Mehrab Kabir, Rakib Ibna Hamid Chowdhury

Abstract:

Constraint Satisfaction Problems (CSPs) are observed in various applications, i.e., scheduling problems, timetabling problems, assignment problems, etc. Researchers adopt a CSP technique to tackle a certain problem; however, each technique follows a different approach and way of solving a problem network. In our exhaustive study, it has been possible to visualize the processes of essential CSP algorithms on a very concrete constraint satisfaction example, the N-Queens problem, in order to gain a deep understanding of how a particular constraint satisfaction problem will be dealt with by the techniques we studied and implemented. Besides, benchmark results - time vs. value of N in N-Queens - have been generated from our implemented approaches, which help us understand at what scale each algorithm still produces solutions, especially for the N-Queens puzzle. Thus, informed decisions can be made when instantiating a real-life problem within the CSP framework.
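A runnable sketch of two of the studied techniques, backtracking with forward checking and the MRV heuristic, applied to N-Queens; arc consistency, MAC, backjumping, and LCV are omitted for brevity.

```python
# Backtracking with forward checking and the minimum-remaining-values (MRV) heuristic
# on N-Queens: one queen per column, domains are candidate rows.
def solve_nqueens(n):
    domains = {col: set(range(n)) for col in range(n)}
    assignment = {}

    def consistent(c1, r1, c2, r2):
        return r1 != r2 and abs(r1 - r2) != abs(c1 - c2)

    def backtrack():
        if len(assignment) == n:
            return dict(assignment)
        col = min((c for c in domains if c not in assignment),
                  key=lambda c: len(domains[c]))            # MRV: fewest values left
        for row in sorted(domains[col]):
            assignment[col] = row
            pruned = []                                     # forward checking
            for other in domains:
                if other in assignment:
                    continue
                for r in [r for r in domains[other] if not consistent(col, row, other, r)]:
                    domains[other].discard(r)
                    pruned.append((other, r))
            if all(domains[c] for c in domains if c not in assignment):
                result = backtrack()
                if result:
                    return result
            for other, r in pruned:                         # undo pruning on failure
                domains[other].add(r)
            del assignment[col]
        return None

    return backtrack()

print(solve_nqueens(8))   # prints one column -> row assignment
```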

Keywords: arc consistency (AC), backjumping algorithm (BJ), backtracking algorithm (BT), constraint satisfaction problem (CSP), forward checking (FC), least constrained values (LCV), maintaining arc consistency (MAC), minimum remaining values (MRV), N-Queens problem

Procedia PDF Downloads 351
1841 Phylogenetic Analysis of the Thunnus Tuna Fish Using Cytochrome C Oxidase Subunit I Gene Sequence

Authors: Yijun Lai, Saber Khederzadeh, Lingshaung Han

Abstract:

Species in Thunnus are organized according to the similarity between them. The closeness between T. maccoyii, T. thynnus, T. tonggol, T. atlanticus, T. albacares, T. obesus, T. alalunga, and T. orientalis varies in degree. However, to the authors' best knowledge, the genetic pattern of differentiation has not yet been presented at the level of individuals. Hence, we aimed to analyze the differences among individuals of tuna species to identify the factors that contribute to maternal lineage variety, using cytochrome c oxidase subunit I (COXI) gene sequences. Our analyses provided evidence of shared lineages in Thunnus. A phylogenetic analysis revealed that these lineages are basal to the other sequences. We also showed a close connection between the T. tonggol, T. thynnus, and T. albacares populations. In addition, the majority of the T. orientalis samples clustered with the T. alalunga and, subsequently, the T. atlanticus populations. Phylogenetic trees and migration modeling revealed the high proximity of T. thynnus sequences to a few T. orientalis sequences and suggested possible gene flow with the T. tonggol and T. albacares lineages, while all T. obesus samples clustered uniquely with each other. Our results support the presence of old maternal lineages in Thunnus, as a legacy of an ancient wave of colonization or migration.

Keywords: Thunnus Tuna, phylogeny, maternal lineage, COXI gene

Procedia PDF Downloads 272
1840 Housing Price Prediction Using Machine Learning Algorithms: The Case of Melbourne City, Australia

Authors: The Danh Phan

Abstract:

House price forecasting is a main topic in real estate market research. Effective house price prediction models not only allow home buyers and real estate agents to make better data-driven decisions but may also benefit the property policymaking process. This study investigates the housing market by using machine learning techniques to analyze real historical house sale transactions in Australia. It seeks useful models that could be deployed as an application for house buyers and sellers. Data analytics show a high discrepancy between house prices in the most expensive and the most affordable suburbs of the city of Melbourne. In addition, experiments demonstrate that the combination of stepwise feature selection and the Support Vector Machine (SVM), based on the Mean Squared Error (MSE) measurement, consistently outperforms other models in terms of prediction accuracy.
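A sketch of the winning model family on synthetic data: forward feature selection (scikit-learn's SequentialFeatureSelector as a stand-in for classic stepwise selection) feeding a scaled SVM regressor evaluated by MSE. The feature names and data are invented for illustration; the study uses real Melbourne sale transactions.

```python
# Stepwise-style selection + SVM regression sketch on synthetic "house price" data
# (prices in thousands of dollars). Not the study's dataset or tuned model.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 600
X = np.column_stack([
    rng.integers(1, 6, n),          # rooms
    rng.uniform(2, 25, n),          # distance to CBD (km)
    rng.uniform(80, 800, n),        # land size (m^2)
    rng.normal(0, 1, n),            # an uninformative feature
])
price = 250 + 90 * X[:, 0] - 12 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0, 40, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, price, random_state=0)
model = make_pipeline(
    StandardScaler(),
    SequentialFeatureSelector(SVR(kernel="linear"), n_features_to_select=3),  # "stepwise"
    SVR(kernel="rbf", C=500, epsilon=5),
)
model.fit(X_tr, y_tr)
print(f"test MSE: {mean_squared_error(y_te, model.predict(X_te)):.1f}")
```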

Keywords: house price prediction, regression trees, neural network, support vector machine, stepwise

Procedia PDF Downloads 206
1839 DNA PLA: A Nano-Biotechnological Programmable Device

Authors: Hafiz Md. Hasan Babu, Khandaker Mohammad Mohi Uddin, Md. Istiak Jaman Ami, Rahat Hossain Faisal

Abstract:

Computing in biomolecular programming is performed through different types of reactions. Proteins and nucleic acids are used to store the information generated by biomolecular programming. DNA (deoxyribonucleic acid) can be used to build a molecular computing system and operating system because of its predictable molecular behavior. DNA devices have clear advantages over conventional devices when applied to problems that can be divided into separate, non-sequential tasks. The reason is that DNA strands can hold so much data in memory and conduct multiple operations at once, thus solving decomposable problems much faster. A Programmable Logic Array, abbreviated as PLA, is a programmable device having programmable AND operations and OR operations. In this paper, a DNA PLA is designed from different molecular operations using DNA molecules with the proposed algorithms. The molecular PLA can take advantage of DNA's physical properties to store information and perform calculations. These include extremely dense information storage, enormous parallelism, and extraordinary energy efficiency.
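For reference, the electronic behavior being mimicked molecularly is easy to state in code: a programmable AND plane forms product terms from the inputs and their complements, and a programmable OR plane sums selected products. The programmed functions below are an arbitrary example, not the paper's DNA-level design.

```python
# PLA evaluation sketch: AND plane of product terms, OR plane of selected products.
def pla(inputs, and_plane, or_plane):
    """inputs: dict of bit values; planes: lists of literal names / product-term indices."""
    literals = {**inputs, **{f"~{k}": 1 - v for k, v in inputs.items()}}
    products = [int(all(literals[lit] for lit in term)) for term in and_plane]
    return [int(any(products[i] for i in out)) for out in or_plane]

# Product terms over inputs a, b (a literal prefixed with ~ is the complement).
and_plane = [("a", "b"), ("a", "~b"), ("~a", "b")]
or_plane = [[0], [1, 2]]                 # output0 = a AND b, output1 = a XOR b

for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b} ->", pla({"a": a, "b": b}, and_plane, or_plane))
```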

Keywords: biological systems, DNA computing, parallel computing, programmable logic array, PLA, DNA

Procedia PDF Downloads 111
1838 Exploitation of Variability for Salinity Tolerance in Maize Hybrids (Zea Mays L.) at Early Growth Stage

Authors: Abdul Qayyum, Hafiz Muhammad Saeed, Mamoona Hanif, Etrat Noor, Waqas Malik, Shoaib Liaqat

Abstract:

Salinity is an extremely serious problem that has a drastic effect on the maize crop and the environment and causes economic losses for the country. An advanced technique to overcome salinity is to develop salt-tolerant genotypes, which requires the screening of a huge germplasm collection to start a breeding program. Therefore, the present study was undertaken to screen 25 maize hybrids of different origins for salinity tolerance at the seedling stage under three levels of salt stress, 250 and 300 mM NaCl, plus one control. The existence of variation in tolerance to enhanced NaCl salinity levels at the seedling stage in maize proved that the hybrids differed in their ability to grow under a saline environment and showed potential variability within the species. Almost all of the twenty-five maize hybrids responded differently to the different salinity levels. However, the maize hybrids H6, H13, H21, H23, and H24 performed better under salt stress in terms of all six characters and proved to be highly tolerant, while H22, H17, H20, H18, H4, H9, and H8 were identified as moderately tolerant. Hybrids H14, H5, H11, H3, H12, and H2 were the most sensitive to salinity, suggesting that screening is an effective tool to exploit genetic variation among maize hybrids and that salt tolerance in maize can be enhanced through selection and breeding procedures.

Keywords: salinity, hybrids, maize, variation

Procedia PDF Downloads 696
1837 HPPDFIM-HD: Transaction Distortion and Connected Perturbation Approach for Hierarchical Privacy Preserving Distributed Frequent Itemset Mining over Horizontally-Partitioned Dataset

Authors: Fuad Ali Mohammed Al-Yarimi

Abstract:

Many algorithms have been proposed to provide privacy preservation in data mining. These protocols are based on two main approaches: the perturbation approach and the cryptographic approach. The first is based on perturbation of the valuable information, while the second uses cryptographic techniques. The perturbation approach is much more efficient but offers reduced accuracy, while the cryptographic approach can provide solutions with perfect accuracy. However, the cryptographic approach is a much slower method and requires considerable computation and communication overhead. In this paper, a new scalable protocol is proposed which combines the advantages of the perturbation and distortion approaches with the cryptographic approach to perform privacy-preserving distributed frequent itemset mining on horizontally distributed data. Both the privacy and performance characteristics of the proposed protocol are studied empirically.
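The perturbation side of such a protocol can be illustrated with simple randomized response: each site flips item-presence bits with a known probability, and the miner de-biases the aggregated counts to estimate true support. This sketch omits the cryptographic layer of the proposed protocol.

```python
# Randomized-response perturbation of transactions and unbiased support estimation.
# Illustrative only; not the paper's HPPDFIM-HD protocol.
import random

def perturb(transaction, items, p=0.8):
    """Keep each item's true presence bit with prob. p, otherwise report a random bit."""
    return {i: (i in transaction) if random.random() < p else random.random() < 0.5
            for i in items}

def estimate_support(reports, item, p=0.8):
    observed = sum(r[item] for r in reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p        # invert the expected distortion

random.seed(1)
items = ["bread", "milk", "beer"]
true_db = [{"bread", "milk"}] * 600 + [{"beer"}] * 400       # true support(milk) = 0.6
reports = [perturb(t, items) for t in true_db]
print("estimated support(milk):", round(estimate_support(reports, "milk"), 3))
```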

Keywords: anonymity data, data mining, distributed frequent itemset mining, gaussian perturbation, perturbation approach, privacy preserving data mining

Procedia PDF Downloads 490
1836 Approximation of Convex Set by Compactly Semidefinite Representable Set

Authors: Anusuya Ghosh, Vishnu Narayanan

Abstract:

The approximation of a convex set by a semidefinite representable set plays an important role in semidefinite programming, especially in modern convex optimization. To optimize a linear function over a convex set is a hard problem, but optimizing the linear function over the semidefinite representable set which approximates the convex set is easy, as there exist numerous efficient algorithms for solving semidefinite programming problems. So, our approximation technique is significant in optimization. We develop a technique to approximate any closed convex set, say K, by a compactly semidefinite representable set. Further, we prove that there exists a sequence of compactly semidefinite representable sets which gives progressively tighter approximations of the closed convex set K. We discuss the convergence of the sequence of compactly semidefinite representable sets to the closed convex set K. The recession cone of K and the recession cone of the compactly semidefinite representable set are equal, so we say that the sequence of compactly semidefinite representable sets converges strongly to the closed convex set. Thus, this approximation technique is a very useful development in semidefinite programming.
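For background, the standard notion of semidefinite representability assumed by the abstract can be written as follows (the notation is generic, not the authors'): a set S in R^n is semidefinite representable if it is the projection of the feasible set of a linear matrix inequality,

```latex
\[
  S \;=\; \Bigl\{ x \in \mathbb{R}^{n} \;:\; \exists\, u \in \mathbb{R}^{m},\;
  A_{0} + \sum_{i=1}^{n} x_{i} A_{i} + \sum_{j=1}^{m} u_{j} B_{j} \succeq 0 \Bigr\},
\]
```

where the A_i and B_j are real symmetric matrices and "⪰ 0" denotes positive semidefiniteness; the "compactly" semidefinite representable variant studied in the paper refines this notion.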

Keywords: semidefinite programming, semidefinite representable set, compactly semidefinite representable set, approximation

Procedia PDF Downloads 367
1835 Antimicrobial Peptide Produced by Lactococcus garvieae with a Broad Inhibition Spectrum

Authors: Hai Chi, Ibrahim Mehmeti, Kirill Ovchinnikov, Hegle Holo, Ingolf F. Nes, Dzung B. Diep

Abstract:

By using a panel of multiple indicator strains of different bacterial species and genera, we screened a large collection of bacterial isolates (over 1800 isolates) derived from raw milk for bacteriocin producers with broad inhibition spectra (BIS). Fourteen isolates with BIS were identified, and by 16S rDNA sequencing they were found to belong to Lactococcus garvieae (10 isolates) and Enterococcus faecalis (4 isolates). Further analysis of the ten L. garvieae isolates revealed that they were very similar, if not identical, to each other in metabolic and genetic terms: they had the same fermentation profile on different types of sugars and the same repetitive sequence-based PCR (rep-PCR) DNA pattern, and they all had the same inhibition profile towards over 50 isolates of different species. The bacteriocin activity from one of the L. garvieae isolates was assessed further. The bacteriocin, which was termed garvicin KS, was found to be heat-stable and proteinase-labile, and its inhibition spectrum contained many distantly related genera of Firmicutes, comprising most lactic acid bacteria (LAB) as well as problematic species of Bacillus, Listeria, Streptococcus, and Staphylococcus and their antibiotic-resistant derivatives (e.g., VRE, MRSA). Taken together, the results indicate that this is a potent bacteriocin from L. garvieae and that its very broad inhibition spectrum can be a very useful property for food preservation as well as for the treatment of infections caused by Gram-positive pathogens and their antibiotic-resistant derivatives.

Keywords: bacteriocin, lactic acid bacteria, Lactococcus garvieae, antibiotic resistance

Procedia PDF Downloads 226
1834 Studies on Mechanical Behavior of Kevlar/Kenaf/Graphene Reinforced Polymer Based Hybrid Composites

Authors: H. K. Shivanand, Ranjith R. Hombal, Paraveej Shirahatti, Gujjalla Anil Babu, S. ShivaPrakash

Abstract:

When it comes to the selection of materials, knowledge of materials science plays a vital role in the selection and enhancement of material properties. In the world of materials science, a composite material has a significant role based on its application. Composite materials are those in which two or more components with different physical and chemical properties are combined to create a new substance with enhanced properties. In this study, three different materials (Kenaf, Kevlar, and graphene) were chosen based on their properties, and a composite material was developed with the help of the vacuum bagging process. The fiber (Kenaf and Kevlar) to resin (vinyl ester) ratio was maintained at 70:30 during the process, and 0.5%, 1%, and 1.5% graphene was added during the fabrication process. The material was machined to the dimensions of the ASTM standards (300×300 mm, thickness 3 mm) with the help of a water jet cutting machine. The composite materials were tested for mechanical properties such as interlaminar shear strength (ILSS) and flexural strength. It was found that there is a significant increase in the material properties of the developed composite material.

Keywords: Kevlar, Kenaf, graphene, vacuum bagging process, Interlaminar shear strength test, flexural test

Procedia PDF Downloads 71