Search results for: LCA tools and data
27158 Applications of Forensics/DNA Tools in Combating Gender-Based Violence: A Case Study in Nigeria
Authors: Edeaghe Ehikhamenor, Jennifer Nnamdi
Abstract:
Introduction: Gender-based violence (GBV) was a well-known global crisis before the COVID-19 pandemic, and the pandemic burden only intensified it. With prevailing lockdowns, increased poverty due to high unemployment (especially affecting females), and other mobility restrictions that left many women trapped with their abusers, plus isolation from social contact and support networks, GBV cases spiraled out of control. The prevalence of economic and cultural disparity, which is strongly manifested in Nigeria, is a major contributory factor to GBV. This is made worse by religious adherents among whom females are virtually relegated to the background. Our societal approaches to investigating these major vices and sanctioning culprits have not sufficiently applied forensic/DNA tools. Violence against women, or in rare cases against men, can prevent them from carrying out their duties regardless of the position they hold. Objective: The main objective of this research is to highlight the origin of GBV, its victims, types, and contributing factors, and the applications of forensic/DNA tools and remedies so as to minimize GBV in our society. Methods: Descriptive information was obtained through searches of daily newspapers, electronic media, and Google Scholar, other authors' observations, personal experiences, and anecdotal reports. Results: Findings from our exploratory searches revealed a high incidence of GBV with very limited or no application of forensic/DNA tools as an intervening mechanism to reduce GBV in Nigeria. Conclusion: Nigeria needs to develop clear-cut policies on forensic/DNA tools, in terms of an institutional framework, to develop a curriculum for the training of all stakeholders to fast-track justice for victims of GBV and serve as a deterrent to other culprits.
Keywords: gender-based violence, forensics, DNA, justice
Procedia PDF Downloads 84
27157 The Significant Effect of Wudu’ and Zikr in the Controlling of Emotional Pressure Using Biofeedback Emwave Technique
Authors: Mohd Anuar Awang Idris, Muhammad Nubli Abdul Wahab, Nora Yusma Mohamed Yusoff
Abstract:
Wudu’ (ablution) and Zikr are among the spiritual tools that may help an individual control his mind, emotions, and attitude. These tools are deemed able to deliver a positive impact on an individual’s psychophysiology. The main objective of this research is to determine the effects of Wudu’ (ablution) and Zikr therapy using the biofeedback emWave application and technology. For this research, 13 students were selected as samples from the students’ representative body at Universiti Tenaga Nasional, Malaysia. The DASS (Depression Anxiety Stress Scale) questionnaire was used to help assess and measure each student’s ability to control his or her emotions before and after the therapies. The biofeedback emWave technology was utilized to monitor the students’ psychophysiology levels. In addition, the data obtained from the heart rate variability (HRV) test were also used to affirm that Wudu’ and Zikr had significant impacts on the students’ success in controlling their emotional pressure.
Keywords: biofeedback EmWave, emotion, psychophysiology, wudu’, zikr
Procedia PDF Downloads 205
27156 July 15 Coup Attempt and the Use of New Communication Technologies
Authors: Yasemin Gulsen Yilmaz, Suleyman Hakan Yilmaz, Muhammet Erbay
Abstract:
The new communication technologies have gradually improved their efficiency in all fields of life and made their presence irreplaceable. These technologies, which appear differently in every aspect of life, also showed themselves during the failed coup attempt in Turkey. The evening of July 15, 2016, has already taken its place in Turkish political history. That evening, the Turkish nation confronted a coup attempted by a group within the Turkish Armed Forces, and the scenes of confrontation between the coup plotters and the resisting civilians were watched minute by minute by people using the new communication technologies. Pro-coup soldiers and the resisting groups that came face to face in the streets of metropolitan cities made very active use of new media tools for their in-group communications. New media turned into the most important weapon both for the coup plotters and for those who resisted; by the morning of the next day, whoever had used these tools better had the upper hand. The civilians were successful in protecting democracy not only by resisting tanks and bullets but also by following the internet, organising on social media, sharing information and photos on the net, and telling large masses about their experiences through these technologies. In this study, we focused on and analysed the use of new media both by coup soldiers and by resisting civilians during the failed coup attempt of July 15. Within the scope of this study, coup attempt news items that appeared in the printed media within one week were examined; information about the use of new media tools during the night of the failed coup was compiled; and it was determined how, to what extent, and for what purposes these tools were used and how effective they were.
Keywords: communication, July 15, new media, media
Procedia PDF Downloads 243
27155 Complementing Assessment Processes with Standardized Tests: A Work in Progress
Authors: Amparo Camacho
Abstract:
ABET-accredited programs must assess the development of student learning outcomes (SOs) in engineering programs. Different institutions implement different strategies for this assessment, usually designed “in house.” This paper presents a proposal for including standardized tests to complement the ABET assessment model in an engineering college made up of six distinct engineering programs. The engineering college formulated a model of quality assurance in education, implemented throughout the six engineering programs, to regularly assess and evaluate the achievement of SOs in each program offered. The model uses diverse techniques and sources of data to assess student performance and to implement actions for improvement based on the results of this assessment. The model is called the “Assessment Process Model”, and it includes SOs A through K, as defined by ABET. SOs can be divided into two categories: “hard skills” and “professional skills” (soft skills). The first includes abilities such as applying knowledge of mathematics, science, and engineering and designing and conducting experiments, as well as analyzing and interpreting data. The second category, “professional skills”, includes communicating effectively and understanding professional and ethical responsibility. Within the Assessment Process Model, various tools were used to assess SOs related to both “hard” and “soft” skills. The assessment tools designed included rubrics, surveys, questionnaires, and portfolios. In addition to these instruments, the engineering college decided to use tools that systematically gather consistent quantitative data. For this reason, an in-house exam was designed and implemented, based on the curriculum of each program. Even though this exam was administered during various academic periods, it is not currently considered standardized.
In 2017, the engineering college included three standardized tests: one to assess mathematical and scientific reasoning and two more to assess reading and writing abilities. With these exams, the college hopes to obtain complementary information that can help better measure the development of both the hard and soft skills of students in the different engineering programs. In the first semester of 2017, the three exams were given to three sample groups of students from the six engineering programs, drawn from the first-, fifth-, and tenth-semester cohorts. At the time of submission of this paper, the engineering college has descriptive statistical data and is working with statisticians on a more in-depth and detailed analysis of the sample students’ achievement on the three exams. The overall objective of including standardized exams in the assessment model is to identify more precisely the least developed SOs in order to define and implement the educational strategies necessary for students to achieve them in each engineering program.
Keywords: assessment, hard skills, soft skills, standardized tests
Procedia PDF Downloads 284
27154 Comparison of Irradiance Decomposition and Energy Production Methods in a Solar Photovoltaic System
Authors: Tisciane Perpetuo e Oliveira, Dante Inga Narvaez, Marcelo Gradella Villalva
Abstract:
Installations of solar photovoltaic systems have increased considerably in the last decade. It has therefore been noticed that monitoring meteorological data (solar irradiance, air temperature, wind velocity, etc.) is important to predict the solar energy production potential of a given geographical area. In this sense, the present work compares two computational tools that are capable of estimating the energy generation of a photovoltaic system through correlation analyses of solar radiation data: the PVsyst software and an algorithm based on the PVlib package implemented in MATLAB. To achieve this objective, it was necessary to obtain solar radiation data (measured and from a solarimetric database), analyze the decomposition of global solar irradiance into direct normal and horizontal diffuse components, and analyze the modeling of the devices of a photovoltaic system (solar modules and inverters) for energy production calculations. Simulated results were compared with experimental data in order to evaluate the performance of the studied methods. Errors in the estimation of energy production were less than 30% for the MATLAB algorithm and less than 20% for the PVsyst software.
Keywords: energy production, meteorological data, irradiance decomposition, solar photovoltaic system
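The decomposition step described in this abstract can be illustrated with the Erbs correlation, one of the standard separation models implemented in PVlib. The sketch below is illustrative only (Python rather than the MATLAB implementation the authors used), and the 600 W/m² input in the usage note is an assumed example value:

```python
import math

def erbs_decompose(ghi, zenith_deg, dni_extra=1367.0):
    """Split global horizontal irradiance (GHI, W/m^2) into direct normal
    (DNI) and diffuse horizontal (DHI) components via the Erbs correlation."""
    cos_z = math.cos(math.radians(zenith_deg))
    kt = ghi / (dni_extra * cos_z)  # clearness index
    if kt <= 0.22:
        df = 1.0 - 0.09 * kt
    elif kt <= 0.80:
        df = (0.9511 - 0.1604 * kt + 4.388 * kt**2
              - 16.638 * kt**3 + 12.336 * kt**4)
    else:
        df = 0.165
    dhi = df * ghi                  # diffuse fraction times GHI
    dni = (ghi - dhi) / cos_z       # closure: GHI = DHI + DNI*cos(zenith)
    return dni, dhi, kt
```

For example, `erbs_decompose(600.0, 30.0)` splits 600 W/m² of GHI at a 30° solar zenith into about 246 W/m² direct normal and 387 W/m² diffuse.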
Procedia PDF Downloads 142
27153 Social Media as an Interactive Learning Tool Applied to Faculty of Tourism and Hotels, Fayoum University
Authors: Islam Elsayed Hussein
Abstract:
The aim of this paper is to discover the impact of students’ attitudes towards social media and the skills required to adopt social media as a university e-learning 2.0 platform. In addition, it measures the effect of social media adoption on interactive learning effectiveness. The population of this study was students at the Faculty of Tourism and Hotels, Fayoum University. A questionnaire was used as a research instrument to collect data from respondents, who had been selected randomly. The data were analyzed using quantitative data analysis methods. The findings showed that the students have a positive attitude towards adopting social networking in the learning process and also have good skills for the effective use of social networking tools. In addition, adopting social media positively affects the interactive learning environment.
Keywords: attitude, skills, e-learning 2.0, interactive learning, Egypt
Procedia PDF Downloads 523
27152 Social and Educational AI for Diversity: Research on Democratic Values to Develop Artificial Intelligence Tools to Guarantee Access for all to Educational Tools and Public Services
Authors: Roberto Feltrero, Sara Osuna-Acedo
Abstract:
Responsible Research and Innovation has to accomplish one fundamental aim: everybody has to participate in the benefits of innovation, but innovation also has to be democratic; that is to say, everybody should have the possibility to participate in the decisions of the innovation process. In particular, a democratic and inclusive model of social participation and innovation includes persons with disabilities and people at risk of discrimination. Innovations in Artificial Intelligence for social development have to accomplish the same dual goal: improving equality of access to fields of public interest like education, training and public services, as well as improving civic and democratic participation in the process of developing such innovations for all. This research aims to develop innovations, policies and policy recommendations to apply and disseminate such an artificial intelligence and social model for making educational and administrative processes more accessible. First, it designs a citizen participation process to engage citizens in the design and use of artificial intelligence tools for public services. This will improve trust in democratic institutions, contributing to enhancing the transparency, effectiveness, accountability and legitimacy of public policy-making and allowing people to participate in the development of ethical standards for the use of such technologies. Second, it improves educational tools for lifelong learning with AI models to improve accountability and educational data management. Dissemination, education and social participation will be integrated, measured and evaluated in innovative educational processes to make accessible all the educational technologies and content developed on AI for responsible and social innovation. A particular case is presented regarding access for all to educational tools and public services.
This accessibility requires cognitive adaptability because legal or administrative language is often very complex, not only for people with cognitive disabilities but also for old people or citizens at risk of educational or social discrimination. Artificial intelligence natural language processing technologies can provide tools to translate legal, administrative or educational texts into a simpler language that is accessible to everybody. Despite technological advances in language processing and machine learning, this becomes a huge project if we really want to respect the ethical and legal consequences, because those consequences can only be addressed with civil and democratic engagement in two realms: 1) democratically selecting the texts that need to be, and can be, translated, and 2) involving citizens, experts and non-experts, in producing and validating real examples of legal texts with cognitive adaptations to feed the artificial intelligence algorithms that learn how to translate those texts into a simpler, more accessible language adapted to any kind of population.
Keywords: responsible research and innovation, AI social innovations, cognitive accessibility, public participation
Procedia PDF Downloads 88
27151 FPGA Implementation of RSA Encryption Algorithm for E-Passport Application
Authors: Khaled Shehata, Hanady Hussien, Sara Yehia
Abstract:
Securing the data stored on an e-passport is a very important issue, and the RSA encryption algorithm is suitable for such an application with a low data size. In this paper, the design and implementation of a 1024-bit-key RSA encryption and decryption module on an FPGA is presented. The module is verified by comparing its results with those obtained from MATLAB tools. The design runs at a frequency of 36.3 MHz on a Virtex-5 Xilinx FPGA. The key size is designed to be 1024 bits to achieve high security for the passport information. The whole design is achieved through VHDL design entry, which makes it portable and able to be directed to any hardware platform.
Keywords: RSA, VHDL, FPGA, modular multiplication, modular exponentiation
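The core of both RSA encryption and decryption is modular exponentiation. As an illustrative software sketch (not the paper's VHDL module), the square-and-multiply loop that such hardware designs typically implement can be written as:

```python
def modexp(base, exp, mod):
    """Compute base**exp % mod by binary (square-and-multiply)
    exponentiation -- the operation at the heart of RSA."""
    result = 1
    base %= mod
    while exp:
        if exp & 1:                     # multiply in the current square
            result = result * base % mod
        base = base * base % mod        # square for the next bit
        exp >>= 1
    return result

# Textbook toy key (p=61, q=53 -> n=3233, e=17, d=2753), far below
# the 1024-bit key size used in the paper:
ciphertext = modexp(65, 17, 3233)       # encrypt m = 65
plaintext = modexp(ciphertext, 2753, 3233)  # decrypt back to 65
```

The loop executes one squaring (and at most one multiplication) per key bit, which is why a 1024-bit key implies on the order of a thousand modular multiplications per exponentiation.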
Procedia PDF Downloads 389
27150 China's Soft Power and Its Strategy in West Asia
Authors: Iman Shabanzadeh
Abstract:
The economic growth and the special model of development in China have caused sensitivity in world public opinion regarding the nature of this growth and development. In this regard, the Chinese have tried to put an end to such alarming perceptions by using all the tools at their disposal, seeking to present a peaceful and cooperative image of themselves. One of the most important diplomatic tools that Beijing has used to reduce the concerns raised by the Threat Theory has been the use of soft power resources and tools in its development policies. This article begins by analyzing the concept of soft power and examining its foundations in international relations, and continues by examining the components of soft power in its Chinese version. The main purpose of the article is to determine the position of West Asia in China's soft power strategy and the resources China uses to achieve its goals in this region. In response to the main question, the paper's hypothesis is that soft power in its Chinese version differs significantly from Joseph Nye's original idea. In fact, the Chinese have imported the American version of soft power and adjusted, strengthened and, in other words, internalized it with their own abilities, capacities and political philosophy. Based on this, China's soft-power presence in West Asia can be traced in three areas. The first source of China's soft power in this region is cultural in nature and is realized through strategies such as the "use of educational tools and methods", "media methods" and the "tourism industry". The second source is political soft power, which is applied through the policy of "balance of influence" and the policy of "mediation", relying on the "ideological foundations of Confucianism". The third source is China's economic soft power, realized through three tools: "energy exchanges", "foreign investments" and the "Belt and Road Initiative".
The research method of this article is descriptive-analytical.
Keywords: soft power, cooperative power, china, west asia
Procedia PDF Downloads 58
27149 Interfacing and Replication of Electronic Machinery Using MATLAB/SIMULINK
Authors: Abdulatif Abdulsalam, Mohamed Shaban
Abstract:
This paper introduces the interfacing and simulation of electronic tools based on the MATLAB/Simulink simulation package. The simulated components include DC-DC converters, power-factor rectifiers, induction machines, DC machines, synchronous machines, and more complete systems. The power-factor rectifier model includes solid-state device models. The tools provide a clear-cut structure for the simulation of complex dynamic systems connected with power electronic machines.
Keywords: power electronics, machine, MATLAB, simulink
Procedia PDF Downloads 357
27148 TACTICAL: Ram Image Retrieval in Linux Using Protected Mode Architecture’s Paging Technique
Authors: Sedat Aktas, Egemen Ulusoy, Remzi Yildirim
Abstract:
This article explains how to take a RAM image from a computer with a Linux operating system and the steps that should be followed while taking it. What we mean by taking a RAM image is the process of instantly dumping the physical memory and writing it to a file; this can be likened to taking a picture of everything in the computer’s memory at that moment. The process is very important for tools that analyze RAM images, such as Volatility, because before these tools can analyze RAM, images must be taken. Such tools are used extensively in the forensic world. Forensics, in turn, is a set of processes for digitally examining the information on any computer or server on behalf of official authorities. In this article, the protected mode architecture in the Linux operating system is examined, and the way to save an image sample of the kernel driver and system memory to disk is followed. The tables and access methods to be used in the operating system are examined based on its basic architecture, and the most appropriate methods and their application are presented. Since the literature contains no article on Linux directly related to this study, it aims to contribute to the literature on obtaining RAM images. LiME can be mentioned as a similar tool, but there is no explanation of that tool's memory-dumping method. Considering the frequency of use of these tools, the contribution of the study to the field of digital forensics has been its main motivation, given the intense work on RAM images in forensics.
Keywords: linux, paging, addressing, ram-image, memory dumping, kernel modules, forensic
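A memory-dump tool must first learn which physical address ranges actually hold RAM; on Linux these appear as "System RAM" entries in /proc/iomem. The following sketch is an illustration of that enumeration step only (not the kernel-driver code the article describes), parsing iomem-style text with an assumed sample listing:

```python
import re

def system_ram_ranges(iomem_text):
    """Parse /proc/iomem-style output and return (start, end) physical
    address ranges labelled 'System RAM' -- the regions a dump tool
    must capture."""
    ranges = []
    for line in iomem_text.splitlines():
        m = re.match(r"\s*([0-9a-f]+)-([0-9a-f]+) : System RAM", line)
        if m:
            ranges.append((int(m.group(1), 16), int(m.group(2), 16)))
    return ranges

# Assumed example listing in /proc/iomem format:
sample = """\
00001000-0009ffff : System RAM
000a0000-000fffff : Reserved
00100000-7fffffff : System RAM
"""
print(system_ram_ranges(sample))
```

Reading /proc/iomem itself requires root on modern kernels; a real dump tool such as LiME does this enumeration inside a kernel module before writing each range to the output file.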
Procedia PDF Downloads 117
27147 Intelligent Production Machine
Authors: A. Şahinoğlu, R. Gürbüz, A. Güllü, M. Karhan
Abstract:
This study aims for production machines to automatically perceive cutting data and alter cutting parameters. The two most important parameters to be checked in the machine control unit are the feed rate and the spindle speed, and these parameters are to be controlled through the sounds of the machine. The features of the optimum sound are introduced to a computer. During the process, real-time data are received and converted into numerical values by Matlab software; according to these, the feed and speed decrease or increase at a certain rate, and thus the optimum sound is acquired, so that the cutting process is carried out at the optimum cutting parameters. During chip removal, the features of the cutting tools, the kind of cut material, the cutting parameters and the machine used affect various quantities. Instead of measuring quantities such as temperature, vibration and tool wear that emerge during the cutting process, detailed analysis of the sound emitted during cutting provides detection of various data included in the cutting process in a much easier and more economical way. The relation between cutting parameters and sound is being identified.
Keywords: cutting process, sound processing, intelligent lathe, sound analysis
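As a hedged illustration of the control loop described here (the function names, gain and limits below are assumptions for the sketch, not the authors' implementation), a sound-level-driven feed adjustment might look like:

```python
import math

def frame_rms(samples):
    """RMS amplitude of one audio frame -- a simple loudness feature."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def adjust_feed(feed, rms, target_rms, gain=0.1, lo=50.0, hi=500.0):
    """Nudge the feed rate toward the level that produces the optimum
    cutting sound: raise feed when the cut is quieter than the target,
    lower it when louder, and clamp to the machine's limits."""
    feed *= 1.0 + gain * (target_rms - rms) / target_rms
    return max(lo, min(hi, feed))
```

For example, a frame that is louder than the target (RMS 0.8 against a target of 0.5) reduces a 200 mm/min feed to 188 mm/min; repeated every frame, this drives the process toward the optimum sound.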
Procedia PDF Downloads 334
27146 Using Crowd-Sourced Data to Assess Safety in Developing Countries: The Case Study of Eastern Cairo, Egypt
Authors: Mahmoud Ahmed Farrag, Ali Zain Elabdeen Heikal, Mohamed Shawky Ahmed, Ahmed Osama Amer
Abstract:
Crowd-sourced data refers to data that is collected and shared by a large number of individuals or organizations, often through the use of digital technologies such as mobile devices and social media. The shortage of crash data collection in developing countries makes it difficult to fully understand and address road safety issues in these regions. In developing countries, crowd-sourced data can be a valuable tool for improving road safety, particularly in urban areas, where the majority of road crashes occur. This study is, to the best of our knowledge, the first to develop safety performance functions using crowd-sourced data by adopting a negative binomial structure model and the Full Bayes model to investigate traffic safety for urban road networks and provide insights into the impact of roadway characteristics. Furthermore, as part of the safety management process, network screening was carried out by applying two different methods to rank the most hazardous road segments: the PCR method (adopted in the Highway Capacity Manual, HCM) and a graphical method using GIS tools, for comparison and validation. Lastly, recommendations are suggested for policymakers to ensure safer roads.
Keywords: crowdsourced data, road crashes, safety performance functions, Full Bayes models, network screening
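For context, a safety performance function of the common form N = exp(b0) · AADT^b1 · L^b2 can feed network screening directly by ranking segments on excess (observed minus predicted) crashes. The coefficients and segment data below are purely illustrative placeholders, not fitted values from this study:

```python
import math

# Hypothetical SPF coefficients, for illustration only (not fitted values):
B0, B1, B2 = -7.2, 0.85, 1.0

def predicted_crashes(aadt, length_km):
    """Typical SPF functional form: N = exp(b0) * AADT^b1 * L^b2."""
    return math.exp(B0) * aadt ** B1 * length_km ** B2

segments = [  # (id, AADT, length in km, observed crashes) -- made-up data
    ("S1", 20000, 1.2, 14),
    ("S2", 8000, 0.8, 9),
    ("S3", 35000, 2.0, 21),
]

# Screen the network: rank by excess crashes (observed minus predicted).
ranked = sorted(segments,
                key=lambda s: s[3] - predicted_crashes(s[1], s[2]),
                reverse=True)
print([s[0] for s in ranked])
```

With these made-up numbers the ranking comes out S3, S1, S2; in practice the coefficients would come from the fitted negative binomial model, and an empirical Bayes adjustment is often applied before ranking.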
Procedia PDF Downloads 52
27145 Incorporating Anomaly Detection in a Digital Twin Scenario Using Symbolic Regression
Authors: Manuel Alves, Angelica Reis, Armindo Lobo, Valdemar Leiras
Abstract:
In Industry 4.0 settings, it is common to have a lot of sensor data, and in this deluge of data, hints of possible problems are difficult to spot. The digital twin concept aims to help answer this problem, but it is mainly used as a monitoring tool to handle the visualisation of data. Failure detection is of paramount importance in any industry, and it consumes a lot of resources; any improvement in this regard is of tangible value to the organisation. The aim of this paper is to add the ability to forecast test failures, curtailing detection times. To achieve this, several anomaly detection algorithms were compared with a symbolic regression approach. To this end, Isolation Forest, One-Class SVM and an auto-encoder were explored. For symbolic regression, the PySR library was used. The first results show that this approach is valid and can be added to the tools available in this context as a low-resource anomaly detection method since, after training, the only requirement is the evaluation of a polynomial, a useful feature in the digital twin context.
Keywords: anomaly detection, digital twin, industry 4.0, symbolic regression
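One of the baseline detectors compared in the paper, Isolation Forest, can be sketched in a few lines of scikit-learn. The synthetic sensor data below is an assumption for illustration, not the study's dataset:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Normal sensor readings cluster near the origin; append one faulty
# reading far away to play the role of an anomaly.
normal = rng.normal(0.0, 0.5, size=(200, 2))
X = np.vstack([normal, [[8.0, 8.0]]])

clf = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = clf.predict(X)  # +1 = normal, -1 = anomaly
print(labels[-1])        # label assigned to the injected fault
```

The injected point at (8, 8) is flagged as −1. The contrast the paper draws is that, after training, Isolation Forest still requires traversing an ensemble of trees per sample, whereas the symbolic regression approach reduces inference to evaluating a single polynomial.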
Procedia PDF Downloads 120
27144 The Capabilities of New Communication Devices in Development of Informing: Case Study Mobile Functions in Iran
Authors: Mohsen Shakerinejad
Abstract:
Due to the growing momentum of technology, the present age is called the age of communication and information. With the astounding progress of communication and information tools, the current world is likened to a "global village" in which a message can be sent from one point of the world to another in less than a minute. However, one of the newer sociologists, Alain Touraine, in describing the destructive effects of the changes arising from the development of information appliances, refers to "new fields for undemocratic social control and the incidence of acute social and political tensions and unrest". Yet, in this era, in which the life of people has become industrial along with the advancement of industry, quick and accurate data transfer breathes new life into the body of society, and various tools should be used according to the features of each society and the progress of science and technology. One of these communication tools is the mobile phone. The cellular phone, as the communication and telecommunication revolution of recent years, has had a great influence on the individual and collective life of societies. This powerful communication tool has had an undeniable effect on all aspects of life, including the social, economic, cultural and scientific, so that ignoring it in the design, implementation and enforcement of any system is not wise. Nowadays, knowledge and information are among the most important aspects of human life. Therefore, this article tries to introduce the potential of the mobile phone for receiving and transmitting news and information. As follows, among the numerous capabilities of current mobile phones, features such as sending text, photography, sound recording, filming and internet connectivity indicate the potential of this communication medium in the process of sending and receiving information, so that nowadays mobile journalism, as an important component of citizen journalism, has a unique role in information dissemination.
Keywords: mobile, informing, receiving information, mobile journalism, citizen journalism
Procedia PDF Downloads 410
27143 Application of Digital Technologies as Tools for Transformative Agricultural Science Instructional Delivery in Secondary Schools
Authors: Cajethan U. Ugwuoke
Abstract:
Agriculture is taught in secondary schools to develop skills in students which will empower them to contribute to national economic development. Unfortunately, our educational system emphasizes the application of conventional teaching methods in delivering instruction, which fails to produce students competent enough to carry out agricultural production. This study therefore examined the application of digital technologies as tools for transformative instructional delivery. Four specific purposes, research questions and hypotheses guided the study. The study adopted a descriptive survey research design in which 80 subjects, comprising 64 teachers of agriculture and 16 principals in the Udenu local government area of Enugu State, Nigeria, participated. A structured questionnaire was used to collect data. The assumption of normality was ascertained by subjecting the data collected to a normality test. The data were then subjected to mean, Pearson product-moment correlation, ANOVA and t-test statistics to answer the research questions and test the hypotheses at a 5% significance level. The results show that the application of digital technologies helps to reduce learners’ boredom (3.52.75), improves learners’ performance (3.63.51), and serves as a visual aid for learners (3.56.61), among others. There was a positive, strong and significant relationship between the application of digital technologies and effective instructional delivery (+.895, p=.001<.05, F=17.73), between the competency of teachers in the application of digital technologies and effective instructional delivery (+.998, p=.001<.05, F=16263.45), and between the frequency of application of digital technologies and effective instructional delivery (+.999, p=.001<.05, F=31436.14).
There was no evidence of autocorrelation or multicollinearity in the regression models between the application of digital technologies and effective instructional delivery (2.03, Tolerance=1.00, VIF=1.00), the competency of teachers in the application of digital technologies and effective instructional delivery (2.38, Tolerance=1.00, VIF=1.00), and the frequency of application of digital technologies and effective instructional delivery (2.00, Tolerance=1.00, VIF=1.00). Digital technologies should therefore be applied in teaching to facilitate effective instructional delivery in agriculture.
Keywords: agricultural science, digital technologies, instructional delivery, learning
Procedia PDF Downloads 72
27142 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning
Authors: Arun Sanjel, Greg Speegle
Abstract:
Every second, massive amounts of data are generated, and Data Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing them. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can be time-consuming and prone to errors, so the automated conversion of a sequential program to a DISC program would significantly improve productivity. However, synthesizing a user’s intended program from an input specification is complex, with several important applications such as distributed program synthesis and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique that identifies sequential components and translates them to equivalent distributed operations, emphasizing the use of reinforcement learning and unit testing as feedback mechanisms to achieve our objectives.
Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC
Procedia PDF Downloads 106
27141 Social Networking Application: What Is Their Quality and How Can They Be Adopted in Open Distance Learning Environments?
Authors: Asteria Nsamba
Abstract:
Social networking applications and tools have become compelling platforms for generating and sharing knowledge across the world. They comprise a variety of social media platforms, including Facebook, Twitter, WhatsApp, blogs and wikis. The most popular of these platforms is Facebook, with 2.41 billion monthly active users, followed by WhatsApp with 1.6 billion users and Twitter with 330 million users. These communication platforms have not only impacted social lives but have also impacted students’ learning across the different delivery modes in higher education: distance, conventional and blended. With this amount of interest in these platforms, knowledge sharing has gained importance within the context in which it is required. In open distance learning (ODL) contexts, social networking platforms can offer students and teachers a platform on which to create and share knowledge and form learning collaborations; thus, they can serve as support mechanisms to increase interaction and reduce the isolation and loneliness inherent in ODL. Despite this potential and opportunity, research indicates that many ODL teachers are not inclined to use social media tools in learning. Although it is unclear why these tools are uncommon in these environments, concerns raised in the literature indicate that many teachers have not mastered the art of teaching with technology. Using technological pedagogical content knowledge (TPACK), product quality theory and Bloom’s Taxonomy as lenses, this paper firstly assesses the quality of three social media applications, Facebook, Twitter and WhatsApp, to determine the extent to which they are suitable platforms for teaching and learning in terms of content generation, information sharing and learning collaboration.
Secondly, the paper demonstrates the application of teaching, learning and assessment using Bloom’s Taxonomy.
Keywords: distance education, quality, social networking tools, TPACK
Procedia PDF Downloads 124
27140 Predicting and Obtaining New Solvates of Curcumin, Demethoxycurcumin and Bisdemethoxycurcumin Based on the CCDC Statistical Tools and Hansen Solubility Parameters
Authors: J. Ticona Chambi, E. A. De Almeida, C. A. Andrade Raymundo Gaiotto, A. M. Do Espírito Santo, L. Infantes, S. L. Cuffini
Abstract:
The solubility of active pharmaceutical ingredients (APIs) is a challenge for the pharmaceutical industry. New multicomponent crystalline forms such as cocrystals and solvates present an opportunity to improve the solubility of APIs. Commonly, the procedure to obtain multicomponent crystalline forms of a drug starts by screening the drug molecule with different coformers/solvents. However, it is necessary to develop methods that obtain multicomponent forms efficiently and with the least possible environmental impact. The Hansen Solubility Parameters (HSPs) are considered a tool for gaining theoretical knowledge of the solubility of the target compound in a chosen solvent. H-Bond Propensity (HBP), Molecular Complementarity (MC) and Coordination Values (CV) are tools for the statistical prediction of cocrystals developed by the Cambridge Crystallographic Data Centre (CCDC). The HSPs and the CCDC tools are based on inter- and intra-molecular interactions. Curcumin (Cur), the target molecule, is commonly used as an anti-inflammatory. Demethoxycurcumin (Demcur) and bisdemethoxycurcumin (Biscur) are natural analogues of Cur from turmeric. These target molecules differ in their solubilities. The work therefore aimed to analyze and compare different tools for predicting multicomponent forms (solvates) of Cur, Demcur and Biscur. The HSP values were calculated for Cur, Demcur and Biscur using chemical group contribution methods and statistical optimization from experimental data, with the HSPmol software. From the HSPs of the target molecules and fifty solvents (listed in the HSP books), the relative energy difference (RED) was determined. The probability that the target molecules would interact with each solvent molecule was determined using the CCDC tools. A dataset of fifty organic solvent molecules was ranked for each prediction method and by a consensus ranking of different combinations of the HSP, CV, HBP and MC values.
Based on the predictions, 15 solvents were selected, including dimethyl sulfoxide (DMSO), tetrahydrofuran (THF), acetonitrile (ACN) and 1,4-dioxane (DOX). As a starting point, the slow evaporation technique, from 50°C to room temperature and at 4°C, was used to obtain solvates. The single crystals were collected using a Bruker D8 Venture diffractometer with a Photon100 detector. Data processing and crystal structure determination were performed using the APEX3 and Olex2-1.5 software. According to the results, the HSPs (theoretical and optimized) and the Hansen solubility spheres for Cur, Demcur and Biscur were obtained. With respect to the prediction analyses, one way to evaluate a prediction method was through the ranking and consensus-ranking positions of solvates already reported in the literature. The combination of HSP-CV obtained the best results compared to the other methods. Furthermore, from the selected solvents, six new solvates, Cur-DOX, Cur-DMSO, Biscur-DOX, Biscur-THF, Demcur-DOX and Demcur-ACN, plus a new Biscur hydrate, were obtained. Crystal structures were determined for Cur-DOX, Biscur-DOX, Demcur-DOX and Biscur-water. Moreover, the unit-cell parameters for Cur-DMSO, Biscur-THF and Demcur-ACN were obtained. These preliminary results show that the prediction method is a promising strategy for evaluating the possibility of forming multicomponent crystals. Work is ongoing to obtain further multicomponent single crystals. Keywords: curcumin, HSPs, prediction, solvates, solubility
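The RED-ranking step described above can be sketched with the standard Hansen distance formula, RED = Ra/R0 with Ra² = 4(δD1−δD2)² + (δP1−δP2)² + (δH1−δH2)². The parameter values and the interaction radius below are illustrative placeholders, not the study's data:

```python
import math

def red(solute, solvent, r0):
    """Relative Energy Difference: RED = Ra / R0, where
    Ra^2 = 4*(dD1-dD2)^2 + (dP1-dP2)^2 + (dH1-dH2)^2."""
    dd = solute[0] - solvent[0]
    dp = solute[1] - solvent[1]
    dh = solute[2] - solvent[2]
    ra = math.sqrt(4 * dd ** 2 + dp ** 2 + dh ** 2)
    return ra / r0

# Illustrative (assumed) parameters in MPa^0.5: (dD, dP, dH).
curcumin = (19.4, 9.8, 11.2)          # hypothetical HSP values
solvents = {
    "DMSO": (18.4, 16.4, 10.2),
    "THF": (16.8, 5.7, 8.0),
    "ACN": (15.3, 18.0, 6.1),
    "1,4-Dioxane": (17.5, 1.8, 9.0),
}
R0 = 8.0                              # assumed interaction radius

# Rank candidate solvents by increasing RED (lower = more compatible).
ranking = sorted(solvents, key=lambda s: red(curcumin, solvents[s], R0))
```

A RED below 1 places the solvent inside the solute's Hansen sphere, marking it a likely candidate; this is how a fifty-solvent list can be ranked before any laboratory screening.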
Procedia PDF Downloads 63
27139 LORA: A Learning Outcome Modelling Approach for Higher Education
Authors: Aqeel Zeid, Hasna Anees, Mohamed Adheeb, Mohamed Rifan, Kalpani Manathunga
Abstract:
To achieve constructive alignment in a higher education program, a clear set of learning outcomes must be defined. Traditional learning outcome definition techniques such as Bloom’s taxonomy are not written to be utilized by the student. This may be disadvantageous in student-centric learning settings, where students are expected to formulate their own learning strategies. To solve this problem, we propose the learning outcome relation and aggregation (LORA) model. To achieve alignment, we developed learning outcome, assessment and resource authoring tools that help teachers tag learning outcomes during creation. A pilot study was conducted with an expert panel of experienced professionals in the education domain to evaluate whether the LORA model and tools present an improvement over traditional methods. The panel unanimously agreed that the model and tools are beneficial and effective. Moreover, they helped the panel model learning outcomes in a more student-centric and descriptive way. Keywords: learning design, constructive alignment, Bloom’s taxonomy, learning outcome modelling
Procedia PDF Downloads 186
27138 Processing Big Data: An Approach Using Feature Selection
Authors: Nikat Parveen, M. Ananthi
Abstract:
Big data is an emerging technology in which data are collected from various sensors and used in many fields. Data retrieval is a major issue, as there is a need to extract exactly the data required. In this paper, a large data set is processed using feature selection. Feature selection helps to choose only the data actually needed to process and execute the task. The key value is what points to the exact data available in the storage space. Here, the available data are streamed, and R-Center is proposed to achieve this task. Keywords: big data, key value, feature selection, retrieval, performance
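As a minimal illustration of the feature-selection idea (a generic variance filter, not the paper's R-Center method), a sketch that drops near-constant sensor columns before further processing:

```python
def variance(col):
    """Population variance of one column of readings."""
    m = sum(col) / len(col)
    return sum((x - m) ** 2 for x in col) / len(col)

def select_features(rows, threshold):
    """Keep the indices of columns whose variance exceeds `threshold`;
    near-constant columns carry little information for the task."""
    cols = list(zip(*rows))
    return [i for i, c in enumerate(cols) if variance(c) > threshold]

# Toy sensor records: column 1 is almost constant and gets dropped.
data = [
    [0.9, 1.0, 10.2],
    [0.1, 1.0, 13.7],
    [0.4, 1.1, 9.8],
    [0.7, 1.0, 12.5],
]
kept = select_features(data, threshold=0.01)
```

Only the retained columns would then be streamed to the downstream task, shrinking the volume that has to be processed.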
Procedia PDF Downloads 341
27137 [Keynote Talk]: From Clinical Practice to Academic Setup, 'Quality Circles' for Quality Outputs in Both
Authors: Vandita Mishra
Abstract:
From managing patients, reception, records and assistants in a clinical practice, to managing ongoing research, clinical cases and the department profile in an academic setup, the healthcare provider has to deal with all of it. Victory lies in the smooth running of the show in both situations, with apt solutions to the problems encountered and smooth management of the crises faced. This paper therefore amalgamates dental science with health administration by introducing a concept for practice management and problem solving called 'Quality Circles'. This concept uses various problem-solving tools given by experts from different fields. QC tools can be applied in both clinical and academic settings in dentistry for better productivity and for scientifically approaching the process of continuous improvement in both categories. When approached through QC, our organization showed better patient outcomes and greater patient satisfaction. Introduced in 1962 by Kaoru Ishikawa, this tool has been extensively applied in certain fields outside dentistry and healthcare. Through the exemplification of some clinical cases and virtual scenarios, the tools of Quality Circles will be elaborated and discussed. Keywords: academics, dentistry, healthcare, quality
Procedia PDF Downloads 101
27136 New Tools and New Ways; Changing the Nature of Leadership and Future Challenges
Authors: Harun Ozdemirci
Abstract:
Complexity and chaos are the characteristics of our new world today. In both the business and governmental sectors, the inner and outer environments are changing in all aspects. To guide organizations accurately and effectively, leaders must also change their attitudes towards this changing world. We need new tools, new mindsets and new views for the new century. Every leader has to operate with a creative and innovative way of thinking. But how will this occur, and in which direction will it be managed or directed? What kinds of abilities and attitudes make a leader compatible with this ever-changing and ambiguous environment? The leader who will lead in the future must have some special skills. But how can we develop these skills and behaviours? What must be the mindset of a future leader? This paper searches for answers to some of these questions, though asking the questions is more important than answering them. Innovation and creativity have been at the centerpiece of our lives for some years, but we do not yet know how to manage or tackle the challenges that come with this new situation. This new world order compels us to take new positions with respect to new employees who have different types of lives and habits, new productivity processes, new adversaries… The future environment will not be the same as the one we experienced before, so our responses to it cannot be the same as those our predecessors gave. We have to innovate new ways of thinking and new tools for solving new types of problems. Keywords: innovation, creativity, leader, future, liberal arts
Procedia PDF Downloads 272
27135 Combination of Artificial Neural Network Model and Geographic Information System for Prediction Water Quality
Authors: Sirilak Areerachakul
Abstract:
Water quality has initiated serious management efforts in many countries. Artificial neural network (ANN) models are developed as forecasting tools for predicting water quality trends based on historical data. This study endeavors to classify water quality automatically. The water quality classes are evaluated using six factor indices: pH value (pH), dissolved oxygen (DO), biochemical oxygen demand (BOD), nitrate nitrogen (NO3N), ammonia nitrogen (NH3N) and total coliform (T-Coliform). The methodology involves applying data mining techniques using multilayer perceptron (MLP) neural network models. The data consist of 11 sites on the Saen Saep canal in Bangkok, Thailand, obtained from the Department of Drainage and Sewerage, Bangkok Metropolitan Administration, during 2007-2011. The multilayer perceptron neural network achieves a high accuracy rate of 94.23% in classifying the water quality of the Saen Saep canal. Subsequently, this encouraging result could be combined with GIS data to improve the classification accuracy significantly. Keywords: artificial neural network, geographic information system, water quality, computer science
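The classification step can be sketched as a single-hidden-layer perceptron forward pass over the six scaled indices; the weights, biases, and sample vectors below are illustrative placeholders, not the trained Saen Saep model:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def mlp_classify(x, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of a single-hidden-layer perceptron; returns the
    index of the highest-scoring water-quality class."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(w_hidden, b_hidden)]
    scores = [sum(w * hi for w, hi in zip(row, h)) + b
              for row, b in zip(w_out, b_out)]
    return scores.index(max(scores))

# Six inputs: pH, DO, BOD, NO3N, NH3N, T-Coliform, scaled to [0, 1].
# The weights below are illustrative placeholders, not trained values.
w_hidden = [[0.8, 1.2, -1.5, -0.6, -0.9, -1.1],
            [-0.4, -1.0, 1.3, 0.7, 1.1, 0.9]]
b_hidden = [0.1, -0.2]
w_out = [[1.5, -1.2], [-1.3, 1.4]]   # classes: 0 = good, 1 = poor
b_out = [0.0, 0.0]

clean = [0.7, 0.9, 0.1, 0.1, 0.05, 0.1]
polluted = [0.4, 0.2, 0.9, 0.6, 0.8, 0.9]
pred_clean = mlp_classify(clean, w_hidden, b_hidden, w_out, b_out)
pred_polluted = mlp_classify(polluted, w_hidden, b_hidden, w_out, b_out)
```

In practice the weights come from backpropagation over the historical records, and the predicted class can then be joined to the GIS layer for each monitoring site.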
Procedia PDF Downloads 343
27134 Tool Condition Monitoring of Ceramic Inserted Tools in High Speed Machining through Image Processing
Authors: Javier A. Dominguez Caballero, Graeme A. Manson, Matthew B. Marshall
Abstract:
Cutting tools with ceramic inserts are often used in machining many types of superalloy, mainly due to their high strength and thermal resistance. Nevertheless, during the cutting process, the plastic flow wear generated in these inserts enhances and propagates cracks due to high temperature and high mechanical stress, leading to highly variable failure of the cutting tool. This article explores the relationship between the continuous wear that ceramic SiAlON inserts (solid solutions based on the Si3N4 structure) experience during a high-speed machining process and the evolution of the sparks created during that process. The sparks were analysed through pictures of the cutting process recorded using an SLR camera. Features relating to the intensity and area of the cutting sparks were extracted from the individual pictures using image processing techniques and then related to the ceramic insert’s crater wear area. Keywords: ceramic cutting tools, high speed machining, image processing, tool condition monitoring, tool wear
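The spark-feature extraction can be sketched as a threshold-and-measure pass over a grayscale frame; the threshold value and the toy 4x4 frame below are assumptions, and a real pipeline would operate on the SLR images:

```python
def spark_features(image, threshold=200):
    """Binarise a grayscale frame and measure the spark area (count of
    pixels at or above the threshold) and their mean intensity."""
    bright = [p for row in image for p in row if p >= threshold]
    area = len(bright)
    mean_intensity = sum(bright) / area if area else 0.0
    return area, mean_intensity

# Toy 4x4 frame; high-valued pixels stand in for cutting sparks.
frame = [
    [10, 12, 250, 255],
    [11, 240, 255, 30],
    [9, 10, 12, 11],
    [8, 220, 10, 9],
]
area, intensity = spark_features(frame)
```

Tracking these two numbers frame by frame gives the time series that can then be correlated with the measured crater wear area.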
Procedia PDF Downloads 298
27133 R Data Science for Technology Management
Authors: Sunghae Jun
Abstract:
Technology management (TM) is an important issue for a company seeking to improve its competitiveness. Among the many activities of TM, technology analysis (TA) is an important factor, because most decisions in the management of technology are based on the results of TA. TA analyzes the development of a target technology using statistics or the Delphi method. TA based on Delphi depends on experts’ domain knowledge; in comparison, TA by statistics and machine learning algorithms uses objective data such as patents or papers instead of expert knowledge. Many quantitative TA methods based on statistics and machine learning have been studied and used for technology forecasting, technological innovation and the management of technology. They have applied diverse computing tools and analytical methods case by case, and it is not easy to select suitable software and statistical methods for a given TA task. So, in this paper, we propose a methodology for quantitative TA using the statistical computing software R and data science to construct a general framework for TA. Through a case study, we also show how our methodology is applied in a real setting. This research contributes to R&D planning and technology valuation in TM areas. Keywords: technology management, R system, R data science, statistics, machine learning
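Although the paper's framework is built in R, the first step of such quantitative TA, turning patent or paper text into a document-term matrix, is language-agnostic and can be sketched briefly (shown here in Python; the abstracts and vocabulary are invented examples):

```python
def keyword_matrix(docs, vocab):
    """Document-term counts: one row per patent abstract, one column
    per technology keyword, the usual input to quantitative TA models."""
    return [[doc.lower().split().count(w) for w in vocab] for doc in docs]

# Invented patent abstracts and keyword vocabulary for illustration.
patents = [
    "battery electrode coating for battery life",
    "neural network controller for battery management",
]
vocab = ["battery", "neural", "coating"]
matrix = keyword_matrix(patents, vocab)
```

The resulting matrix is what statistical or machine-learning models consume for forecasting and valuation, whatever the computing environment.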
Procedia PDF Downloads 457
27132 Automatic Tagging and Accuracy in Assamese Text Data
Authors: Chayanika Hazarika Bordoloi
Abstract:
This paper is an attempt to work on Assamese, a highly inflectional language and one of the national languages of India, for which very little computational research has been carried out. Building a language processing tool for a natural language is not straightforward, as the standards and language representation change at various levels. This paper presents the inflectional suffixes of Assamese verbs and shows how statistical tools, along with linguistic features, can improve tagging accuracy. A conditional random fields tool (CRF tool) was used to automatically tag and train on the text data; accuracy improved after linguistic features were fed into the training data. Because Assamese is highly inflectional, standardizing its morphology is challenging. Inflectional suffixes are used as a feature of the text data. In order to analyze the inflections of Assamese word forms, a list comprising all possible suffixes that the various categories can take was prepared. Assamese words can be classified into inflected classes (noun, pronoun, adjective and verb) and un-inflected classes (adverb and particle). The mixed corpus used for this morphological analysis contains a large number of tokens and has given satisfactory accuracy. The accuracy rate of the tagger gradually improved with the modified training data. Keywords: CRF, morphology, tagging, tagset
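The suffix features fed to a CRF tagger can be sketched as a per-token feature function; the romanized suffix list and tokens below are illustrative placeholders, not the study's actual Assamese suffix list:

```python
def token_features(tokens, i, suffixes=("ise", "isil", "bo", "a")):
    """Feature dict for token i: surface form, neighbouring tokens, and
    the longest matching suffix from a hand-built suffix list (the
    suffixes here are illustrative placeholders)."""
    tok = tokens[i]
    feats = {"word": tok,
             "prev": tokens[i - 1] if i > 0 else "<S>",
             "next": tokens[i + 1] if i < len(tokens) - 1 else "</S>"}
    matches = [s for s in suffixes if tok.endswith(s)]
    if matches:
        feats["suffix"] = max(matches, key=len)
    return feats

feats = token_features(["si", "gaise"], 1)
```

Adding the `suffix` key is the "linguistic feature" step: a CRF trained on these dicts can generalize across unseen verb forms that share an inflectional ending.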
Procedia PDF Downloads 194
27131 Product Feature Modelling for Integrating Product Design and Assembly Process Planning
Authors: Baha Hasan, Jan Wikander
Abstract:
This paper describes part of the work on integrating the assembly design and assembly process planning (APP) domains. In its first stage, the work is based on modelling assembly features to support APP. A multi-layer architecture, based on feature-based modelling, is proposed to establish a dynamic and adaptable link between product design using CAD tools and APP. The proposed approach derives “specific function” features from the “generic” assembly and form features extracted from the CAD tools. A hierarchical structure from “generic” to “specific” and from “high-level geometrical entities” to “low-level geometrical entities” is proposed in order to map the geometrical and assembly data extracted from geometrical and assembly modelers to the required processes and resources in APP. The feature concept, feature-based modelling, and feature recognition techniques are reviewed. Keywords: assembly feature, assembly process planning, feature, feature-based modelling, form feature, ontology
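The "generic to specific" layering can be sketched as a small class hierarchy; the class names, attributes, and example values are assumptions for illustration, not the paper's actual ontology:

```python
class FormFeature:
    """Low-level geometrical entity extracted from the CAD model."""
    def __init__(self, name, geometry):
        self.name = name
        self.geometry = geometry

class AssemblyFeature(FormFeature):
    """Generic assembly-level feature built on a form feature."""
    def __init__(self, name, geometry, mating_direction):
        super().__init__(name, geometry)
        self.mating_direction = mating_direction

class SpecificFunctionFeature(AssemblyFeature):
    """Specific-function feature linking the geometry to an APP
    process and, from there, to the required resources."""
    def __init__(self, name, geometry, mating_direction, process):
        super().__init__(name, geometry, mating_direction)
        self.process = process

# A peg derived from a CAD cylinder, mated along -z by insertion.
peg = SpecificFunctionFeature("peg", "cylinder", "z-", "insertion")
```

Each layer only adds the data the next planning step needs, which is what keeps the CAD-to-APP link adaptable when the geometry changes.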
Procedia PDF Downloads 309
27130 Global Experiences in Dealing with Biological Epidemics with an Emphasis on COVID-19 Disease: Approaches and Strategies
Authors: Marziye Hadian, Alireza Jabbari
Abstract:
Background: The World Health Organization has identified COVID-19 as a public health emergency and is urging governments to stop the transmission of the virus by adopting appropriate policies. In this regard, authorities have taken different approaches to cutting the chain of transmission or controlling the spread of the disease. The questions we now face include: what are these approaches? What tools should be used to implement each preventive protocol? And what is the impact of each approach? Objective: The aim of this study was to determine the approaches to biological epidemics and the related prevention tools, with an emphasis on the COVID-19 disease. Data sources: Databases including the ISI Web of Science, PubMed, Scopus, Science Direct, Ovid, and ProQuest were employed for data extraction. Furthermore, authentic sources such as the WHO website, the published reports of relevant countries, and the Worldometer website were evaluated for grey literature. The time frame of the study was from 1 December 2019 to 30 May 2020. Methods: The present study was a systematic study of publications related to prevention strategies for the COVID-19 disease. The study was carried out based on the PRISMA guidelines, with CASP for articles and AACODS for grey literature. Results: The study findings showed that, to confront the COVID-19 epidemic, there are in general three approaches, "mitigation", "active control" and "suppression", and four strategies, "quarantine", "isolation", "social distance" and "lockdown", in both the individual and social dimensions. The selection and implementation of each approach requires specific strategies and has different effects on controlling and inhibiting the disease. Key finding: One possible approach to controlling the disease is to change individual behavior and lifestyle.
In addition to the prevention strategies, the use of masks, the observance of personal hygiene principles such as regular hand washing and keeping contaminated hands away from the face, and the observance of public health principles such as sneezing and coughing etiquette and the safe disposal of personal protective equipment must be strictly observed. Although these measures are not usually included in the category of prevention tools, they have a great impact on controlling an epidemic, especially the new coronavirus epidemic. Conclusion: Although the use of different approaches to control and inhibit biological epidemics depends on numerous variables, global experience suggests that some of these approaches are ineffective. Drawing on previous experience around the world, along with the current experiences of countries, can be very helpful in choosing the appropriate approach for each country in accordance with that country's characteristics, and can reduce the possible costs at the national and international levels. Keywords: novel corona virus, COVID-19, approaches, prevention tools, prevention strategies
Procedia PDF Downloads 126
27129 Toward Automatic Chest CT Image Segmentation
Authors: Angely Sim Jia Wun, Sasa Arsovski
Abstract:
Numerous studies have been conducted on the segmentation of medical images, and segmenting the lungs is one of the common research topics among them. Our research stemmed from the lack of solutions for automatic bone, airway, and vessel segmentation, despite the existence of multiple lung segmentation techniques. Consequently, currently available software tools used for medical image segmentation do not provide automatic lung, bone, airway, and vessel segmentation. This paper presents segmentation techniques along with an interactive software tool architecture for segmenting bone, lung, airway, and vessel tissues. Additionally, we propose a method for creating binary masks from automatically generated segments. The key contribution of our approach is the technique for automatic image thresholding using adjustable Hounsfield values and binary mask extraction. The generated binary masks can be used as a training dataset for deep-learning solutions in medical image segmentation. In this paper, we also examine the current software tools used for medical image segmentation, discuss our approach, and identify its advantages. Keywords: lung segmentation, binary masks, U-Net, medical software tools
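The adjustable Hounsfield thresholding can be sketched as a windowing pass that emits a binary mask; the window limits and the toy 3x3 slice below are assumed values, with the bone window set near the commonly cited "above roughly +300 HU" range:

```python
def hounsfield_mask(slice_hu, low, high):
    """Binary mask: 1 where the voxel's Hounsfield value falls inside
    the adjustable [low, high] window, 0 elsewhere."""
    return [[1 if low <= v <= high else 0 for v in row] for row in slice_hu]

# Toy 3x3 slice in Hounsfield units: air ~ -1000, soft tissue ~ 0-100,
# bone roughly above +300 (assumed window limits for illustration).
ct = [[-1000, 40, 400],
      [60, 700, -500],
      [1200, 20, 350]]
bone = hounsfield_mask(ct, 300, 3000)
```

Adjusting the window per tissue type (lung, airway, vessel, bone) yields one mask per tissue, and those masks are what can be collected into a training set for a U-Net-style model.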
Procedia PDF Downloads 98