Search results for: mixed method research
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 38684

31454 A Machine Learning Pipeline for Real-Time Activity Detection on Low Computational Power Devices for Metaverse Applications

Authors: Amit Kumar, Amanpreet Chander, Ashish Sahani

Abstract:

This paper presents our recent work on real-time human activity detection based on the MediaPipe pipeline and machine learning algorithms. The proposed system can detect human activities, including running, jumping, squatting, bending to the left or right, and standing still. This is a robust solution for developing yoga, dance, metaverse, and fitness applications that check pose correctness without requiring an additional monitor such as a personal trainer. MediaPipe offers an open-source, cross-platform solution that uses a two-step detector-tracker ML pipeline for live detection of key landmarks on the body, which can be used for motion data collection. Real-time pose prediction uses a variety of machine learning techniques and different types of analysis. Without relying primarily on powerful desktop environments for inference, our method achieves real-time performance on the majority of contemporary mobile phones, desktops/laptops, in Python, or even on the web. Experimental results show that our method outperforms existing methods in terms of accuracy and real-time capability, achieving an accuracy of 99.92% on the testing dataset.
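The landmark-to-activity step can be illustrated with a minimal, library-free sketch: infer a pose label from 2D joint coordinates using a knee-angle rule. The landmark points, angle threshold, and two-class rule below are illustrative assumptions, not the paper's actual model (which uses MediaPipe landmarks and trained ML classifiers).

```python
import math

def joint_angle(a, b, c):
    """Angle at vertex b (degrees) formed by points a-b-c, each an (x, y) landmark."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

def classify_pose(hip, knee, ankle, bent_threshold=120.0):
    """Toy rule: a knee angle well below straight (180 deg) suggests a squat."""
    angle = joint_angle(hip, knee, ankle)
    return "squatting" if angle < bent_threshold else "standing"
```

In a real pipeline, such joint angles (or raw landmark coordinates) would be the feature vector fed to a trained classifier rather than a hand-set threshold.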

Keywords: human activity detection, MediaPipe, machine learning, metaverse applications

Procedia PDF Downloads 159
31453 Psycholinguistic Analysis on Stuttering Treatment through Systemic Functional Grammar in Tom Hooper’s The King’s Speech

Authors: Nurvita Wijayanti

Abstract:

The movie The King's Speech is based on the true story of an English king who suffers from stuttering and of the treatment he receives from a therapist to reduce the frequency of his stuttering. The treatment uses a unique approach grounded in linguistic principles. This study shows how language works significantly in treating a person who stutters through a psychological approach; a linguistic analysis of the treatment activity is therefore carried out. Halliday's Systemic Functional Grammar is the main approach in this study, applied within a qualitative descriptive method. The study finds that the therapist, though using an orthodox approach, applies a psycholinguistic method to overcome the king's stuttering.

Keywords: psycholinguistics, stuttering, systemic functional grammar, treatment

Procedia PDF Downloads 235
31452 The Mediation Effect of Customer Satisfaction in the Relationship between Service Quality, Corporate Image to Customer Loyalty

Authors: Rizwan Ali, Hammad Zafar

Abstract:

The purpose of this research is to investigate the mediating effect of customer satisfaction in the relationship between service quality and corporate image and customer loyalty in the Pakistani banking sector. The population of this research is banking customers, with a sample size of 210 respondents. This research uses correlation, ANOVA, and regression analysis in SPSS, along with AMOS. Service quality and corporate image do not all affect customer loyalty directly; their effect must first pass through satisfaction. This means that banks must first understand customers' basic needs through service quality and corporate image, so that customers become loyal once satisfaction is achieved. The service quality provided by the banking industry needs to be improved in order to raise customer satisfaction and loyalty toward banking services, especially for banks in Pakistan.
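The mediation logic above (service quality and corporate image acting on loyalty through satisfaction) can be sketched as a Baron-Kenny style effect decomposition. This plain-Python OLS sketch is illustrative only; the study itself uses SPSS and AMOS, and the data values in the usage example are invented.

```python
def ols_slope(x, y):
    """Slope of y ~ x (with intercept), from centered sums."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

def mediation_effects(x, m, y):
    """Baron-Kenny style decomposition: total effect c (y ~ x), path a (m ~ x),
    and direct effect c' plus path b from the two-predictor regression
    y ~ x + m, solved via 2x2 normal equations on centered sums."""
    n = len(x)
    mx, mm, my = sum(x) / n, sum(m) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    smm = sum((mi - mm) ** 2 for mi in m)
    sxm = sum((xi - mx) * (mi - mm) for xi, mi in zip(x, m))
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    smy = sum((mi - mm) * (yi - my) for mi, yi in zip(m, y))
    det = sxx * smm - sxm * sxm
    c_direct = (sxy * smm - smy * sxm) / det  # effect of x holding m fixed
    b = (sxx * smy - sxm * sxy) / det         # mediator's effect on y
    a = ols_slope(x, m)                       # effect of x on mediator
    c_total = ols_slope(x, y)                 # total effect of x on y
    return {"total": c_total, "direct": c_direct, "indirect": a * b}
```

For OLS with intercepts, total = direct + indirect holds exactly, which is why partial mediation shows up as a nonzero indirect component.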

Keywords: customer loyalty, service quality, corporate image, customer satisfaction

Procedia PDF Downloads 89
31451 A Mixed Expert Evaluation System and Dynamic Interval-Valued Hesitant Fuzzy Selection Approach

Authors: Hossein Gitinavard, Mohammad Hossein Fazel Zarandi

Abstract:

In recent decades, concerns about environmental issues have led to professional and academic efforts on green supplier selection problems. One of the main issues in evaluating green supplier selection problems, which can increase uncertainty, is the preference expressed in the experts' judgments about the candidate green suppliers. Therefore, preparing an expert system to evaluate the problem based on historical data and the experts' knowledge is sensible. This study provides an expert evaluation system to assess the candidate green suppliers under selected criteria in a multi-period approach. In addition, a ranking approach under an interval-valued hesitant fuzzy set (IVHFS) environment is proposed to select the most appropriate green supplier over the planning horizon. In the proposed ranking approach, the IVHFS and the last aggregation approach are considered to reduce errors and to prevent data loss, respectively. A comparative analysis based on an illustrative example shows the feasibility of the proposed approach.
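As a toy illustration of ranking under interval-valued hesitant assessments, the sketch below scores each supplier by the mean midpoint of its assessment intervals and ranks by that score. This simple score is a stand-in for the paper's actual IVHFS operators with last aggregation, and the supplier data in the test are invented.

```python
def interval_score(intervals):
    """Mean midpoint of a set of interval-valued hesitant assessments [(l, u), ...].
    Each (l, u) is one expert's hesitant membership interval in [0, 1]."""
    return sum((l + u) / 2 for l, u in intervals) / len(intervals)

def rank_suppliers(assessments):
    """assessments: {supplier: [(l, u), ...]}; rank by descending score."""
    return sorted(assessments, key=lambda s: interval_score(assessments[s]),
                  reverse=True)
```

A full IVHFS method would instead aggregate over criteria and periods before defuzzifying, precisely to avoid the early information loss this shortcut incurs.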

Keywords: green supplier selection, expert system, ranking approach, interval-valued hesitant fuzzy setting

Procedia PDF Downloads 313
31450 Direct Approach in Modeling Particle Breakage Using Discrete Element Method

Authors: Ebrahim Ghasemi Ardi, Ai Bing Yu, Run Yu Yang

Abstract:

This study aimed to develop an in-house discrete element method (DEM) code and link it with a direct breakage event, so that particle breakage and the resulting fragment size distribution can be determined simultaneously with the DEM simulation. The particle breakage is applied directly inside the DEM computation algorithm: if any breakage happens, the original particle is replaced with its daughters. In this way, the calculation proceeds on an updated particle list, which closely resembles the real grinding environment. To validate the developed model, a grinding ball impacting an unconfined particle bed was simulated. Since considering an entire ball mill would be too computationally demanding, this method provided a simplified environment to test the model. Accordingly, a representative volume of the ball mill was simulated inside a box, which could emulate media (ball)–powder bed impacts in a ball mill and during particle bed impact tests. Mono, binary, and ternary particle beds were simulated to determine the effects of granular composition on breakage kinetics. The results obtained from the DEM simulations showed a reduction in the specific breakage rate for coarse particles in binary mixtures. The origin of this phenomenon, commonly known as cushioning or decelerated breakage in dry milling processes, was explained by the DEM simulations. Fine particles in a particle bed increase mechanical energy loss, and reduce and distribute interparticle forces, thereby inhibiting the breakage of the coarse component. On the other hand, the specific breakage rate of fine particles increased due to contacts associated with coarse particles. This phenomenon, known as acceleration, was shown to be less significant, but should be considered in future attempts to accurately quantify non-linear breakage kinetics in the modeling of dry milling processes.
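The replace-with-daughters update at the core of the method can be sketched as follows. The energy threshold, fragment count, and equal-mass splitting are illustrative assumptions; a real DEM breakage model would also assign fragment sizes, positions, and velocities.

```python
def maybe_break(particle, impact_energy, threshold=1.0, n_daughters=3):
    """If the impact energy exceeds a breakage threshold, replace the
    particle with equal-mass daughter fragments (mass is conserved)."""
    if impact_energy < threshold:
        return [particle]
    mass = particle["mass"] / n_daughters
    return [{"id": f"{particle['id']}.{k}", "mass": mass}
            for k in range(n_daughters)]

def step_breakage(particles, energies):
    """One update pass: rebuild the particle list, inserting any daughters,
    so the next DEM step runs on the updated population."""
    updated = []
    for p, e in zip(particles, energies):
        updated.extend(maybe_break(p, e))
    return updated
```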

Keywords: particle bed, breakage models, breakage kinetic, discrete element method

Procedia PDF Downloads 187
31449 Impact on the Yield of Flavonoid and Total Phenolic Content from Pomegranate Fruit by Different Extraction Methods

Authors: Udeshika Yapa Bandara, Chamindri Witharana, Preethi Soysa

Abstract:

Pomegranate fruits are used in cancer treatment in Ayurveda in Sri Lanka. Given the therapeutic effects of phytochemicals, this study focused on the anti-cancer properties of the constituents in the parts of the pomegranate fruit. Since the method of extraction is a crucial step in phytochemical analysis, this study also compared different extraction methods. Five techniques were applied to the peel and the pericarp to identify the most effective extraction method: boiling with an electric burner (BL), sonication (SN), microwaving (MC), heating in a 50°C water bath (WB), and sonication followed by microwaving (SN-MC). The polyphenolic and flavonoid contents were evaluated to identify the best extraction method for polyphenols. The total phenolic content was measured spectrophotometrically by the Folin-Ciocalteu method and expressed as gallic acid equivalents (w/w% GAE). Total flavonoid content was determined spectrophotometrically with the aluminium chloride colourimetric assay and expressed as quercetin equivalents (w/w% QE). Pomegranate juice was prepared as fermented juice (with Saccharomyces bayanus) and fresh juice. Powdered seeds were refluxed, filtered, and freeze-dried. 2 g of freeze-dried powder of each component was dissolved in 100 ml of de-ionized water for extraction. For the comparison of antioxidant activity and total phenol content, the polyphenols were removed with a polyvinylpolypyrrolidone (PVPP) column, and fermented and fresh juice were tested for 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging activity before and after the removal of polyphenols. For the peel samples of the pomegranate fruit, total phenol and flavonoid contents were highest with sonication (SN); in the pericarp, total phenol and flavonoid contents were also highest with sonication (SN).
A significant difference was observed (P < 0.05) in total phenol and flavonoid contents between the five extraction methods for both peel and pericarp samples. Fermented juice had greater polyphenolic and flavonoid contents than fresh juice. After removing the polyphenols of fermented and fresh juice using the polyvinylpolypyrrolidone (PVPP) column, low antioxidant activity was observed in the DPPH radical scavenging assay. Seeds had very low total phenol and flavonoid contents. Although pomegranate peel is the main waste component of the fruit, it has excellent polyphenolic and flavonoid contents compared to the other parts of the fruit, irrespective of the method of extraction. Polyphenols play a major role in antioxidant activity.
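The conversion from a Folin-Ciocalteu absorbance reading to w/w% GAE amounts to a linear calibration against gallic acid standards. In the sketch below, the standard-curve readings are invented, and the extract concentration of 20 mg/ml follows from the stated 2 g of freeze-dried powder in 100 ml.

```python
def linear_fit(x, y):
    """Least-squares slope and intercept of y ~ x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def total_phenolics_gae(abs_sample, std_conc, std_abs, extract_conc_mg_ml):
    """Convert a Folin-Ciocalteu absorbance into w/w% gallic acid equivalents.
    std_conc: gallic acid standards (mg/mL); extract_conc_mg_ml: dried
    extract per mL of the measured solution."""
    slope, intercept = linear_fit(std_conc, std_abs)
    gae_mg_ml = (abs_sample - intercept) / slope
    return 100.0 * gae_mg_ml / extract_conc_mg_ml  # w/w% GAE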

Keywords: antioxidant activity, flavonoids, polyphenols, pomegranate

Procedia PDF Downloads 148
31448 Tumor Size and Lymph Node Metastasis Detection in Colon Cancer Patients Using MR Images

Authors: Mohammadreza Hedyehzadeh, Mahdi Yousefi

Abstract:

Colon cancer is one of the most common cancers, and its prevalence is predicted to increase due to poor eating habits. Nowadays, because of people's busy lives, the consumption of fast food is increasing; therefore, diagnosis and treatment of this disease are of particular importance. To determine the best treatment approach for each specific colon cancer patient, the oncologist must know the stage of the tumor. The most common way to determine the tumor stage is the TNM staging system. In this system, M indicates the presence of metastasis, N indicates the extent of spread to the lymph nodes, and T indicates the size of the tumor. To determine all three of these parameters, an imaging method must be used; the gold-standard imaging protocols for this purpose are CT and PET/CT. In CT imaging, due to the use of X-rays, the cancer risk and the absorbed dose of the patient are high, while access to PET/CT is limited due to its high cost. Therefore, in this study, we aimed to estimate the tumor size and the extent of its spread to the lymph nodes using MR images. More than 1300 MR images were collected from the TCIA portal, and in the pre-processing step, histogram equalization was applied to improve image quality and resizing was used to obtain a uniform image size. Two expert radiologists, each with more than 21 years of experience with colon cancer cases, segmented the images and extracted the tumor region. The next steps were feature extraction from the segmented images and classification of the data into three classes: T0N0, T3N1, and T3N2. In this article, the VGG-16 convolutional neural network has been used to perform both of these tasks, i.e., feature extraction and classification. This network has 13 convolution layers for feature extraction and three fully connected layers with the softmax activation function for classification.
To validate the proposed method, 10-fold cross-validation was used: the data were randomly divided into three parts, training (70% of the data), validation (10%), and the rest for testing. This was repeated 10 times; each time, the accuracy, sensitivity, and specificity of the model were calculated, and the average over the ten repetitions is reported as the result. The accuracy, specificity, and sensitivity of the proposed method on the testing dataset were 89.09%, 95.8%, and 96.4%, respectively. Compared to previous studies, the use of a safe imaging technique (MRI) and the avoidance of predefined hand-crafted imaging features to determine the stage of colon cancer patients are among this study's advantages.
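The k-fold bookkeeping behind such a validation protocol can be sketched without any ML library. The trivial majority-class "model" below is only a placeholder for the VGG-16 network; the fold construction and accuracy averaging are the point of the sketch.

```python
import random

def kfold_indices(n, k, seed=0):
    """Shuffle indices once, then yield (train, test) index lists for k folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

def cross_validate(fit, predict, X, y, k=10):
    """Mean test accuracy over k folds for user-supplied fit/predict callables."""
    accs = []
    for train, test in kfold_indices(len(X), k):
        model = fit([X[i] for i in train], [y[i] for i in train])
        correct = sum(predict(model, X[i]) == y[i] for i in test)
        accs.append(correct / len(test))
    return sum(accs) / len(accs)
```

Per-fold sensitivity and specificity would be accumulated the same way, just with per-class counts instead of a single accuracy.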

Keywords: colon cancer, VGG-16, magnetic resonance imaging, tumor size, lymph node metastasis

Procedia PDF Downloads 43
31447 Using the ISO 9705 Room Corner Test for Smoke Toxicity Quantification of Polyurethane

Authors: Gabrielle Peck, Ryan Hayes

Abstract:

Polyurethane (PU) foam is typically sold as acoustic foam and is often used as sound insulation in settings such as nightclubs and bars. As a construction product, PU is tested by being glued to the walls and ceiling of the ISO 9705 room corner test room. However, when heat is applied to PU foam, it melts and burns as a pool fire because it is a thermoplastic. The current test layout cannot accurately measure mass loss and does not allow the material to burn as a pool fire without seeping out of the test room floor. Without mass loss measurements, the gas yields needed for smoke toxicity analysis cannot be calculated, which makes comparisons with other materials or test methods difficult. Additionally, the heat release measurements are not representative, as much of the material seeps through the floor (when a tray to catch the melted material is not used). This research aimed to modify the ISO 9705 test to enable mass loss measurement, allowing better calculation of gas yields and understanding of decomposition. It also aimed to accurately measure smoke toxicity in both the doorway and the duct and to enable dilution factors to be calculated. Finally, the study examined whether doubling the fuel loading would force under-ventilated flaming. The test layout was modified to combine the SBI (single burning item) test setup with the ISO 9705 test room. Polyurethane was tested in two configurations with the aim of altering the ventilation condition: test one used one SBI test rig, aiming for well-ventilated flaming; test two used two SBI rigs facing each other inside the test room (doubling the fuel loading), aiming for under-ventilated flaming.
The two configurations successfully achieved both well-ventilated and under-ventilated flaming, as shown by the measured equivalence ratios (obtained with a phi meter designed and built for these experiments). The findings show that doubling the fuel loading successfully forces under-ventilated flaming conditions. This method can therefore be used when trying to replicate post-flashover conditions in future ISO 9705 room corner tests. The radiative heat generated by the two SBI rigs facing each other produced a much higher overall heat release, resulting in a more severe fire. The method allowed accurate measurement of the smoke toxicity of the PU foam in terms of simple gases such as oxygen depletion, CO, and CO2. Overall, the proposed test modifications improve the ability to measure the smoke toxicity of materials under different fire conditions at large scale.
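The equivalence ratio a phi meter reports is the actual fuel-to-air ratio normalised by the stoichiometric one; values above 1 indicate fuel-rich (under-ventilated) burning. The flow rates and stoichiometric ratio in the sketch below are invented for illustration.

```python
def equivalence_ratio(fuel_rate, air_rate, stoich_fuel_air):
    """phi = (fuel/air)_actual / (fuel/air)_stoichiometric.
    Rates in consistent mass-flow units; stoich_fuel_air is the
    stoichiometric fuel-to-air mass ratio for the material."""
    return (fuel_rate / air_rate) / stoich_fuel_air

def ventilation_regime(phi):
    return "under-ventilated" if phi > 1.0 else "well-ventilated"
```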

Keywords: flammability, ISO9705, large-scale testing, polyurethane, smoke toxicity

Procedia PDF Downloads 60
31446 Effectiveness of Selected Anthelmintics on Nematode Parasites of Sheep in KwaZulu-Natal, South Africa

Authors: M. A. Ahmed, N. Basha, I. V. Nsahlai

Abstract:

This study determined the effectiveness of selected anthelmintics (Ivermectin 1% (IVM), Closantel 7.5% (CST), and a combination of Abamectin 0.08% and Praziquantel 1.5% (CAP)) currently used in South Africa. Gender, initial eggs per gram (EPG), and initial live weight aided in blocking animals into groups; each group was randomly assigned one of four treatments: the untreated control (D0), IVM, CST, and CAP. Animals grazed throughout on infested pasture. Rectal faeces were collected on days 0, 7, 14, and 21 for determining EPG. Faeces were mixed per group and incubated to identify and determine the abundance of larval forms of Haemonchus, Trichostrongylus, Strongyloides, Nematodirus, and Cooperia species. Differences between treatments changed over time. On day 7, IVM, CST, and CAP depressed EPG to 0.66, 0.37, and 0.80 of their respective starting values, whilst EPG increased 1.39 times for D0. Thereafter, EPG increased consistently for all drugs; CST recorded the lowest values. Haemonchus, Trichostrongylus, Strongyloides, Nematodirus, and Cooperia species contributed respectively 60%, 30%, 6%, 3%, and 1% of the larval forms on day 0, and 78%, 8%, 11%, 1%, and 2% on day 21. Larval forms increased for Haemonchus species but decreased for Trichostrongylus species over time. Closantel was the most effective dewormer. Haemonchus spp. were least affected, whilst Trichostrongylus spp. were the most affected by all drugs.
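Drug effect on EPG is conventionally summarised as a faecal egg count reduction from mean counts before and after treatment; a minimal sketch (with invented counts) is below. Note that depressing EPG to 0.37 of its starting value, as reported for CST, corresponds to a 63% reduction.

```python
def fecr_percent(pre_epg, post_epg):
    """Faecal egg count reduction (%) for one group, from mean eggs-per-gram
    before and after treatment: 100 * (1 - mean(post) / mean(pre))."""
    mean = lambda v: sum(v) / len(v)
    return 100.0 * (1.0 - mean(post_epg) / mean(pre_epg))
```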

Keywords: anthelmintics, faecal egg count, L3 larvae, sheep

Procedia PDF Downloads 489
31445 Lightweight Hardware Firewall for Embedded System Based on Bus Transactions

Authors: Ziyuan Wu, Yulong Jia, Xiang Zhang, Wanting Zhou, Lei Li

Abstract:

The Internet of Things (IoT) is a rapidly evolving field involving a large number of interconnected embedded devices. In the design of embedded System-on-Chip (SoC), the key issues are power consumption, performance, and security. However, easily modified software and untrustworthy third-party IP cores may threaten the safety of hardware assets. Considering that illegal access and malicious attacks against SoC resources pass through the bus that integrates the IPs, we propose a Lightweight Hardware Firewall (LHF) to protect the SoC, which monitors and blocks offending bus transactions based on physical addresses. Under the LHF architecture, this paper refines two types of firewalls: the Destination Hardware Firewall (DHF) and the Source Hardware Firewall (SHF). The former is oriented to fine-grained detection and configuration, with core technology based on dynamic grading units. The SHF, in contrast, is designed around static entries to remain lightweight. Finally, we evaluate the hardware consumption of the proposed method on both Field-Programmable Gate Array (FPGA) and IC. Compared with existing efforts, the LHF introduces a bus latency of zero clock cycles for every read or write transaction implemented on Xilinx Kintex-7 FPGAs. Meanwhile, DC synthesis results on TSMC 90 nm show that the area is reduced by about 25% compared with the previous method.
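The SHF's static-entry idea, allowing a bus transaction only when its physical address and operation match a configured range, can be sketched in software as follows. The address map and permissions are invented; the real design implements this check in hardware alongside the bus.

```python
class AddressFirewall:
    """Static allow-list of (start, end, perms) physical address ranges,
    in the spirit of the SHF's static entries. perms is a subset of
    {"r", "w"}; anything not matched is denied."""

    def __init__(self, entries):
        self.entries = entries  # list of (start, end, perms)

    def allow(self, addr, op):
        return any(start <= addr <= end and op in perms
                   for start, end, perms in self.entries)
```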

Keywords: IoT, security, SoC, bus architecture, lightweight hardware firewall, FPGA

Procedia PDF Downloads 47
31444 Bibliometric Analysis of the Impact of Funding on Scientific Development of Researchers

Authors: Ashkan Ebadi, Andrea Schiffauerova

Abstract:

Every year, a considerable amount of money is invested in research, mainly in the form of funding allocated to universities and research institutes. To distribute the available funds better and to set the most appropriate R&D investment strategies for the future, evaluating the productivity of funded researchers and the impact of such funding is crucial. In this paper, using data on 15 years of journal publications by NSERC (Natural Sciences and Engineering Research Council of Canada) funded researchers and by means of bibliometric analysis, the scientific development of the funded researchers and their scientific collaboration patterns are investigated for the period 1996-2010. According to the results, there appears to be a positive relationship between the average level of funding and the quantity and quality of the scientific output. In addition, whenever the funding allocated to researchers increased, the number of co-authors per paper also rose. Hence, an increase in the level of funding may enable researchers to get involved in larger projects and/or scientific teams and thereby increase their scientific output.

Keywords: bibliometrics, collaboration, funding, productivity

Procedia PDF Downloads 269
31443 Infrastructure Investment Law Formulation to Ensure Low Transaction Cost at Policy Level: Case Study of Public Private Partnership Project at the Ministry of Public Works and Housing of the Republic of Indonesia

Authors: Yolanda Indah Permatasari, Sudarsono Hardjosoekarto

Abstract:

Public-private partnership (PPP) schemes are considered an alternative source of funding for infrastructure provision. However, the performance of the PPP scheme and the interest of the private sector in participating in infrastructure provision remain low in practice. This phenomenon motivates the research to reconstruct the form of collaborative governance at the policy level from the perspective of the transaction costs of the PPP scheme. Soft systems methodology (SSM)-based action research was used as the research methodology. This study concludes that transaction cost sources emerge at the policy level because of the absence of a law that governs infrastructure investment, especially the implementation of the PPP scheme. This absence causes an imbalance in risk allocation and risk mitigation between the public and private sectors. Thus, this research recommends the formulation of an infrastructure investment law that aims to minimize information asymmetry, anticipate principal-principal problems, and provide a legal basis that ensures risk certainty and fair risk allocation between the public and private sectors.

Keywords: public governance, public private partnership, soft system methodology, transaction cost

Procedia PDF Downloads 124
31442 Optimisation of Stored Alcoholic Beverage Joufinai with Reverse Phase HPLC Method and Its Antioxidant Activities: North- East India

Authors: Dibakar Chandra Deka, Anamika Kalita Deka

Abstract:

Fermented alcoholic beverage production has a long standing among the tribal communities of North-East India. This practice is followed by the Ahom, Dimasa, Nishi, Miri, Bodo, and Rabha tribes of the region. The Bodo tribes among them not only prepare a fermented alcoholic beverage but also store it for various periods, such as 3, 6, 9, 12, and 15 months. They prepare the alcoholic beverage Jou (rice beer) by fermenting Oryza sativa with the traditional yeast culture Amao, in which Saccharomyces cerevisiae is the main strain. Dongphangrakep (Scoparia dulcis), Mwkhna (Clerodendrum viscosum), Thalir (Musa balbisiana), and Khantal Bilai (Ananas comosus) are the main plants used for Amao preparation. The stored Jou is known as Joufinai; to prepare it, the fermented mixture of rice and Amao is stored under anaerobic conditions. Using a simple, reproducible, solution-based colorimetric method, we observed a successive increase in alcohol content from 11.79 ± 0.010 (%, v/v) at 3 months of storage to 15.48 ± 0.070 (%, v/v) at 15 months. A positive linear correlation was also observed between pH and ethanol content with storage, with a correlation coefficient of 0.981. We optimised the detection of changes in the constituents of Joufinai during storage using a reverse-phase HPLC method and found acetone, ethanol, acetic acid, and glycerol as the main constituents. A very good correlation was observed between the storage period (3 to 15 months) and the constituents. An increase in glycerol content with storage was also detected; hence, Joufinai can be used as a precursor of the above compounds. Antioxidant activity, measured by the DPPH radical scavenging method, increased from 0.056 ± 2.80 mg/mL (in ascorbic acid equivalents) for the 3-month-old beverage to 0.078 ± 5.33 mg/mL for the 15-month-old beverage.
We therefore aimed to scientifically validate the storage procedure used by the Bodos in Joufinai production and to turn the Bodos' traditional alcoholic beverage into a commercial commodity through our study.
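The reported linear correlation (coefficient 0.981) is a Pearson correlation; a minimal sketch is below. Only the 3-month (11.79%) and 15-month (15.48%) ethanol values in the usage example come from the abstract; the intermediate values are illustrative.

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5
```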

Keywords: Amao, correlation, beverage, joufinai

Procedia PDF Downloads 301
31441 The System-Dynamic Model of Sustainable Development Based on the Energy Flow Analysis Approach

Authors: Inese Trusina, Elita Jermolajeva, Viktors Gopejenko, Viktor Abramov

Abstract:

Global challenges require a transition from the existing linear economic model to a model that considers nature as a life-support system for development toward social well-being, within the paradigm of ecological economics. The objective of the article is to present the results of an analysis of socio-economic systems in the context of sustainable development, using a method that analyzes changes in system power (energy flows) together with Kaldor's structural model of GDP. In accordance with the principles of the development of life and the ecological concept, the tasks of sustainable development of open, non-equilibrium, stable socio-economic systems were formalized using the energy flow analysis method. The methodology for monitoring sustainable development and the standard of living was considered through the study of interactions in the system 'human - society - nature' and the theory of a unified system of space-time measurements. Based on the results of the analysis, a time series of energy consumption and an economic structural model were formulated for the level, degree, and tendencies of sustainable development of the system, and the conditions of growth, degrowth, and stationarity were formalized. During the research, the authors calculated and used a system of universal indicators of sustainable development in an invariant coordinate system expressed in energy units. In order to design the future state of socio-economic systems, a concept was formulated, and the first models of energy flows in systems were created using the tools of system dynamics. In the context of the proposed approach and methods, universal sustainable development indicators were calculated as development models for the USA and China.
The calculations used data from the World Bank database for the period from 1960 to 2019. Main results: 1) In accordance with the proposed approach, the heterogeneous energy resources of countries were reduced to universal power units, summarized, and expressed as a single number. 2) The values of universal indicators of the standard of living were obtained and compared with generally accepted similar indicators. 3) The system of indicators, in accordance with the requirements of sustainable development, can be considered a basis for monitoring development trends. This work can make a significant contribution to overcoming the difficulties of forming socio-economic policy, which are largely due to a lack of information that would give an idea of the course and trends of socio-economic processes. Existing monitoring methods do not fully meet this requirement, since their indicators have different units of measurement from different areas and are, as a rule, reactions of socio-economic systems to actions already taken, moreover with a time shift. Currently, the incommensurability of measures across heterogeneous social, economic, environmental, and other systems means that social systems are managed in isolation from the general laws of living systems, which can ultimately lead to a systemic crisis.
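Reducing heterogeneous energy resources to universal power units amounts to converting annual energy totals into average watts. The sketch below uses the IEA convention of 41.868 GJ per tonne of oil equivalent (toe); the paper's own unit system may differ.

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600
J_PER_TOE = 41.868e9  # IEA convention: 1 tonne of oil equivalent in joules

def annual_toe_to_watts(toe_per_year):
    """Average power (W) corresponding to an annual energy total given in toe."""
    return toe_per_year * J_PER_TOE / SECONDS_PER_YEAR

def per_capita_power(total_toe_per_year, population):
    """Per-capita average power (W/person), a unified standard-of-living proxy."""
    return annual_toe_to_watts(total_toe_per_year) / population
```

One toe per year works out to roughly 1.3 kW of continuous power, which is why per-capita power makes national consumption levels directly comparable.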

Keywords: sustainability, system dynamic, power, energy flows, development

Procedia PDF Downloads 42
31440 Design of the Ubiquitous Cloud Learning Management System

Authors: Panita Wannapiroon, Noppadon Phumeechanya, Sitthichai Laisema

Abstract:

This study is research and development intended to: 1) design a ubiquitous cloud learning management system, and 2) assess the suitability of the design. Its methods are divided into two phases: phase 1 is the design of the ubiquitous cloud learning management system; phase 2 is the assessment of the suitability of the design. The sample used in this study consists of 25 professionals in the fields of ubiquitous cloud learning management systems and information and communication technology in education, selected using purposive sampling. Data were analyzed by arithmetic mean and standard deviation. The results showed that the ubiquitous cloud learning management system consists of two main components: 1) the ubiquitous cloud learning management system server (u-Cloud LMS Server), including a cloud repository, cloud information resources, a social cloud network, cloud context awareness, cloud communication, and cloud collaborative tools; and 2) the mobile client. The professionals' assessment of the system's suitability is in the highest range.

Keywords: learning management system, cloud computing, ubiquitous learning, ubiquitous learning management system

Procedia PDF Downloads 504
31439 Detecting Cyberbullying, Spam and Bot Behavior and Fake News in Social Media Accounts Using Machine Learning

Authors: M. D. D. Chathurangi, M. G. K. Nayanathara, K. M. H. M. M. Gunapala, G. M. R. G. Dayananda, Kavinga Yapa Abeywardena, Deemantha Siriwardana

Abstract:

Due to the growing popularity of social media platforms, there are various concerns, most notably cyberbullying, spam, bot accounts, and the spread of false information. To develop a risk score calculation system as a thorough method for deciphering and exposing unethical social media profiles, this research explores the most suitable algorithms, to the best of our knowledge, for detecting these concerns. Multiple models, such as Naïve Bayes, CNN, KNN, Stochastic Gradient Descent, and Gradient Boosting Classifier, were examined, and the best results were used in the development of the risk score system. For cyberbullying, the Logistic Regression algorithm achieved an accuracy of 84.9%, while the spam-detecting MLP model reached 98.02% accuracy. The Random Forest algorithm identified bot accounts with 91.06% accuracy, and 84% accuracy was achieved for fake news detection using SVM.
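How the per-detector outputs are combined into a single risk score is not detailed in the abstract; one plausible sketch is a weighted average of the four detectors' probabilities, with the weights and risk bands below being invented for illustration.

```python
def risk_score(probs, weights=None):
    """Combine per-detector probabilities (cyberbullying, spam, bot,
    fake news) into a single 0-100 risk score via a weighted average."""
    if weights is None:
        weights = {k: 1.0 for k in probs}  # equal weighting by default
    total = sum(weights[k] for k in probs)
    return 100.0 * sum(probs[k] * weights[k] for k in probs) / total

def risk_band(score):
    """Map the numeric score onto coarse bands for reporting."""
    return "high" if score >= 66 else "medium" if score >= 33 else "low"
```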

Keywords: cyberbullying, spam behavior, bot accounts, fake news, machine learning

Procedia PDF Downloads 18
31438 Possible Reasons for and Consequences of Generalizing Subgroup-Based Measurement Results to Populations: Based on Research Studies Conducted by Elementary Teachers in South Korea

Authors: Jaejun Jong

Abstract:

Many teachers in South Korea conduct research to improve the quality of their instruction. Unfortunately, many researchers generalize the results of measurements based on one subgroup to other students or to the entire population, which can cause problems. This study aims to determine examples of possible problems resulting from generalizing measurements based on one subgroup to an entire population or another group. This study is needed, as teachers’ instruction and class quality significantly affect the overall quality of education, but the quality of research conducted by teachers can become questionable due to overgeneralization. Thus, finding potential problems of overgeneralization can improve the overall quality of education. The data in this study were gathered from 145 sixth-grade elementary school students in South Korea. The result showed that students in different classes could differ significantly in various ways; thus, generalizing the results of subgroups to an entire population can engender erroneous student predictions and evaluations, which can lead to inappropriate instruction plans. This result shows that finding the reasons for such overgeneralization can significantly improve the quality of education.

Keywords: generalization, measurement, research methodology, teacher education

Procedia PDF Downloads 82
31437 Analysis of the Relationship between Length of Hospital Stay and Economic Loss for the Ten Highest Productive-Age Diseases among Inpatients of Inche Abdul Moeis Hospital, Samarinda, Indonesia

Authors: Tri Murti Tugiman, Awalyya Fasha

Abstract:

This research aims to analyze the magnitude of the economic losses incurred when a person suffers from one of the ten most common diseases among patients of productive age at Inche Abdul Moeis Hospital, Samarinda. The research was a descriptive survey based on secondary data analysis. The population for the economic-loss analysis comprised all inpatients of productive age suffering from the ten most common diseases at IA Moeis Hospital in 2011. Sampling was performed using stratified random sampling, yielding a sample of 77 people. The results indicate that the direct costs incurred by the community to obtain medical services at IA Moeis Hospital amounted to IDR 74,437,520; the indirect costs incurred during treatment amounted to IDR 10,562,000; and the fees lost due to sickness amounted to IDR 5,377,800, giving a total economic loss of IDR 90,377,320. The total number of hospitalization days across all respondents was 171. This study suggests that these economic losses could be prevented through a cleaner and healthier community lifestyle together with insurance coverage.
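The three cost components reported in the abstract can be checked against the stated total with simple arithmetic:

```python
# Sanity check of the cost figures in the abstract: total economic loss is
# the sum of direct costs, indirect costs, and fees lost to sick days (IDR).
direct_cost   = 74_437_520
indirect_cost = 10_562_000
lost_fees     =  5_377_800

total_loss = direct_cost + indirect_cost + lost_fees
print(f"IDR {total_loss:,}")  # → IDR 90,377,320 — matches the reported total
```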

Keywords: hospitalization, economic loss, productive age diseases, secondary data analysis

Procedia PDF Downloads 462
31436 1D Velocity Model for the Gobi-Altai Region from Local Earthquakes

Authors: Dolgormaa Munkhbaatar, Munkhsaikhan Adiya, Tseedulam Khuut

Abstract:

We performed an inversion to determine a 1D velocity model with station corrections for the Gobi-Altai area in southern Mongolia, using earthquake data collected at the National Data Center over the last 10 years. In this study, a new 1D model was derived by minimizing the average RMS of a set of well-located earthquakes recorded at permanent (2006-2016) and temporary (2014-2016) seismic stations, jointly solving for the hypocenters and the 1D velocity model. We selected 4800 events with RMS less than 0.5 seconds and a maximum azimuthal GAP of 170 degrees and determined the velocity structure. We then relocated all available events in the Gobi-Altai area using the new 1D velocity model and obtained well-constrained hypocentral determinations for events within this area. We conclude that the new 1D velocity model has somewhat lower velocities than the previous model, represents a significant improvement, and provides a better basis for future epicenter determinations in the region.
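The event-selection step described above (RMS below 0.5 s, azimuthal GAP of at most 170 degrees) can be sketched as a simple filter; the field names and sample values below are assumptions for illustration:

```python
# Minimal sketch of the quality filter applied before the 1D inversion:
# keep only well-located events with low travel-time RMS and small GAP.
from math import sqrt

def rms(residuals):
    """Root-mean-square of travel-time residuals (seconds)."""
    return sqrt(sum(r * r for r in residuals) / len(residuals))

def select_events(events, max_rms=0.5, max_gap=170):
    return [e for e in events
            if rms(e["residuals"]) < max_rms and e["gap"] <= max_gap]

events = [
    {"id": 1, "residuals": [0.1, -0.2, 0.3],  "gap": 120},  # kept
    {"id": 2, "residuals": [0.9, -0.8, 0.7],  "gap": 100},  # RMS too high
    {"id": 3, "residuals": [0.1,  0.1, -0.1], "gap": 200},  # GAP too wide
]
print([e["id"] for e in select_events(events)])  # → [1]
```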

Keywords: 1D velocity model, earthquake, relocation, Velest

Procedia PDF Downloads 146
31435 Sentence vs. Keyword Content Analysis in Intellectual Capital Disclosures Study

Authors: Martin Surya Mulyadi, Yunita Anwar, Rosinta Ria Panggabean

Abstract:

Major transformations in economic activity from an agricultural economy to a knowledge economy have led to an increasing focus on intellectual capital (IC), characterized by continuous innovation, the spread of digital and communication technologies, and intangible and human factors. IC is defined as the possession of knowledge and experience, professional knowledge and skill, proper relationships, and technological capacities, which when applied give organizations a competitive advantage. IC reports/disclosures can be captured from the corporate annual report, as it is a communication device that allows a corporation to connect with various external and internal stakeholders. This study was conducted using sentence-based content analysis of IC disclosure in annual reports. The research aims to analyze whether keyword-based content analysis is a reliable research methodology for IC disclosure research.
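The two coding units being compared, keyword counting versus sentence counting, can be contrasted in a few lines. The keyword list below is a small illustrative subset, not the authors' codebook:

```python
# Keyword-based vs sentence-based content analysis of an IC disclosure.
# Keyword counting tallies term occurrences; sentence counting tallies
# sentences that contain at least one IC term.
import re

IC_KEYWORDS = {"intellectual capital", "human capital", "patent", "know-how"}

def keyword_count(text):
    text = text.lower()
    return sum(text.count(k) for k in IC_KEYWORDS)

def disclosing_sentences(text):
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences
            if any(k in s.lower() for k in IC_KEYWORDS)]

report = ("Our human capital grew this year. We filed one patent. "
          "Revenue rose by 8%.")
print(keyword_count(report))              # → 2
print(len(disclosing_sentences(report)))  # → 2
```

The two counts agree here, but they diverge whenever one sentence contains several keywords, which is exactly the methodological question the study raises.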

Keywords: intellectual capital, intellectual capital disclosure, content analysis, annual report, sentence analysis, keyword analysis

Procedia PDF Downloads 347
31434 An Explanatory Conceptual Model of the Effect of Architectural Form on Structure in Building Aesthetics

Authors: Fatemeh Nejati, Farah Habib, Sayeh Goudarzi

Abstract:

Architecture and structure have always been closely interrelated and should be integrated into a unified, coherent, and beautiful whole, yet in the contemporary era the two disciplines often proceed separately. The purpose of architecture is the art of creating form, space, and order in the service of people, while the goal of the structural engineer is the transfer of loads through the structure. This research examines the relationship between architectural form and structure from its inception to the present day. By identifying the main components of structural design in interaction with architectural form, it takes an effective step toward professional training and practical solutions for practitioners. After reviewing the evolution of structural and architectural coordination across historical periods, as well as how structural form was arrived at in different times and places, the required components are identified and tested in order to present a final theory. The research indicates that architectural form and structure share an aesthetic link, influenced by a number of identifiable components that follow a consistent order throughout history. The methodology is analytical and comparative, using analytical and matrix diagrams together with library research and interviews.

Keywords: architecture, structural form, structural and architectural coordination, effective components, aesthetics

Procedia PDF Downloads 201
31433 Unveiling the Indonesian Identity through Proverbial Expressions: The Relation of Meaning between Authority and Globalization

Authors: Prima Gusti Yanti, Fairul Zabadi

Abstract:

The purpose of this study is to examine the relationship between the moral messages of proverbs and authority and globalization. The proverb is one of the many forms of cultural identity of the Indonesian/Malay people and is filled with moral values. The values contained in these proverbs are beneficial not only to society but also to those who hold power in this era of globalization. The method used is qualitative research with content analysis, carried out by describing and uncovering the forms and meanings of proverbs used within Indonesia's Minangkabau society. The data were extracted from a Minangkabau native speaker in the subdistrict of Tanah Abang, Jakarta, through a series of interviews with a speaker whose speech is still adorned with idiomatic expressions. The findings show that there are 30 proverbs or idiomatic expressions in the Minangkabau language that are often used by its indigenous people. These thirty data items contain moral values that are closely interwoven with matters of power and globalization. The analysis shows that fourteen moral values contained within the proverbs reflect a firm connection between rule and power in globalization, such as: responsibility, bravery, togetherness and consensus, tolerance, politeness, thoroughness and meticulousness, honesty and keeping promises, ingenuity and learning, care, self-correction, fairness, alertness, arbitrariness, and self-awareness. Structurally, proverbs possess an unchangeably formal construction; symbolically, their meanings are determined by ethnographic communicative factors along with situational and cultural contexts. The values contained in proverbs may be used as a guide in social interaction, whether between fellow human beings, between humans and nature, or between humans and their Creator. Therefore, the meanings and moral values of proverbs can also serve as counsel for those who rule and hold power, in order to stem the tides of globalization that have already spread into sectoral, territorial, and educational continuums.

Keywords: continuum, globalization, identity, proverb, rule-power

Procedia PDF Downloads 379
31432 A Method for Saturation Modeling of Synchronous Machines in d-q Axes

Authors: Mohamed Arbi Khlifi, Badr M. Alshammari

Abstract:

This paper discusses general methods for modeling saturation in steady-state, two-axis (d & q) frame models of synchronous machines. In particular, the important role of the magnetic coupling between the d and q axes (the cross-magnetizing phenomenon) is demonstrated. For that purpose, distinct methods of saturation modeling for the damper synchronous machine with cross-saturation are identified, and the models are synthesized in detail in the d-q axes. A number of models are given in their final developed form. The procedure and the novel models are verified by a critical application that proves the validity of the method, and the equivalence between all developed models is reported. Advantages of some of the models over existing ones, and their applicability, are discussed.
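The cross-magnetizing coupling mentioned above is commonly written in an incremental-inductance form, with an off-diagonal term linking the two axes under saturation. The following is a standard textbook formulation, not necessarily the authors' exact model:

```latex
% Incremental flux-current relation in the d-q frame with cross-saturation:
% the off-diagonal inductance L_{dq} couples the axes once the machine saturates.
\begin{bmatrix} \mathrm{d}\psi_d \\ \mathrm{d}\psi_q \end{bmatrix}
=
\begin{bmatrix}
L_{dd} & L_{dq} \\
L_{qd} & L_{qq}
\end{bmatrix}
\begin{bmatrix} \mathrm{d}i_d \\ \mathrm{d}i_q \end{bmatrix},
\qquad
L_{dq} = \frac{\partial \psi_d}{\partial i_q}
       = \frac{\partial \psi_q}{\partial i_d} = L_{qd}.
```

In the unsaturated case the off-diagonal terms vanish and the two axes decouple, which is why cross-saturation only matters in saturated operation.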

Keywords: cross-magnetizing, models synthesis, synchronous machine, saturated modeling, state-space vectors

Procedia PDF Downloads 437
31431 Conceptualizing the Knowledge to Manage and Utilize Data Assets in the Context of Digitization: Case Studies of Multinational Industrial Enterprises

Authors: Martin Böhmer, Agatha Dabrowski, Boris Otto

Abstract:

The trend of digitization significantly changes the role of data for enterprises. Data turn from an enabler to an intangible organizational asset that requires management and qualifies as a tradeable good. The idea of a networked economy has gained momentum in the data domain as collaborative approaches for data management emerge. Traditional organizational knowledge consequently needs to be extended by comprehensive knowledge about data. The knowledge about data is vital for organizations to ensure that data quality requirements are met and data can be effectively utilized and sovereignly governed. As this specific knowledge has been paid little attention to so far by academics, the aim of the research presented in this paper is to conceptualize it by proposing a “data knowledge model”. Relevant model entities have been identified based on a design science research (DSR) approach that iteratively integrates insights of various industry case studies and literature research.

Keywords: data management, digitization, industry 4.0, knowledge engineering, metamodel

Procedia PDF Downloads 341
31430 Investment Adjustments to Exchange Rate Fluctuations: Evidence from Manufacturing Firms in Tunisia

Authors: Mourad Zmami, Oussema BenSalha

Abstract:

The current research aims to assess empirically the reaction of private investment to exchange rate fluctuations in Tunisia, using a sample of 548 firms operating in manufacturing industries between 1997 and 2002. The micro-econometric model we estimate is based on an accelerator-profit investment specification augmented by two variables that measure the variation and the volatility of exchange rates. Estimates using the system GMM method reveal that exchange rate depreciation has a negative effect on investment, since it increases the cost of imported capital goods. Turning to exchange rate volatility, as measured by a GARCH(1,1) model, our findings assign a significant role to exchange rate uncertainty in explaining the sluggishness of private investment in Tunisia in the full sample of firms. Further estimations based on various subsamples indicate that the elasticity of investment with respect to exchange rate volatility depends on firm-specific characteristics such as size and ownership structure.
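The GARCH(1,1) volatility measure used above follows the standard recursion sigma²_t = omega + alpha·eps²_{t-1} + beta·sigma²_{t-1}. A minimal sketch, with illustrative parameter values rather than estimates from the Tunisian data:

```python
# GARCH(1,1) conditional-variance recursion for a series of return shocks.
# Parameters omega, alpha, beta are assumed values for illustration only.

def garch11_variance(eps, omega=0.1, alpha=0.2, beta=0.7):
    """Conditional variance series for a sequence of shocks eps."""
    # Start the recursion at the unconditional variance omega/(1-alpha-beta).
    sigma2 = [omega / (1.0 - alpha - beta)]
    for e in eps[:-1]:
        sigma2.append(omega + alpha * e * e + beta * sigma2[-1])
    return sigma2

shocks = [0.5, -1.2, 0.3, 0.8]
var_series = garch11_variance(shocks)
print([round(v, 3) for v in var_series])  # → [1.0, 0.85, 0.983, 0.806]
```

The fitted variance series (or its average) is what then enters the investment equation as the uncertainty regressor.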

Keywords: investment, exchange rate volatility, manufacturing firms, system GMM, Tunisia

Procedia PDF Downloads 394
31429 A Fully Coupled Thermo-Hydraulic Mechanical Elastoplastic Damage Constitutive Model for Porous Fractured Medium during CO₂ Injection

Authors: Nikolaos Reppas, Yilin Gui

Abstract:

A dual-porosity finite element code will be presented for the stability analysis of the wellbore during CO₂ injection. An elastoplastic damage response will be incorporated into the model. The finite element model will be validated using experimental results from the literature or from experiments planned to be undertaken at Newcastle University. The main goal of the paper is to present a constitutive model that can help industry safely store CO₂ in geological rock formations and forecast changes in the rock surrounding the wellbore. The fully coupled elastoplastic damage Thermo-Hydraulic-Mechanical (THM) model will determine the pressure and temperature of the injected CO₂, as well as the wellbore radius, that make the Carbon Capture and Storage (CCS) procedure more efficient.

Keywords: carbon capture and storage, wellbore stability, elastoplastic damage response for rock, constitutive THM model, fully coupled thermo-hydraulic-mechanical model

Procedia PDF Downloads 160
31428 Enhancing Biogas Production from Slaughterhouse and Dairy Farm Waste with Pasteurization

Authors: Mahmoud Hassan Onsa, Saadelnour Abdueljabbar Adam

Abstract:

Wastes from slaughterhouses in most towns in Sudan are often poorly managed and sometimes discharged into adjoining streams due to poor enforcement of standards, causing environmental and public health hazards; in addition, dairy farms produce large amounts of manure. This paper presents anaerobic digestion and biogas production as a solution for organic waste from dairy farms and slaughterhouses. It reports the findings of an experimental investigation of biogas production with and without pasteurization. Cow manure, blood, and rumen content were mixed in the same proportions (72.3% manure, 21.7% rumen content, and 6% blood) for both bio-digesters: bio-digester 1 started at 62% dry matter without pasteurization, while bio-digester 2 ran at 10% dry matter with pasteurization. The paper analyzes the quantitative and qualitative composition of the biogas: gas yield and methane concentration. Bio-digester 2 gave the highest biogas output, 2.9 mL/g dry matter/day, together with a high-quality biogas of 87.4% methane content, which is useful for combustion and energy production, as well as a healthy bio-fertilizer; bio-digester 1 gave 1.68 mL/g dry matter/day with 85% methane content, which is useful for combustion and energy production and can be considered a new technology of drier bio-digesters.
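The two digesters can be compared on methane yield per gram of dry matter using only the figures reported in the abstract:

```python
# Back-of-the-envelope comparison of the two digesters: methane yield is
# daily biogas output per gram of dry matter times the methane fraction.
digesters = {
    "digester 1 (62% DM, unpasteurized)": (1.68, 0.85),
    "digester 2 (10% DM, pasteurized)":   (2.9, 0.874),
}

methane_yield = {}  # mL CH4 per g dry matter per day
for name, (biogas_ml_per_g_day, ch4_fraction) in digesters.items():
    methane_yield[name] = biogas_ml_per_g_day * ch4_fraction
    print(f"{name}: {methane_yield[name]:.2f} mL CH4/g DM/day")
# digester 2 delivers roughly 78% more methane per gram of dry matter
```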

Keywords: anaerobic digestion, bio-digester, blood, cow manure, rumen content

Procedia PDF Downloads 708
31427 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — in the Case of Critical Dataset Size —

Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno

Abstract:

STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induct if-then rules from the decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for further development remains before STRIM can be applied to the analysis of real-world datasets. The first requirement is to determine the size of the dataset needed for inducting true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity of rule induction from datasets with contaminated attribute values created by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived from the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.
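The statistical idea at the core of STRIM, testing whether a candidate rule fires on its decision class significantly more often than chance, can be sketched with a one-sided proportion z-test. This is a stand-in for the paper's exact test, not a reproduction of it:

```python
# Significance test for a candidate if-then rule: does the rule's accuracy
# on matched rows exceed the chance rate p0 of the predicted decision class?
from math import sqrt, erf

def rule_z_test(n_match, n_correct, p0):
    """z-statistic and one-sided p-value for rule accuracy vs chance rate p0."""
    p_hat = n_correct / n_match
    z = (p_hat - p0) / sqrt(p0 * (1 - p0) / n_match)
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))  # 1 - standard normal CDF
    return z, p_value

# The rule matches 100 rows; 40 carry the predicted decision class, against
# a chance rate of 25% (four equally likely decision classes).
z, p = rule_z_test(100, 40, 0.25)
print(round(z, 2), p < 0.05)  # → 3.46 True
```

The normal approximation behind this test is also why a critical dataset size exists: with too few matched rows, even a true rule cannot reach significance.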

Keywords: rule induction, decision table, missing data, noise

Procedia PDF Downloads 382
31426 The Predictive Utility of Subjective Cognitive Decline Using Item Level Data from the Everyday Cognition (ECog) Scales

Authors: J. Fox, J. Randhawa, M. Chan, L. Campbell, A. Weakely, D. J. Harvey, S. Tomaszewski Farias

Abstract:

Early identification of individuals at risk for conversion to dementia provides an opportunity for preventative treatment. Many older adults (30-60%) report specific subjective cognitive decline (SCD); however, previous research is inconsistent in terms of what types of complaints predict future cognitive decline. The purpose of this study is to identify which specific complaints from the Everyday Cognition (ECog) scales, a measure of self-reported concerns about everyday abilities across six cognitive domains, are associated with: 1) conversion from a clinical diagnosis of normal to either MCI or dementia (categorical variable) and 2) progressive cognitive decline in memory and executive function (continuous variables). 415 cognitively normal older adults were monitored annually for an average of 5 years. Cox proportional hazards models were used to assess associations between self-reported ECog items and progression to impairment (MCI or dementia). A total of 114 individuals progressed to impairment; the mean time to progression was 4.9 years (SD=3.4 years, range=0.8-13.8). Follow-up models were run controlling for depression. A subset of individuals (n=352) underwent repeat cognitive assessments for an average of 5.3 years. For those individuals, mixed effects models with random intercepts and slopes were used to assess associations between ECog items and change in neuropsychological measures of episodic memory or executive function. Prior to controlling for depression, subjective concerns on five of the eight Everyday Memory items, three of the nine Everyday Language items, one of the seven Everyday Visuospatial items, two of the five Everyday Planning items, and one of the six Everyday Organization items were associated with subsequent diagnostic conversion (HR=1.25 to 1.59, p=0.003 to 0.03).
However, after controlling for depression, only two specific complaints of remembering appointments, meetings, and engagements and understanding spoken directions and instructions were associated with subsequent diagnostic conversion. Episodic memory in individuals reporting no concern on ECog items did not significantly change over time (p>0.4). More complaints on seven of the eight Everyday Memory items, three of the nine Everyday Language items, and three of the seven Everyday Visuospatial items were associated with a decline in episodic memory (Interaction estimate=-0.055 to 0.001, p=0.003 to 0.04). Executive function in those reporting no concern on ECog items declined slightly (p <0.001 to 0.06). More complaints on three of the eight Everyday Memory items and three of the nine Everyday Language items were associated with a decline in executive function (Interaction estimate=-0.021 to -0.012, p=0.002 to 0.04). These findings suggest that specific complaints across several cognitive domains are associated with diagnostic conversion. Specific complaints in the domains of Everyday Memory and Language are associated with a decline in both episodic memory and executive function. Increased monitoring and treatment of individuals with these specific SCD may be warranted.
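The longitudinal contrast in the abstract, flat trajectories for participants without a given complaint versus declining trajectories for those with it, can be illustrated with a per-group slope estimate. The data below are fabricated, and the paper itself fits mixed-effects models rather than per-group OLS:

```python
# Illustrative trajectory contrast: slope of cognitive scores over time for
# participants who endorse an ECog item vs those who do not.

def ols_slope(xs, ys):
    """Least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = [0, 1, 2, 3, 4]
no_concern = [50.0, 50.1, 49.9, 50.0, 50.1]   # essentially flat over follow-up
concern    = [50.0, 48.9, 48.1, 46.8, 46.2]   # declining memory scores

print(round(ols_slope(years, no_concern), 2))  # → 0.01 (no decline)
print(round(ols_slope(years, concern), 2))     # → -0.97 (clear decline)
```

A mixed-effects model generalizes this by estimating such slopes jointly across all participants, with random intercepts and slopes per person.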

Keywords: Alzheimer’s disease, dementia, memory complaints, mild cognitive impairment, risk factors, subjective cognitive decline

Procedia PDF Downloads 67
31425 Fluorescence-Based Biosensor for Dopamine Detection Using Quantum Dots

Authors: Sylwia Krawiec, Joanna Cabaj, Karol Malecha

Abstract:

Nowadays, progress in the field of analytical methods is of great interest for reliable biological research and medical diagnostics. Classical techniques of chemical analysis, despite many advantages, do not permit immediate results or automation of measurements. Chemical sensors have displaced conventional analytical methods: sensors combine precision, sensitivity, fast response, and the possibility of continuous monitoring. A biosensor is a chemical sensor that, in addition to a converter, also possesses a biologically active material, which is the basis for the detection of specific chemicals in the sample. Each biosensor device mainly consists of two elements: a sensitive element, where receptor-analyte recognition takes place, and a transducer element, which receives the signal and converts it into a measurable one. Through these two elements, biosensors can be divided into two categories: by recognition element (e.g., immunosensor) and by transducer (e.g., optical sensor). The operation of an optical sensor is based on measuring quantitative changes in parameters characterizing light radiation; the most often analyzed parameters include amplitude (intensity), frequency, and polarization. In a direct method, changes in the optical properties of a compound that reacts with the biological material coated on the sensor are analyzed; in an indirect method, indicators are used whose optical properties change due to the transformation of the tested species. The most commonly used labels in this method are small molecules with an aromatic ring, such as rhodamine; fluorescent proteins, for example green fluorescent protein (GFP); or nanoparticles such as quantum dots (QDs). Quantum dots have, in comparison with organic dyes, much better photoluminescent properties, better bioavailability, and chemical inertness. They are semiconductor nanocrystals 2-10 nm in size; this very limited number of atoms and the 'nano' size gives QDs their highly fluorescent properties.
Rapid and sensitive detection of dopamine is extremely important in modern medicine. Dopamine is a very important neurotransmitter that occurs mainly in the brain and central nervous system of mammals. It is responsible for transmitting information about movement through the nervous system and plays an important role in learning and memory. Detection of dopamine is significant for diseases associated with the central nervous system, such as Parkinson's disease or schizophrenia. The developed optical biosensor for dopamine detection uses graphene quantum dots (GQDs). In such a sensor, dopamine molecules coat the GQD surface; as a result, fluorescence quenching occurs due to Förster resonance energy transfer (FRET). Changes in fluorescence correspond to specific concentrations of the neurotransmitter in the tested sample, so it is possible to accurately determine the concentration of dopamine in the sample.
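Reading concentration from fluorescence quenching is commonly done via the Stern-Volmer relation F0/F = 1 + Ksv·[Q]. The sketch below assumes that relation holds for the GQD-dopamine system, and the quenching constant is a hypothetical value, not a measurement from the paper:

```python
# Hedged sketch: dopamine concentration from FRET quenching of GQD
# fluorescence via the Stern-Volmer relation F0/F = 1 + Ksv*[Q].

KSV = 0.05  # L/umol, assumed Stern-Volmer constant from a calibration curve

def dopamine_concentration(f0, f, ksv=KSV):
    """Concentration (umol/L) from unquenched (f0) and quenched (f) intensity."""
    return (f0 / f - 1.0) / ksv

# A 20% drop in fluorescence intensity maps to a concentration readout:
print(round(dopamine_concentration(1000.0, 800.0), 2))  # → 5.0
```

In practice Ksv would be fitted from a calibration series of known dopamine concentrations before any unknown sample is read out.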

Keywords: biosensor, dopamine, fluorescence, quantum dots

Procedia PDF Downloads 354