Search results for: type checking
6934 STML: Service Type-Checking Markup Language for Services of Web Components
Authors: Saqib Rasool, Adnan N. Mian
Abstract:
Web components are introduced in the latest HTML5 standards for writing modular web interfaces, ensuring maintainability through the isolated scope of each component. Reusability is also achieved by sharing plug-and-play web components that other developers can use off the shelf. A web component encapsulates all the required HTML, CSS and JavaScript code as a standalone package, which must be imported to integrate the component into an existing web interface. The component is then integrated with web services to populate its content dynamically. Since web components are reusable off-the-shelf components, they must be equipped with some mechanism for ensuring their proper integration with web services. The consistency of a service's behavior can be verified through type checking, one of the popular techniques for improving code quality in many programming languages. However, HTML does not provide type checking, as it is a markup language and not a programming language. The contribution of this work is a new extension of HTML, called Service Type-checking Markup Language (STML), that adds type-checking support to HTML for JSON-based REST services. STML can be used to define the expected data types of responses from JSON-based REST services that populate the content of a web component's HTML elements. Although JSON has the data types string, number, boolean, object and array (plus null), STML supports only string, number and boolean, because both objects and arrays are rendered as strings when populated into HTML elements. To define the data type of any HTML element, the developer just adds the custom STML attributes st-string, st-number or st-boolean for string, number or boolean, respectively.
These STML annotations are added by the developer who writes a web component, and they enable other developers to use automated type checking to ensure the proper integration of their REST services with that component. Two utilities have been written for developers who use STML-based web components. The first performs automated type checking during the development phase, using the browser console to show an error description when an integrated web service does not return a response of the expected data type. The second is a Gulp-based command-line utility that removes the STML attributes before deployment, ensuring the delivery of STML-free web pages in the production environment. Both utilities have been tested on type checking of REST services through STML-based web components, and the results have confirmed the feasibility of evaluating service behavior through HTML alone. Currently, STML is designed for automated type checking of integrated REST services, but it could be extended into a complete HTML-only service testing suite, which would transform STML from a Service Type-checking Markup Language into a Service Testing Markup Language.
Keywords: REST, STML, type checking, web component
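The response-validation step can be sketched outside the browser as well. The snippet below is a minimal illustration only: the st-* attribute names come from the abstract, while the binding format and the checking logic are assumptions, not the authors' utility.

```python
# Minimal sketch of STML-style type checking for a JSON REST response.
# The st-* attribute names come from the abstract; everything else is assumed.

STML_TYPES = {
    "st-string": str,
    "st-number": (int, float),
    "st-boolean": bool,
}

def check_response(bindings, response):
    """bindings maps a response field to the STML attribute declared on the
    HTML element that will display it; returns a list of error strings."""
    errors = []
    for field, attr in bindings.items():
        expected = STML_TYPES[attr]
        value = response.get(field)
        # bool is a subclass of int in Python, so exclude it for st-number
        if attr == "st-number" and isinstance(value, bool):
            errors.append(f"{field}: expected number, got boolean")
        elif not isinstance(value, expected):
            errors.append(f"{field}: expected {attr}, got {type(value).__name__}")
    return errors

bindings = {"name": "st-string", "age": "st-number", "active": "st-boolean"}
ok = check_response(bindings, {"name": "Ada", "age": 36, "active": True})
bad = check_response(bindings, {"name": "Ada", "age": "36", "active": True})
```

An empty error list means the service response matches the declared STML types; the second call reports the mistyped "age" field, mirroring the console errors the development-phase utility is described as producing.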
Procedia PDF Downloads 251
6933 The Development of Encrypted Near Field Communication Data Exchange Format Transmission in an NFC Passive Tag for Checking the Genuine Product
Authors: Tanawat Hongthai, Dusit Thanapatay
Abstract:
This paper presents the development of encrypted near field communication (NFC) data exchange format transmission in an NFC passive tag to assess the feasibility of implementing genuine-product authentication. We organize the proposed encryption and genuine-product checking into four major categories: concept, infrastructure, development and applications. The results show that a passive NFC Forum Type 2 tag can be configured to be compatible with the NFC Data Exchange Format (NDEF) and can be partially updated automatically whenever an NFC field is present.
Keywords: near field communication, NFC data exchange format, checking the genuine product, encrypted NFC
Procedia PDF Downloads 279
6932 Automated Fact-Checking by Incorporating Contextual Knowledge and Multi-Faceted Search
Authors: Wenbo Wang, Yi-Fang Brook Wu
Abstract:
The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means to address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, the state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study introduces a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context using general-purpose, comprehensive, and authoritative data; 2) developing a search function to automatically select relevant, new, and credible references; and 3) attending to the parts of the claim and reference representations that are most relevant to the fact-checking task. The experimental results demonstrate that 1) augmenting the representations of claims and references with a knowledge base, combined with the multi-head attention technique, improves fact-checking performance, and 2) SAC with auto-selected references outperforms existing fact-checking approaches that use manually selected references. Future directions of this study include I) exploring knowledge graphs in Wikidata to dynamically augment the representations of claims and references without introducing too much noise, and II) exploring semantic relations in claims and references to further enhance fact-checking.
Keywords: fact checking, claim verification, deep learning, natural language processing
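The attention idea in point 3 can be illustrated with a generic scaled dot-product sketch (NumPy, single head for brevity); SAC's actual multi-head architecture and trained representations are not reproduced here, and the vectors below are random stand-ins.

```python
import numpy as np

def attention(query, keys, values):
    """Scaled dot-product attention: weight reference vectors by their
    relevance to the claim vector (a generic sketch, not SAC itself)."""
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)      # relevance of each reference
    weights = np.exp(scores - scores.max())
    weights = weights / weights.sum()       # softmax over references
    return weights @ values, weights

rng = np.random.default_rng(0)
claim = rng.normal(size=4)                  # claim representation (toy)
refs = rng.normal(size=(3, 4))              # three reference snippets (toy)
context, w = attention(claim, refs, refs)
```

The softmax weights show which reference contributes most to the claim-conditioned context vector, which is the "focus on the important parts" intuition described above.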
Procedia PDF Downloads 60
6931 Deconstructing Local Area Networks Using MaatPeace
Authors: Gerald Todd
Abstract:
Recent advances in random epistemologies and ubiquitous theory have paved the way for web services. Given the current status of linear-time communication, cyberinformaticians compellingly desire the exploration of link-level acknowledgements. In order to realize this purpose, we concentrate our efforts on disconfirming that DHTs and model checking are mostly incompatible.
Keywords: LAN, cyberinformatics, model checking, communication
Procedia PDF Downloads 400
6930 Research and Development of Intelligent Cooling Channels Design System
Authors: Q. Niu, X. H. Zhou, W. Liu
Abstract:
The cooling channels of an injection mould play a crucial role in determining the productivity of the moulding process and the quality of the product, and designing high-quality cooling channels is not a simple task. In this paper, an intelligent cooling-channel design system is studied, covering automatic layout of cooling channels, interference checking and assembly of accessories. Automatic layout of cooling channels using a genetic algorithm is analyzed; by integrating empirical criteria for designing cooling channels and considering factors such as mould temperature and interference, the automatic layout is implemented. A method for checking interference based on a distance-constraint algorithm and a function for automatic, continuous assembly of accessories are developed and integrated into the system. Case studies demonstrate the feasibility and practicality of the intelligent design system.
Keywords: injection mould, cooling channel, intelligent design, automatic layout, interference checking
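A distance-constraint interference check can be sketched as follows. This is a hypothetical simplification: channel centerlines are treated as line segments, nearby geometry is sampled as points, and the clearance rule is assumed rather than taken from the paper.

```python
import math

def point_segment_distance(p, a, b):
    """Minimum distance from point p to the segment ab (3D tuples)."""
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    denom = sum(c * c for c in ab)
    t = 0.0 if denom == 0 else max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3)) / denom))
    closest = [a[i] + t * ab[i] for i in range(3)]
    return math.dist(p, closest)

def interferes(segment, samples, d_channel, clearance):
    """Flag interference when any sampled point of other geometry comes
    closer to the channel axis than one channel diameter plus the required
    clearance (both values are illustrative assumptions)."""
    limit = d_channel + clearance
    return any(point_segment_distance(p, *segment) < limit for p in samples)

a = ((0, 0, 0), (100, 0, 0))     # a channel centerline, in mm (toy)
close = [(50, 5, 0)]             # geometry 5 mm from the axis
far = [(50, 40, 0)]              # geometry 40 mm from the axis
```

With a 10 mm channel diameter and 5 mm clearance, the first sample violates the constraint while the second does not; a layout algorithm would reject or reroute candidates that trigger the flag.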
Procedia PDF Downloads 437
6929 A Formal Verification Approach for Linux Kernel Designing
Authors: Zi Wang, Xinlei He, Jianghua Lv, Yuqing Lan
Abstract:
The kernel, though widely used, is complicated, and errors caused by its bugs are often costly. Statistically, more than half of the mistakes occur in the design phase. Thus, we introduce a modeling method, KMVM (Linux Kernel Modeling and Verification Method), based on type theory, for the proper design and correct exploitation of the kernel. In the model, the kernel is separated into six levels: subsystem, dentry, file, struct, func, and base. Each level is treated as a type, and the types are specified by their structure and relationships. At the same time, we use a demanding path to express the function to be implemented. The correctness of the design is verified by recursively checking the type relationships and type existence. The method has been applied to verify the OPEN operation of the VFS (virtual file system) in the Linux kernel. We have also designed and developed a set of verified security communication mechanisms in the kernel.
Keywords: formal approach, type theory, Linux Kernel, software program
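The recursive type-relationship check can be caricatured in a few lines. The six level names come from the abstract, but treating them as a strict hierarchy in which every child must sit at a deeper level is an assumption made purely for illustration, not KMVM's actual typing rules.

```python
# Hypothetical sketch of recursive type-relationship checking over the six
# KMVM levels named in the abstract; the ordering rule itself is assumed.
LEVELS = ["subsystem", "dentry", "file", "struct", "func", "base"]
RANK = {name: i for i, name in enumerate(LEVELS)}

def well_typed(node):
    """A node is (level, children). It is well-typed if its level exists and
    every child sits at a strictly deeper level, checked recursively."""
    level, children = node
    if level not in RANK:
        return False
    return all(c[0] in RANK and RANK[c[0]] > RANK[level] and well_typed(c)
               for c in children)

ok = well_typed(("subsystem", [("file", [("func", [])])]))
bad = well_typed(("func", [("file", [])]))   # child at a shallower level
```

The first design passes because each nesting step moves to a deeper level; the second is rejected, the analogue of a type-relationship violation found during verification.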
Procedia PDF Downloads 134
6928 Fact-checking and Political Polarization in an Emerging Democracy
Authors: Eric Agyekum, Dominic Asitanga
Abstract:
Ghana is widely considered a beacon of democracy in sub-Saharan Africa. With a relatively free media, the country was ranked 30th in the world and third in Africa on the 2021 Press Freedom Index. Despite these democratic gains, it is one of the most politically polarized nations in the world. Ghana’s political division is evident in the current hung legislature, where each of the two dominant political parties has 137 members, with an independent member occupying the remaining seat. Misinformation and fake news thrive in systems with acute ideological and political differences (Imelda et al., 2021; Azzimonti & Fernandes, 2018; Spohr, 2017), and Ghana is no exception. The information disorder problem has been exacerbated by the COVID-19 pandemic, with its attendant conspiracy theories and speculation, making it difficult for the media and fact-checking organizations to verify all claims and flag false information. In Ghana, fact-checking agencies like Ghana Fact, Dubawa Ghana, and some mainstream news media organizations have been fact-checking political claims, COVID-19 conspiracy theories, and many others. However, it is not clear whether the audience consume and attach prominence to these fact-checked stories, or even visit the websites of the fact-checking agencies to read the content. Nekmat (2020) opines that although the literature on fact-checking suggests fact-checked stories can alter readers’ beliefs, very few studies have investigated their patronage and the potential of fact-checks to deter users from sharing false news with others, particularly on social media. In response to Nekmat, this study examines the perception and attitude of the audience in Ghana towards fact-checks. Anchored in the principles of nudge theory, it will investigate how fact-checked stories alter readers’ behavioural patterns.
A survey will be conducted to collect data from sampled members of Ghanaian society.
Keywords: fact-checking, information disorder, nudge theory, political polarization
Procedia PDF Downloads 138
6927 Failure Analysis and Verification Using an Integrated Method for Automotive Electric/Electronic Systems
Authors: Lei Chen, Jian Jiao, Tingdi Zhao
Abstract:
Failures of automotive electric/electronic systems, which are universally considered safety-critical and software-intensive, may cause catastrophic accidents, and analyzing and verifying failures in such systems is a growing challenge as system complexity increases. Model checking is often employed for formal verification, ensuring that the system model conforms to specified safety properties: the system-level effects of failures are established, and their effects on system behavior are observed through the formal verification. A hazard analysis technique called Systems-Theoretic Process Analysis is capable of identifying design flaws which may cause potential failure hazards, including software and system design errors and unsafe interactions among multiple system components. This paper shows how model checking can be integrated with Systems-Theoretic Process Analysis to perform failure analysis and verification of automotive electric/electronic systems. As a result, safety requirements are optimized and failure propagation paths are found. Finally, an automotive electric/electronic system case study is used to verify the effectiveness and practicability of the method.
Keywords: failure analysis and verification, model checking, system-theoretic process analysis, automotive electric/electronic system
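At its core, the model-checking side of such an approach asks whether any state violating a safety property is reachable in the system model. A toy explicit-state sketch follows; the state names and transition relation are invented for illustration and have nothing to do with the paper's case study.

```python
from collections import deque

def unsafe_reachable(init, transitions, unsafe):
    """Explicit-state safety check: breadth-first search over the transition
    relation, reporting whether any unsafe state can be reached from init."""
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        if s in unsafe:
            return True
        for t in transitions.get(s, []):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return False

# Toy model of a brake-request signal; the "dropped" edge is the design flaw
trans = {"idle": ["request"], "request": ["braking", "dropped"], "braking": ["idle"]}
violates = unsafe_reachable("idle", trans, {"dropped"})
```

A real model checker works over vastly larger (often symbolic) state spaces and richer temporal properties, but the verdict has the same shape: a reachable unsafe state yields a counterexample trace, here the path idle → request → dropped.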
Procedia PDF Downloads 119
6926 Weight Comparison of Oil and Dry Type Distribution Transformers
Authors: Murat Toren, Mehmet Çelebi
Abstract:
Reducing the weight of transformers while maintaining good performance, reducing cost and increasing efficiency is important. Weight is one of the most significant factors in all electrical machines, and many transformer design parameters are therefore tied to weight calculations. This study presents a comparison of the weights of oil-type and dry-type transformers. Oil-type transformers dominate in industry, but dry-type transformers have become more widespread in recent years. MATLAB is typically used for computing transformer design parameters (rated voltages, core loss, etc.), alongside design in ANSYS Maxwell. In line with other studies, this study found that the design options for dry-type transformers are limited. The 50 kVA distribution transformers commonly used in industry are oil-type; here, equivalent oil-type and dry-type transformers are designed and compared in terms of weight. The current preference for low-cost oil-type transformers would change if the cost of dry-type transformers were more competitive. The aim of this study was to compare the weight of transformers, which is a substantial cost factor, and to provide an evaluation of increasing the use of dry-type transformers.
Keywords: weight, optimization, oil-type transformers, dry-type transformers
Procedia PDF Downloads 351
6925 A Comparative Study of Behavioral Imbalances in Children with and without Fathers Aged 7 to 11 in Rasht
Authors: Farnoush Haghanipour
Abstract:
Objective: The loss of a father, as a major stress factor, can cause mental imbalances in children, and the family situation of children without a father clearly differs from that of children with one. The goal of this research is a comparative examination of mental imbalances, overall and in five subsidiary categories (aggression, stress and depression, social incompatibility, antisocial behavior, and attention-deficit imbalance (hyperactivity)), between children without fathers and children with fathers. Method: This descriptive-analytical study examines mental imbalances in 50 children who are students in one zone of the Rasht education and nurture office. The instrument of this research is the Rutter behavior questionnaire (teacher form), and data were analyzed with SPSS software. Results: The results showed a clear difference in behavioral imbalances between children with and without fathers, with more behavioral imbalance among children without fathers. There were also clear differences in aggression, stress and depression, and social incompatibility, with higher proportions among children without fathers; however, there was no clear difference between the groups in antisocial behavior or attention-deficit imbalance. Conclusion: Given the higher rate of imbalanced behavior detected in children without fathers compared with children with fathers, it is essential that practitioners of public health and treatment work on primary and secondary prevention for the mental health of this group of society.
Keywords: child, behave imbalances, children without father, mental imbalances
Procedia PDF Downloads 255
6924 A Paradigm for Characterization and Checking of a Human Noise Behavior
Authors: Himanshu Dehra
Abstract:
This paper presents a paradigm for characterizing and checking human noise behavior. Definitions of ‘noise’ and ‘noise behavior’ are devised, and the concept of characterizing and examining noise behavior is derived from the proposed paradigm of psychoacoustics. The measurement of human noise behavior is discussed through definitions of noise sources and noise measurements; the noise sources, noise measurement equations and noise filters are further illustrated through examples. The theory and significance of solar energy acoustics is presented for life and its activities. Human comfort and health are correlated with the human brain through physiological responses and noise protection, and examples of heat stress, intense heat, sweating and evaporation are also enumerated.
Keywords: human brain, noise behavior, noise characterization, noise filters, physiological responses, psychoacoustics
Procedia PDF Downloads 506
6923 Multiple-Lump-Type Solutions of the 2D Toda Equation
Authors: Jian-Ping Yu, Wen-Xiu Ma, Yong-Li Sun, Chaudry Masood Khalique
Abstract:
In this paper, a 2D Toda equation is studied, which is a classical integrable system and plays a vital role in mathematics, physics and other areas. New lump-type solutions are constructed using the Hirota bilinear method. One interesting feature of this research is that these lump-type solutions possess two types of multiple-lump-type waves, namely one- and two-lump-type waves. Moreover, the corresponding 3D plots, density plots and contour plots are given to show the dynamical features of the obtained multiple-lump-type solutions.
Keywords: 2D Toda equation, Hirota bilinear method, lump-type solution, multiple-lump-type solution
Procedia PDF Downloads 220
6922 Modernization of the Economic Price Adjustment Software
Authors: Roger L. Goodwin
Abstract:
The US Consumer Price Indices (CPIs) measure hundreds of items in the US economy, and many social programs and government benefits are indexed to the CPIs. In the mid-to-late 1990s, a Congressional Advisory Committee conducted extensive research into changes to the CPI. One conclusion of that research is that, beyond the existence of alternative estimators for the CPI, any fundamental change to the CPI will affect many government programs. The purpose of this project is to modernize an existing process. This paper shows the development of a small, visual software product that documents the Economic Price Adjustment (EPA) for long-term contracts. The existing workbook does not provide the flexibility to calculate EPAs where the base month and the option month differ, nor does it provide automated error checking; the small, visual software product provides both the additional flexibility and the error checking. This paper also presents feedback on the project.
Keywords: Consumer Price Index, Economic Price Adjustment, contracts, visualization tools, database, reports, forms, event procedures
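The core calculation such a tool automates can be sketched as an index-ratio adjustment. The formula below is the generic EPA pattern with illustrative index values; the contract-specific clauses, caps, and error checks of the actual product are not reproduced.

```python
def economic_price_adjustment(base_price, index, base_month, option_month):
    """Adjust a contract price by the ratio of a price index between the
    option month and the base month (a generic EPA sketch, not the
    workbook's actual clauses)."""
    return round(base_price * index[option_month] / index[base_month], 2)

# Illustrative, made-up index values keyed by month
cpi = {"2023-01": 299.2, "2023-07": 305.7, "2024-01": 308.4}
adjusted = economic_price_adjustment(1000.00, cpi, "2023-01", "2024-01")
```

Allowing base_month and option_month to be chosen independently is exactly the flexibility the abstract says the legacy workbook lacks; when the two months coincide, the adjustment is the identity.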
Procedia PDF Downloads 317
6921 CSRFDtool: Automated Detection and Prevention of a Reflected Cross-Site Request Forgery
Authors: Alaa A. Almarzuki, Nora A. Farraj, Aisha M. Alshiky, Omar A. Batarfi
Abstract:
The number of internet users increases dramatically every year, and most of these users are exposed to the dangers of attackers in one way or another. The reason lies in the many weaknesses that are not known to ordinary users, and the lack of user awareness is considered the main reason for falling into attackers’ snares. Cross-Site Request Forgery (CSRF) was placed on the list of the most dangerous threats to security in the OWASP Top Ten for 2013. CSRF is an attack that forces the user’s browser to send or perform an unwanted request or action without the user's awareness, by exploiting a valid session between the browser and the server. When a CSRF attack succeeds, it leads to many bad consequences; an attacker may reach private and personal information and modify it. This paper aims to detect and prevent a specific type of CSRF, called reflected CSRF, in which a malicious code could be injected by the attackers. This paper explores how the CSRF Detection Extension prevents reflected CSRF by checking browser-specific information. Our evaluation shows that the proposed solution succeeds in preventing this type of attack.
Keywords: CSRF, CSRF detection extension, attackers, attacks
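As a rough illustration of checking browser-supplied information, the sketch below flags state-changing requests whose Origin/Referer header does not match a trusted host. This is a generic server-side analogue of the idea, not the extension's actual browser-side logic, and the trusted host is hypothetical.

```python
from urllib.parse import urlparse

TRUSTED = {"example.com"}  # hypothetical trusted origin for this sketch

def looks_like_csrf(headers):
    """Flag a state-changing request whose Origin/Referer does not match a
    trusted host -- one common provenance check for CSRF, sketched
    server-side for illustration."""
    source = headers.get("Origin") or headers.get("Referer")
    if not source:
        return True  # conservative: no provenance information at all
    return urlparse(source).hostname not in TRUSTED

ok = looks_like_csrf({"Origin": "https://example.com"})
bad = looks_like_csrf({"Referer": "https://evil.test/page"})
```

Real defenses layer such provenance checks with anti-CSRF tokens and SameSite cookies; the point here is only that a forged cross-site request carries tell-tale browser information that a checker can inspect.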
Procedia PDF Downloads 412
6920 Exploring the Non-Verbalizable in Conservation Grazing: The Contradictions Illuminated by a ‘Go-Along’ Methodology
Authors: James Ormrod
Abstract:
This paper is concerned with volunteer livestock checking. Based on a pilot study consisting of ‘go-along’ interviews with livestock checkers, it argues that there are limitations to the insights that can be generated from approaches to ‘discourse analysis’ that would focus only on the verbalizable aspects of the practice. Volunteer livestock checking takes place across Europe as part of conservation projects aimed at maintaining particular habitats through the reintroduction of grazing animals. Volunteers are variously called ‘urban shepherds’, because these practices often take place on urban fringes, or ‘lookerers’, as their role is to make visual checks on the animals. Pilot research that took place on the South Downs (a chalk downland habitat on the South Coast of the UK) involved researchers accompanying volunteers as they checked on livestock. They were asked to give an account of what they were doing and then answer semi-structured interview questions. Participants drew on popular discourses on conservation and biodiversity, as framed by the local council who run the programme. They also framed their relationships to the animals in respect to the more formal limitations of their role as identified through the conservation programme. And yet these discourses, significant as they are, do not adequately explain why volunteers are drawn to, and emotionally invested in, lookering. The methodology employed allowed participants instead to gesture to features of the landscape and to recall memories, and for the researchers to see how volunteers interacted with the animals and the landscape in embodied and emotionally loaded ways. The paper argues that a psychosocial perspective that pays attention to the contradictions and tensions made visible through this methodology helps develop a fuller understanding of volunteer livestock checking as a social practice.
Keywords: conservation, human-animal relations, lookering, volunteering
Procedia PDF Downloads 132
6919 Information Extraction for Short-Answer Question for the University of the Cordilleras
Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo
Abstract:
Checking short-answer questions and essays, whether on paper or in electronic form, is a tiring and tedious task for teachers: evaluating a student’s output requires knowledge across a wide array of domains, and scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing-assessment software, but they have received negative feedback from teachers and students alike due to unreliable scoring, lack of feedback, and other issues. This study aims to create an application that can check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) in which a chunk of text (technically known as unstructured text) is broken down to gather necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system extracts keywords or phrases from an individual’s answer and matches them against a corpus of words (as defined by the instructor), which is the basis for evaluating the answer. The proposed system also enables the teacher to provide feedback and re-evaluate the student's output for writing elements that the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, the teacher's time spent checking and evaluating student output is reduced, making the teacher more productive and the work easier.
Keywords: information extraction, short-answer question, natural language processing, application
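The keyword-matching and weighting step might look like the following minimal sketch. The scoring scheme, example weights, and pass threshold are assumptions for illustration; the actual system's extraction pipeline and teacher-feedback features are not modeled.

```python
import re

def score_answer(answer, keyword_weights, threshold=0.6):
    """Score a short answer by summing the weights of instructor-defined
    keywords/phrases found in it (a simplified sketch of the matching step,
    not the full NLP pipeline described in the abstract)."""
    text = answer.lower()
    total = sum(keyword_weights.values())
    earned = sum(w for kw, w in keyword_weights.items()
                 if re.search(r"\b" + re.escape(kw.lower()) + r"\b", text))
    ratio = earned / total if total else 0.0
    return ratio, ratio >= threshold

# Hypothetical instructor-defined keywords with weights
weights = {"photosynthesis": 2, "chlorophyll": 1, "sunlight": 1}
ratio, passed = score_answer(
    "Plants use sunlight and chlorophyll in photosynthesis.", weights)
```

The weighted ratio gives a provisional score that the teacher can then adjust for elements such as creativity and logic, as the abstract proposes.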
Procedia PDF Downloads 426
6918 Comparison of Reserve Strength Ratio and Capacity Curve Parameters of Offshore Platforms with Distinct Bracing Arrangements
Authors: Aran Dezhban, Hooshang Dolatshahi Pirooz
Abstract:
The phenomenon of corrosion, especially in the Persian Gulf region with its highly corrosive waters, is the main cause of the deterioration of offshore platforms. It occurs mostly in the splash zone, threatening the first-level jacket members, legs, and piles in this area. The current study investigates the effect of bracing arrangement on the capacity curve and reserve strength ratio of fixed-type offshore platforms. To assess continued operation of the platform, both robust and damaged structural states are considered, and the adequacy of the platform capacity is checked against the allowable values of the API RP-2SIM recommended practice. The platform in question is located in the Persian Gulf and is modeled in the OpenSEES software. Nonlinear pushover analysis is used in this research: after validation, the capacity curves of the studied platforms are obtained, their reserve strength ratios are calculated, and the results are compared with the criteria in API RP-2SIM.
Keywords: fixed-type jacket structure, structural integrity management, nonlinear pushover analysis, robust and damaged structure, reserve strength ratio, capacity curve
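For reference, the reserve strength ratio reduces to a quotient of ultimate pushover capacity to the reference design load. The loads and the acceptance limit below are illustrative placeholders, not values from API RP-2SIM or the studied platform.

```python
def reserve_strength_ratio(collapse_load, design_load):
    """RSR as commonly defined for fixed platforms: ultimate (pushover)
    base-shear capacity divided by the reference design load."""
    return collapse_load / design_load

def acceptable(rsr, limit=1.6):
    """Compare against an assumed placeholder limit, not an API value."""
    return rsr >= limit

# Illustrative base shears in kN for robust vs. damaged states
rsr_robust = reserve_strength_ratio(collapse_load=9200.0, design_load=4000.0)
rsr_damaged = reserve_strength_ratio(collapse_load=5600.0, design_load=4000.0)
```

Comparing the two states this way shows how damage (here, a ~40% capacity loss) can pull a platform below an acceptance threshold even when the intact structure passes comfortably.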
Procedia PDF Downloads 111
6917 Background Check System for Turkish IT Companies
Authors: Arzu Baloglu, Ugur Kaplancali
Abstract:
This paper focuses on background-check systems and pre-employment screening. In our study, we attempted to build an online background-checking site that helps employers when hiring employees. The site has two types of users: free users and powered users. Free users are the employees, and powered users are the employers who will hire them. The site's database contains all the information about the employees and employers registered in the system, so employers can search by their own criteria to find a suitable employee for a job. The site also has a comments and points system: a current employer can comment on and award points to his or her employees, and the comments are shown on the employee's profile, so when an employer searches for an employee, he or she can check the employee's points and comments to judge whether the candidate is capable of the job. Employers can also follow particular employees if they wish. The system was designed and implemented using ASP.NET, C# and JavaScript, with a user-friendly interface aimed at providing useful information to Turkish technology companies.
Keywords: background, checking, verification, human resources, online
Procedia PDF Downloads 197
6916 Investigating the Relationship Between the Auditor’s Personality Type and the Quality of Financial Reporting in Companies Listed on the Tehran Stock Exchange
Authors: Seyedmohsen Mortazavi
Abstract:
The purpose of this research is to investigate the effect of internal auditors' personality types on the quality of financial reporting in companies listed on the Tehran Stock Exchange. Personality type is one of the topics emphasized in the study of auditors' behavior, a field that has attracted the attention of shareholders and stock companies because an auditor's personality can affect the type and quality of financial reporting. The research is applied in terms of purpose, and descriptive and correlational in terms of method; a researcher-made questionnaire was used to test the research hypotheses. The statistical population comprises all the auditors, accountants and financial managers of the companies listed on the Tehran Stock Exchange; because of their large number and the uncertainty about their exact number, 384 people were taken as the statistical sample using Morgan's table. The researcher-made questionnaire was approved by experts in the field, and its validity and reliability were then assessed with software. For validity, confirmatory factor analysis was examined first, followed by divergent and convergent validity; the Fornell-Larcker criterion and the cross-loadings test confirmed the validity of the questionnaire. The reliability of the questionnaire was then examined using Cronbach's alpha and composite reliability, and the results of these two tests showed appropriate reliability. After checking validity and reliability, PLS software was used to test the research hypotheses.
The results of the research showed that the personalities of internal auditors can affect the quality of financial reporting. The personality traits investigated in this research are neuroticism, extroversion, flexibility, agreeableness and conscientiousness, and all of them can affect the quality of financial reporting.
Keywords: flexibility, quality of financial reporting, agreeableness, conscientiousness
Procedia PDF Downloads 100
6915 The Control of Type 2 Diabetes with Specific References to Dietary Factors
Authors: Reham Algheshairy
Abstract:
The purpose of this research study is to identify the beneficial effects of Nigella sativa seeds, cherries and Ajwah dates on blood glucose levels among people with type 2 diabetes in the KSA population and healthy people in the UK. My hypothesis questions whether or not people with type 2 diabetes can lead a healthier life using these dietary supplements.
Keywords: diabetes type 2, cherry, nigella seeds, Ajwa date
Procedia PDF Downloads 468
6914 On Fourier Type Integral Transform for a Class of Generalized Quotients
Authors: A. S. Issa, S. K. Q. AL-Omari
Abstract:
In this paper, we investigate certain spaces of generalized functions for the Fourier and Fourier type integral transforms. We discuss convolution theorems and establish certain spaces of distributions for the considered integrals. The new Fourier type integral is well-defined, linear, one-to-one and continuous with respect to certain types of convergences. Many properties and an inverse problem are also discussed in some detail.
Keywords: Boehmian, Fourier integral, Fourier type integral, generalized quotient
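For orientation, the classical result being generalized is the convolution theorem, which under the standard transform normalization (chosen here as an assumption, since the paper may use another) reads:

```latex
\hat{f}(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt, \qquad
(f * g)(t) = \int_{-\infty}^{\infty} f(\tau)\, g(t-\tau)\, d\tau, \qquad
\widehat{f * g}(\omega) = \hat{f}(\omega)\,\hat{g}(\omega).
```

The generalized-quotient (Boehmian) construction extends the transform to objects for which these integrals need not converge classically, while preserving this product rule.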
Procedia PDF Downloads 363
6913 Implant Operation Guiding Device for Dental Surgeons
Authors: Daniel Hyun
Abstract:
Dental implants are one of the top three reasons dentists are sued for malpractice; most claims involve implant complications, usually caused by the angle of the implant during surgery. At present, surgeons usually use a 3D-printed navigator customized for the patient's teeth, but these cannot be reused for other patients and take time to produce. Therefore, I made a guiding device to assist the surgeon during implant operations. The surgeon inputs the objective of the operation, and the device constantly checks whether the surgery is heading towards that objective within the set range, alerting the surgeon via an LED. We tested the prototypes' consistency and accuracy by checking the angle graphs, the average standard deviation, and the average change of the calculated angles; the accuracy of performance was acquired by running the device and checking its outputs. The first prototype used the accelerometer and gyroscope of an Arduino MPU6050 sensor, producing a fluctuating graph, an average standard deviation of 0.0295, an average change of 0.25, and 66.6% accuracy of performance. The second prototype used only the gyroscope and produced a stable graph, an average standard deviation of 0.0062, an average change of 0.075, and 100% accuracy of performance, indicating that the accelerometer degraded the functionality of the device. Using only the gyroscope allowed the device to measure the orientation of each axis without the axes affecting each other, and it increased the stability and accuracy of the measurements.
Keywords: implant, guide, accelerometer, gyroscope, handpiece
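The device's range check amounts to comparing the measured orientation against the surgeon's objective. A minimal sketch follows; the tolerance, angle values and LED states are assumed for illustration, and the actual Arduino sensor-reading code is not reproduced.

```python
def within_range(target_deg, measured_deg, tolerance_deg):
    """True when the measured drill angle stays within the surgeon's set
    tolerance of the objective angle (tolerance value is assumed)."""
    return abs(measured_deg - target_deg) <= tolerance_deg

def led_state(target_deg, measured_deg, tolerance_deg=2.0):
    """Summarize the LED feedback as a string; the real device would
    drive output pins instead."""
    return "green" if within_range(target_deg, measured_deg, tolerance_deg) else "red"

state_ok = led_state(30.0, 31.2)    # 1.2 degrees off target: within range
state_off = led_state(30.0, 27.4)   # 2.6 degrees off target: out of range
```

On the device this comparison runs in the sensor-polling loop, so the LED flips the moment the handpiece drifts outside the set range.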
Procedia PDF Downloads 41
6912 Flood Predicting in Karkheh River Basin Using Stochastic ARIMA Model
Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh
Abstract:
Floods have huge environmental and economic impacts; flood prediction therefore receives a great deal of attention. This study analysed the annual maximum streamflow (discharge, AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, we used the Box-Jenkins approach, which comprises a four-stage method: model identification, parameter estimation, diagnostic checking and forecasting (prediction). The main tools used in ARIMA modelling were the SAS and SPSS software packages. Model identification was done by visual inspection of the ACF and PACF. SAS computed the model parameters using the ML, CLS and ULS methods. The diagnostic checking tests, the AIC criterion and the RACF and RPACF graphs, were used to verify the selected model. In this study, the best ARIMA model for the annual maximum discharge (AMD) time series was (4,1,1), with an AIC value of 88.87. The RACF and RPACF showed that the residuals were independent. The model was then used to forecast AMD for 10 future years, demonstrating its ability to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R2).
Keywords: time series modelling, stochastic processes, ARIMA model, Karkheh river
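To make the estimation stage concrete, here is a minimal sketch of the conditional least squares (CLS) idea mentioned above, worked out for the simplest case of a zero-mean AR(1) model, together with first-order differencing (the "I" in ARIMA(p,1,q)) and the AIC criterion. This is illustrative only: the study fitted an ARIMA(4,1,1) model with SAS/SPSS, not this toy code.

```python
# Sketch of three ingredients of Box-Jenkins fitting: differencing,
# CLS estimation of an AR(1) coefficient, and the AIC model criterion.
import math

def difference(series):
    """First-order differencing, the 'I' step of ARIMA(p,1,q)."""
    return [b - a for a, b in zip(series, series[1:])]

def fit_ar1_cls(series):
    """Conditional least squares estimate of phi in y_t = phi*y_{t-1} + e_t.

    CLS minimizes the sum of squared one-step prediction errors, which for
    AR(1) has the closed form below.
    """
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(x * x for x in series[:-1])
    return num / den

def aic(rss, n, k):
    """Gaussian AIC from residual sum of squares rss, n points, k parameters."""
    return n * math.log(rss / n) + 2 * k
```

On a noise-free AR(1) sequence such as 1, 0.5, 0.25, 0.125 the CLS estimate recovers phi = 0.5 exactly; with real streamflow data the estimate is only approximate, and competing (p, d, q) orders are ranked by their AIC values as in the study.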
Procedia PDF Downloads 286
6911 Examining the Impact of Fake News on Mental Health of Residents in Jos Metropolis
Authors: Job Bapyibi Guyson, Bangripa Kefas
Abstract:
The advent of social media has no doubt provided platforms that facilitate the spread of fake news. The devastating impact of this does not end with the prevalence of rumours and propaganda; it also poses a potential threat to individuals’ mental well-being. Therefore, this study on the impact of fake news on the mental health of residents in Jos metropolis interrogates, among other questions, the impact of exposure to fake news on residents' mental health. Anchored on Cultivation Theory, the study adopted a quantitative method and surveyed the opinions of two hundred (200) social media users in Jos metropolis using a purposive sampling technique. The findings reveal that a significant majority of respondents perceive fake news as highly prevalent on social media, with associated feelings of anxiety and stress. The majority of respondents express confidence in identifying fake news, though a notable proportion lack such confidence. Strategies for managing the mental impact of encountering fake news include ignoring it, fact-checking, discussing it with others, reporting it to platforms, and seeking professional support. Based on these insights, recommendations were proposed to address the challenges posed by fake news, including promoting media literacy, integrating fact-checking tools, adjusting algorithms and fostering digital well-being features.
Keywords: fake news, mental health, social media, impact
Procedia PDF Downloads 53
6910 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models
Authors: I. V. Pinto, M. R. Sooriyarachchi
Abstract:
Data arising in our environment frequently have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of orders 1 and 2 (MQL1, MQL2) and penalized quasi-likelihood of orders 1 and 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset; therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary-response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted using the four methods to a real-life dataset, and the performance of the test was compared for each model.
Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error
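The notion of "maintaining the desirable Type-I error" used above can be sketched as follows: under a correctly specified (true-null) model, a well-calibrated test should reject at roughly the nominal level, and the empirical rejection fraction over many simulated replicates estimates the actual Type-I error. The uniform p-values below stand in for the real multilevel GOF test, which requires the full MLwiN simulation described in the abstract.

```python
# Sketch of estimating a test's empirical Type-I error by simulation:
# generate replicates under the null, run the test, count rejections.
import random

def empirical_type1_error(p_values, alpha=0.05):
    """Fraction of replicates rejected at level alpha under a true null."""
    rejections = sum(1 for p in p_values if p < alpha)
    return rejections / len(p_values)

random.seed(0)
# For a well-calibrated test, null p-values are Uniform(0, 1), so the
# empirical rejection rate should sit near the nominal 5% level.
p_vals = [random.random() for _ in range(10_000)]
rate = empirical_type1_error(p_vals)
```

A test "fails" in the sense of the study when this empirical rate drifts well away from the nominal alpha, as was observed for the MQL-estimated models.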
Procedia PDF Downloads 142
6909 Effect of Perceived Importance of a Task in the Prospective Memory Task
Authors: Kazushige Wada, Mayuko Ueda
Abstract:
In the present study, we reanalyzed lapse errors in the last phase of a job by re-counting near lapse errors and increasing the number of participants. We also examined the results from the perspective of prospective memory (PM), which concerns future actions. The study was designed to investigate whether perceiving the importance of PM tasks caused lapse errors in the last phase of a job and to determine whether such errors could be explained from the perspective of PM processing. Participants (N = 34) performed a computerized clicking task, in which they clicked on 10 figures that they had learned in advance, in 8 blocks of 10 trials. Participants were requested to click the check box in the start display of a block and to click the check-off box in the finishing display; this check-off was the PM task. As a measure of PM performance, we counted the number of omission errors caused by forgetting to check off in the finishing display, which was defined as a lapse error. Perceived importance was manipulated through different instructions: half the participants, in the highly important task condition, were instructed that checking off was very important because equipment would be overloaded if it were not done; the other half, in the not important task condition, were instructed only about the location and procedure for checking off. Furthermore, we controlled workload and the emotion of surprise to examine the effects of demanded capacity and attention. To manipulate emotion during the clicking task, we suddenly presented a photo of a traffic accident and the sound of a skidding car followed by an explosion. Workload was manipulated by requesting participants to press the 0 key in response to a beep. The results yielded too few forgetting-induced lapse errors to be analyzed. However, there was a weak main effect of the perceived importance of the check task on near lapse errors, in which the mouse moved to the 'END' button before moving to the check box in the finishing display. In particular, the highly important task group showed more such near lapse errors than the not important task group. Neither surprise nor workload affected the occurrence of near lapse errors. These results imply that high perceived importance of PM tasks impairs task performance. On the basis of the multiprocess framework of PM theory, we suggest that PM task performance in this experiment relied not on monitoring the PM task, but on spontaneous retrieval.
Keywords: prospective memory, perceived importance, lapse errors, multiprocess framework of prospective memory
Procedia PDF Downloads 446
6908 From Type-I to Type-II Fuzzy System Modeling for Diagnosis of Hepatitis
Authors: Shahabeddin Sotudian, M. H. Fazel Zarandi, I. B. Turksen
Abstract:
Hepatitis is one of the most common and dangerous diseases affecting humankind, exposing millions of people to serious health risks every year. Diagnosis of hepatitis has always been a challenge for physicians. This paper presents an effective method for the diagnosis of hepatitis based on interval Type-II fuzzy logic. The proposed system comprises three steps: pre-processing (feature selection), Type-I and Type-II fuzzy classification, and system evaluation. KNN-FD feature selection is used as the pre-processing step in order to exclude irrelevant features and to improve classification performance and efficiency in generating the classification model. In the fuzzy classification step, an 'indirect approach' is used for fuzzy system modeling, implementing the exponential compactness and separation index to determine the number of rules in the fuzzy clustering approach. We first propose a Type-I fuzzy system, which achieved an accuracy of approximately 90.9%. In this system, the process of diagnosis faces vagueness and uncertainty in the final decision; the imprecise knowledge was therefore managed using interval Type-II fuzzy logic. The results obtained show that interval Type-II fuzzy logic can diagnose hepatitis with an average accuracy of 93.94%, the highest classification accuracy reached thus far. This rate of accuracy demonstrates that the Type-II fuzzy system performs better than the Type-I system and indicates the higher capability of Type-II fuzzy systems for modeling uncertainty.
Keywords: hepatitis disease, medical diagnosis, type-I fuzzy logic, type-II fuzzy logic, feature selection
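The difference between Type-I and interval Type-II memberships that the abstract turns on can be sketched briefly: a Type-I set assigns each input a single membership degree, while an interval Type-II set bounds that degree between a lower and an upper membership function, capturing uncertainty about the membership itself. The triangular shapes and the midpoint type reduction below are illustrative assumptions, not the paper's actual clustering-derived system.

```python
# Sketch of interval Type-II membership: each input maps to a membership
# interval bounded by a lower and an upper Type-1 membership function.

def triangular(x, a, b, c):
    """Type-1 triangular membership with peak at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def interval_t2_membership(x, lower_params, upper_params):
    """Return the (lower, upper) membership interval for input x."""
    lo = triangular(x, *lower_params)
    hi = triangular(x, *upper_params)
    return min(lo, hi), max(lo, hi)

def defuzzify(interval):
    """Simple type reduction: collapse the interval to its midpoint."""
    lo, hi = interval
    return (lo + hi) / 2.0
```

A Type-I system corresponds to the degenerate case where the lower and upper functions coincide and the interval collapses to a single degree.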
Procedia PDF Downloads 305
6907 An Unusual Cause of Electrocardiographic Artefact: Patient's Warming Blanket
Authors: Sanjay Dhiraaj, Puneet Goyal, Aditya Kapoor, Gaurav Misra
Abstract:
In electrocardiography, an artefact is a signal that does not originate from the heart. Although technological advancements have produced monitors capable of providing accurate information and reliable heart-rate alarms, interference with the displayed electrocardiogram still occurs. Such interference can come from the various electrical gadgets present in the operating room or from electrical signals generated in other parts of the body. Artefacts may also occur due to poor electrode contact with the body or due to machine malfunction. Recognizing these artefacts is of utmost importance in order to avoid unnecessary and unwarranted diagnostic as well as interventional procedures. We report a case of ECG artefacts caused by a patient-warming blanket and its consequences. A 20-year-old male with a preoperative diagnosis of exstrophy-epispadias complex was posted for surgery under epidural and general anaesthesia. Just after endotracheal intubation, we observed nonspecific ECG changes on the monitor. At first glance, the monitor strip revealed broad QRS complexes suggesting a ventricular bigeminal rhythm. Closer analysis revealed these to be artefacts: although the complexes appeared broad, normal sinus complexes were clearly present, each immediately followed by a 'broad complex', or artefact, produced by some device or connection. These broad complexes were labeled as artefacts because they originated in the absolute refractory period of the previous normal sinus beat; it would be physiologically impossible for the myocardium to depolarize rapidly enough to produce a second QRS complex.
A search for the possible cause of the artefacts was made. After deepening the plane of anaesthesia, ruling out any possible electrolyte abnormalities, checking the ECG leads and their connections, changing monitors, checking all other monitoring connections, and checking for proper grounding of the anaesthesia machine and OT table, we found that switching off the patient’s warming apparatus returned the rhythm to normal sinus and made the 'broad complexes', or artefacts, disappear. As misdiagnosis of ECG artefacts may subject patients to unnecessary diagnostic and therapeutic interventions, a thorough knowledge of the patient and the monitors allows for a quick interpretation and resolution of the problem.
Keywords: ECG artefacts, patient warming blanket, peri-operative arrhythmias, mobile messaging services
Procedia PDF Downloads 271
6906 Native Point Defects in ZnO
Authors: A. M. Gsiea, J. P. Goss, P. R. Briddon, Ramadan. M. Al-habashi, K. M. Etmimi, Khaled. A. S. Marghani
Abstract:
Using first-principles methods based on density functional theory and pseudopotentials, we have performed a detailed study of native defects in ZnO. Native point defects are unlikely to be the cause of the unintentional n-type conductivity. Oxygen vacancies, which have most often been invoked as shallow donors, have high formation energies in n-type ZnO and are, in addition, deep donors. Zinc interstitials are shallow donors with high formation energies in n-type ZnO, and thus unlikely to be responsible on their own for unintentional n-type conductivity under equilibrium conditions; the same holds for Zn antisites, which have even higher formation energies than zinc interstitials. Zinc vacancies are deep acceptors with low formation energies under n-type conditions, in which case they will not play a role in the p-type conductivity of ZnO. Oxygen interstitials are stable in the form of electrically inactive split interstitials, as well as deep acceptors at the octahedral interstitial site under n-type conditions. Our results may provide a guide to experimental studies of point defects in ZnO.
Keywords: DFT, native, n-type, ZnO
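The defect formation energies referred to above are conventionally computed within the supercell formalism; a standard expression for the formation energy of a defect X in charge state q (not spelled out in the abstract, so taken here as an assumed background formula) is:

```latex
% E_tot terms come from DFT supercell calculations; \mu_i are atomic
% chemical potentials, n_i the number of atoms of species i added
% (n_i > 0) or removed (n_i < 0) to form the defect, and E_F the Fermi
% level referenced to the valence-band maximum E_VBM.
E^{f}[X^{q}] = E_{\mathrm{tot}}[X^{q}]
  - E_{\mathrm{tot}}[\mathrm{ZnO,\ bulk}]
  - \sum_{i} n_{i}\,\mu_{i}
  + q\,(E_{F} + E_{\mathrm{VBM}})
```

The q-dependent term is what makes donor formation energies rise as E_F moves toward the conduction band, which underlies the n-type arguments in the abstract.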
Procedia PDF Downloads 592
6905 A Fuzzy Nonlinear Regression Model for Interval Type-2 Fuzzy Sets
Authors: O. Poleshchuk, E. Komarov
Abstract:
This paper presents a regression model for interval type-2 fuzzy sets based on the least squares estimation technique. The unknown coefficients are assumed to be triangular fuzzy numbers. The basic idea is to determine aggregation intervals for type-1 fuzzy sets whose membership functions are the lower and upper membership functions of an interval type-2 fuzzy set; these aggregation intervals are called weighted intervals. The lower and upper membership functions of the input and output interval type-2 fuzzy sets in the developed regression models are taken to be piecewise linear functions.
Keywords: interval type-2 fuzzy sets, fuzzy regression, weighted interval
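The least-squares step at the heart of such models can be sketched in a heavily simplified form: interval-valued observations are reduced to crisp representative values (here, midpoints, a stand-in for the paper's weighted intervals), and ordinary least squares is run on those. This is an illustration of the estimation technique only, not the paper's actual model with fuzzy coefficients.

```python
# Sketch: reduce interval observations to midpoints, then fit y = a + b*x
# by closed-form ordinary least squares.

def midpoint(interval):
    """Crisp representative of an interval (a simplification of the
    paper's weighted intervals)."""
    lo, hi = interval
    return (lo + hi) / 2.0

def least_squares_line(xs, ys):
    """Closed-form OLS estimates (a, b) for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Interval-valued observations reduced to midpoints before fitting.
x_ints = [(0.0, 2.0), (1.0, 3.0), (2.0, 4.0), (3.0, 5.0)]
y_ints = [(1.0, 3.0), (3.0, 5.0), (5.0, 7.0), (7.0, 9.0)]
xs = [midpoint(i) for i in x_ints]
ys = [midpoint(i) for i in y_ints]
a, b = least_squares_line(xs, ys)
```

In the full model the coefficients themselves are triangular fuzzy numbers and the fitting is carried out over the weighted intervals rather than single midpoints.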
Procedia PDF Downloads 372