Search results for: lexical errors
537 Velocity Logs Error Reduction for In-Service Calibration of Vessel Performance Indicators
Authors: Maria Tsompanoglou, Dimitris Armenis
Abstract:
Vessel behavior in different operational and weather conditions constitutes the main area of interest for the ship operator. Ship speed and fuel consumption are the most decisive parameters in this respect, as their correlation provides information about the economic and environmental efficiency of the vessel, becoming the basis of decision making in terms of maintenance and trading. In the analysis of the vessel operational profile for the evaluation of fuel consumption and the equivalent CO2 emissions footprint, the indications of Speed Through Water are widely used. The seasonal and regional variations in seawater characteristics, which are available nowadays, can provide the basis for an accurate estimation of the errors in Speed Through Water indications at any time. Accuracy in the speed value on a route basis can enable the operator to identify the ship's fuel and propulsion efficiency and proceed with improvements. This paper discusses case studies in which the actual vessel speed was corrected by a post-processing algorithm. The effects of the speed correction on standard Key Performance Indicators, as well as operational findings not identified earlier, are also discussed.
Keywords: data analytics, MATLAB, vessel performance monitoring, speed through water
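The post-processing correction itself is not detailed in the abstract. As a minimal illustrative sketch (not the authors' algorithm), logged Speed Through Water can be calibrated against a reference built from speed over ground and an estimated along-track current; the least-squares factor and all variable names below are assumptions.

```python
import numpy as np

def calibrate_stw(stw_log, sog, current_along_track):
    """Fit a multiplicative correction factor for logged speed-through-water.

    The reference 'true' STW is approximated as speed-over-ground minus the
    along-track current component (an assumption for this sketch).
    """
    reference = sog - current_along_track
    k = np.dot(stw_log, reference) / np.dot(stw_log, stw_log)  # least squares
    return k, k * stw_log

# Toy example: the speed log reads about 3% low.
rng = np.random.default_rng(0)
sog = rng.uniform(10, 15, 200)           # knots
current = rng.normal(0.0, 0.3, 200)      # knots, along track
stw_log = 0.97 * (sog - current) + rng.normal(0, 0.05, 200)
k, stw_corrected = calibrate_stw(stw_log, sog, current)
print(f"estimated correction factor: {k:.3f}")
```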
Procedia PDF Downloads 298
536 Problem Gambling in the Conceptualization of Health Professionals: A Qualitative Analysis of the Discourses Produced by Psychologists, Psychiatrists and General Practitioners
Authors: T. Marinaci, C. Venuleo
Abstract:
Different conceptualizations of disease affect patient care. This study aims to address this issue. It explores how health professionals conceptualize the gambling problem, addiction, and the goals of the recovery process. In-depth, semi-structured, open-ended interviews were conducted with Italian psychologists, psychiatrists, general practitioners, and support staff (N = 114) working within health centres for the treatment of addiction (public health services or therapeutic communities) or medical offices. A Lexical Correspondence Analysis (LCA) was applied to the verbatim transcripts. LCA made it possible to identify two main factorial dimensions, which organize similarity and dissimilarity in the discourses of the interviewees. The first dimension, labelled 'Models of relationship with the problem', concerns two different models of relationship with the health problem: one related to the request for help and the process of taking charge, and the other related to the identification of the psychopathology underlying the disorder. The second dimension, labelled 'Organisers of the intervention', reflects the dialectic between two ways to address the problem. On the one hand, the intervention is organized around the gambling dynamics and their immediate life consequences (whatever the request of the user is); on the other hand, it is organized around the procedures and tools that characterize the health service (whatever the problem is, and regardless of the specificity of the user's request). The results highlight how, despite the differences, the respondents share a central assumption: understanding the gambling problem implies a reference to the gambler's identity, more than, for instance, to the relational, social, cultural or political context where the gambler lives. A passive stance is attributed to the user, who does not play any role in the definition of the goal of the intervention. The results will be discussed to highlight the relationship between professional models and users' ways of understanding and dealing with the problems related to gambling.
Keywords: cultural models, health professionals, intervention models, problem gambling
Procedia PDF Downloads 154
535 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion
Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro
Abstract:
In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual effort, and difficulties in meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes all the above-mentioned issues and helps organizations improve efficiency and deliver faster without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while reducing the need for repetitive work and manual effort. Implementing scalable CI/CD for development using cloud services like ECS (Container Management Service), AWS Fargate, ECR (to store Docker images with all dependencies), Serverless Computing (serverless virtual machines), Cloud Log (for monitoring errors and logs), Security Groups (for inside/outside access to the application), Docker containerization (Docker-based images and container techniques), Jenkins (CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit) can efficiently handle the demands of diverse development environments and accommodate dynamic workloads, increasing efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application based on the branches, testing the application using a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure, as it scales based on need. Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application's scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Creating scalable automation testing using cloud services (ECR, ECS Fargate, Docker, EFS, Serverless Computing) helps organizations run more than 500 test cases in parallel, aiding in the detection of race conditions and performance issues and reducing execution time, as sketched below. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands, allowing teams to scale resources up or down as needed. It optimizes costs by paying only for the resources as they are used, and it increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.
Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment
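As a hedged sketch of the parallel-test fan-out described above, the following snippet launches test shards as Fargate tasks on ECS with boto3. The cluster, task definition, subnet, and container names are placeholders, and the environment-variable sharding scheme is an assumption rather than the authors' exact setup.

```python
import boto3

# Hypothetical names; replace with your own cluster/task definition.
CLUSTER = "scalable-ci-cluster"
TASK_DEF = "automation-tests:1"
SUBNETS = ["subnet-0123456789abcdef0"]

ecs = boto3.client("ecs")

def launch_test_shards(total_shards: int) -> list:
    """Fan out test shards as Fargate tasks; each task runs one shard."""
    arns = []
    for shard in range(total_shards):
        resp = ecs.run_task(
            cluster=CLUSTER,
            taskDefinition=TASK_DEF,
            launchType="FARGATE",
            count=1,
            networkConfiguration={"awsvpcConfiguration": {
                "subnets": SUBNETS, "assignPublicIp": "ENABLED"}},
            overrides={"containerOverrides": [{
                "name": "test-runner",  # container name in the task definition
                "environment": [
                    {"name": "SHARD_INDEX", "value": str(shard)},
                    {"name": "SHARD_TOTAL", "value": str(total_shards)},
                ],
            }]},
        )
        arns.extend(t["taskArn"] for t in resp["tasks"])
    return arns

if __name__ == "__main__":
    print(launch_test_shards(50))
```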
Procedia PDF Downloads 42
534 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process
Authors: Hong-Ming Chen
Abstract:
This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on the time t. The uncertainties of the Brownian motion and the jumps are represented by a piecewise constant function w(t) and a point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated on 3-month U.S. Treasury securities data and is used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that our approach essentially generates the yield functions with minimal fitting errors and small oscillation.
Keywords: optimization, interest rate model, jump process, deterministic
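The abstract does not give the model equations explicitly. For orientation, a standard jump-extended Hull-White short-rate dynamic can be written as below; note that the paper reuses the symbol θ(t) for the deterministic jump perturbation, whereas in the standard form θ(t) denotes the time-dependent drift.

```latex
% Hull--White short rate with a compound Poisson jump term (standard form;
% the paper substitutes deterministic perturbations for the stochastic drivers)
\[
  dr_t = \bigl(\theta(t) - a\,r_t\bigr)\,dt + \sigma\,dW_t + dJ_t,
  \qquad J_t = \sum_{i=1}^{N_t} Y_i,
\]
% with mean-reversion speed a, volatility \sigma, Brownian motion W_t,
% Poisson counting process N_t, and jump sizes Y_i.
```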
Procedia PDF Downloads 160
533 Argumentation Frameworks and Theories of Judging
Authors: Sonia Anand Knowlton
Abstract:
With the rise of artificial intelligence, computer science is becoming increasingly integrated into virtually every area of life. Of course, the law is no exception. Through argumentation frameworks (AFs), computer scientists have used abstract algebra to structure the legal reasoning process in a way that allows conclusions to be drawn from a formalized system of arguments. In AFs, arguments compete against each other for logical success and are related to one another through a binary attack relation. The prevailing arguments make up the preferred extension of the given argumentation framework, telling us what set of arguments must be accepted from a logical standpoint. There have been several developments of AFs since their original conception in the early 1990s, in efforts to make them more aligned with the human reasoning process. Generally, these developments have sought to add nuance to the factors that influence the logical success of competing arguments (e.g., giving an argument more logical strength based on the underlying value it promotes). The most cogent development was that of the Extended Argumentation Framework (EAF), in which attacks can themselves be attacked by other arguments, and the promotion of different competing values can be formalized within the system. This article applies the logical structure of EAFs to current theoretical understandings of judicial reasoning, in order to contribute simultaneously to theories of judging and to the evolution of AFs. The argument is that the main limitation of EAFs, when applied to judicial reasoning, is that they require judges to themselves assign values to different arguments and then lexically order these values to determine the given framework's preferred extension. Drawing on John Rawls' Theory of Justice, the examination that follows asks whether values are lexical and commensurable to this extent. The analysis then suggests a potential extension of the EAF system with an approach that formalizes different 'planes of attack' for competing arguments that promote lexically ordered values. This article concludes with a summary of how these insights contribute to theories of judging and of legal reasoning more broadly, specifically in indeterminate cases where judges must turn to value-based approaches.
Keywords: computer science, mathematics, law, legal theory, judging
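To make the basic machinery concrete, here is a minimal sketch of a classical Dung-style AF (without the EAF extensions discussed above): conflict-freeness, defense, admissibility, and preferred extensions computed by brute force over a toy three-argument framework.

```python
from itertools import combinations

# A toy abstract argumentation framework: arguments and attack relation.
ARGS = {"a", "b", "c"}
ATTACKS = {("a", "b"), ("b", "c")}  # a attacks b, b attacks c

def conflict_free(S):
    return not any((x, y) in ATTACKS for x in S for y in S)

def defends(S, arg):
    # S defends arg if every attacker of arg is attacked by some member of S.
    attackers = {x for (x, y) in ATTACKS if y == arg}
    return all(any((s, x) in ATTACKS for s in S) for x in attackers)

def admissible(S):
    return conflict_free(S) and all(defends(S, a) for a in S)

def preferred_extensions():
    admissibles = [set(c) for r in range(len(ARGS) + 1)
                   for c in combinations(sorted(ARGS), r) if admissible(set(c))]
    # Preferred extensions = maximal (w.r.t. set inclusion) admissible sets.
    return [S for S in admissibles if not any(S < T for T in admissibles)]

print(preferred_extensions())  # [{'a', 'c'}]: a defeats b, reinstating c
```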
Procedia PDF Downloads 59
532 Fault-Detection and Self-Stabilization Protocol for Wireless Sensor Networks
Authors: Ather Saeed, Arif Khan, Jeffrey Gosper
Abstract:
Sensor devices are prone to errors and sudden node failures, which are difficult to detect in a timely manner when deployed in real-time, hazardous, large-scale, harsh environments and in medical emergencies. Therefore, the loss of data can be life-threatening when the sensed phenomenon is not disseminated due to sudden node failure, battery depletion or temporary malfunctioning. We introduce a set of partial differential equations for localizing faults, similar to Green's and Maxwell's equations used in electrostatics and electromagnetism. We introduce a node organization and clustering scheme for self-stabilizing sensor networks. Green's theorem is applied to regions where the curve is closed and continuously differentiable to ensure network connectivity. Experimental results show that the proposed GTFD (Green's Theorem Fault-Detection and self-stabilization) protocol not only detects faulty nodes but also accurately generates network stability graphs showing where urgent intervention is required for dynamically self-stabilizing the network.
Keywords: Green's Theorem, self-stabilization, fault-localization, RSSI, WSN, clustering
Procedia PDF Downloads 74
531 The Contribution of Corpora to the Investigation of Cross-Linguistic Equivalence in Phraseology: A Contrastive Analysis of Russian and Italian Idioms
Authors: Federica Floridi
Abstract:
The long tradition of contrastive idiom research has essentially focused on three domains: the comparison of structural types of idioms (e.g. verbal idioms, idioms with noun-phrase structure, etc.), the description of idioms belonging to the same thematic groups (Sachgruppen), and the identification of different types of cross-linguistic equivalents (i.e. full equivalents, partial equivalents, phraseological parallels, non-equivalents). The diastratic, diachronic and diatopic aspects of the compared idioms, as well as their syntactic, pragmatic and semantic properties, have been rather ignored. Corpora (both monolingual and parallel) give the opportunity to investigate the actual use of correlating idioms in authentic texts of L1 and L2. Adopting the corpus-based approach, it is possible to draw attention to the frequency of occurrence of idioms, their syntactic embedding, their potential syntactic transformations (e.g., nominalization, passivization, relativization, etc.), their combinatorial possibilities, the variations of their lexical structure, and their connotations in terms of stylistic markedness or register. This paper aims to present the results of a contrastive analysis of Russian and Italian idioms referring to the concepts of 'beginning' and 'end', carried out by using the Russian National Corpus and the 'La Repubblica' corpus. Beyond the digital corpora, bilingual dictionaries, such as Skvorcova-Majzel', Dobrovol'skaja, Kovalev, and Čerdanceva, as well as monolingual resources, have been consulted. The study has shown that many of the idioms that have traditionally been indicated as cross-linguistic equivalents in bilingual dictionaries cannot be considered correspondents. The findings demonstrate that even those idioms that are formally identical in Russian and Italian and are presumably derived from the same source (e.g., conceptual metaphor, the Bible, classical mythology, world literature) exhibit differences regarding usage. The ultimate purpose of this article is to highlight that it is necessary to review and improve the existing bilingual dictionaries in light of the empirical data collected in corpora. The materials gathered in this research can contribute to this end.
Keywords: corpora, cross-linguistic equivalence, idioms, Italian, Russian
Procedia PDF Downloads 146
530 Incomplete Existing Algebra to Support Mathematical Computations
Authors: Ranjit Biswas
Abstract:
The existing subject of Algebra is incomplete for supporting the mathematical computations being done by scientists of all areas: Mathematics, Physics, Statistics, Chemistry, Space Science, Cosmology, etc., even starting from the era of the great Einstein. A huge hidden gap in the subject 'Algebra' is unearthed. All the scientists today, including mathematicians, physicists, chemists, statisticians, cosmologists, space scientists, and economists, even starting from the great Einstein, are lucky that they got results without facing any contradictions or computational errors. Most surprising is that the results of all scientists, including Nobel Prize winners, were also verified by them through experiments. But in this paper, it is rigorously justified that they all are lucky. An algebraist can define an infinite number of new algebraic structures. The objective of the work in this paper is not just to define a distinct algebraic structure, but to recognize and identify a major gap of the subject 'Algebra' lying hidden so far in its existing vast literature. The objective of this work is to fix the unearthed gap. Consequently, a different algebraic structure called 'Region' has been introduced, and its properties are studied.
Keywords: region, ROR, RORR, region algebra
Procedia PDF Downloads 50
529 A Peg Board with Photo-Reflectors to Detect Peg Insertion and Pull-Out Moments
Authors: Hiroshi Kinoshita, Yasuto Nakanishi, Ryuhei Okuno, Toshio Higashi
Abstract:
Various kinds of pegboards have been developed and used widely in rehabilitation research and clinics for the evaluation and training of patients' hand function. A common measure in these peg boards is the total time of performance execution, assessed with a tester's stopwatch. The introduction of electrical and automatic measurement technology to the apparatus, on the other hand, has been delayed. The present work introduces the development of a pegboard with electric sensors to detect the moments of each peg's insertion and removal. The work also gives fundamental data obtained from a group of healthy young individuals who performed peg transfer tasks using the pegboard developed. Through trial and error in pilot tests, two 10-hole peg-board boxes, each with a small photo-reflector and a DC amplifier installed at the bottom of every hole, were designed and built by the present authors. The amplified electric analogue signals from the 20 reflectors were automatically digitized at 500 Hz per channel and stored in a PC. The boxes were set on a test table in parallel at different distances (25, 50, 75, and 125 mm) to examine the effect of hole-to-hole distance. Fifty healthy young volunteers (25 of each gender) performed 80 successive fast peg transfers at each distance using their dominant and non-dominant hands. The data gathered showed a clear-cut light interruption/continuation moment by the pegs, allowing the pull-out and insertion times of each peg to be determined accurately (no tester's error involved) and precisely (to the order of milliseconds). This further permitted computation of individual peg movement duration (PMD: from peg lift-off to insertion) apart from hand reaching duration (HRD: from peg insertion to lift-off). An accidental drop of a peg led to an exceptionally long ( > mean + 3 SD) PMD, which was readily detected from an examination of the data distribution. The PMD data were commonly right-skewed, suggesting that the median can be a better estimate of individual PMD than the mean. Repeated measures ANOVA using the median values revealed significant hole-to-hole distance and hand dominance effects, suggesting that these need to be fixed in the accurate evaluation of PMD. The gender effect was non-significant. Performance consistency was also evaluated by the use of quartile variation coefficient values, which revealed no gender, hole-to-hole distance, or hand dominance effects. The measurement reliability was further examined using intraclass correlation obtained from 14 subjects who performed the 25 and 125 mm hole distance tasks at two test sessions 7-10 days apart. Intraclass correlation values between the two tests showed fair reliability for PMD (0.65-0.75) and for HRD (0.77-0.94). We concluded that the sensor peg board developed in the present study can provide accurate (excluding tester's errors) and precise (at a millisecond rate) time information on peg movement, separated from that used for hand movement. It can also easily detect and automatically exclude erroneous execution data from a subject's standard data. These features should lead to a better evaluation of hand dexterity function compared to the widely used conventional peg boards.
Keywords: hand, dexterity test, peg movement time, performance consistency
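A minimal sketch of the event-detection logic implied above, assuming a thresholded photo-reflector channel per hole, the 500 Hz sampling rate, and the mean + 3 SD outlier rule. The signal polarity and the pairing of consecutive pull-outs with insertions across holes are simplifications for illustration.

```python
import numpy as np

FS = 500  # sampling rate per channel (Hz), as in the study

def peg_events(signal, threshold):
    """Return (pull-out, insertion) sample indices from one photo-reflector.

    The reflector output is assumed high while a peg occupies the hole;
    invert the comparison for the opposite wiring.
    """
    occupied = signal > threshold
    edges = np.diff(occupied.astype(int))
    pull_outs = np.where(edges == -1)[0]   # peg leaves the hole
    insertions = np.where(edges == 1)[0]   # peg enters a hole
    return pull_outs, insertions

def peg_movement_durations(pull_outs, insertions):
    """PMD in seconds: lift-off in the source hole to insertion in the target."""
    n = min(len(pull_outs), len(insertions))
    return (insertions[:n] - pull_outs[:n]) / FS

def without_outliers(pmd):
    """Drop accidental peg drops: exceptionally long PMDs (> mean + 3 SD)."""
    return pmd[pmd <= pmd.mean() + 3 * pmd.std()]
```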
Procedia PDF Downloads 132
528 Variations of the Modal Characteristics of the Feeding Stage with Different Preloaded Linear Guide
Authors: Jui-Pui Hung, Yong-Run Chen, Wei-Cheng Shih, Chun-Wei Lin
Abstract:
This study aimed to assess the variations of the modal characteristics of the feeding stage with different linear guide modules. The dynamic characteristics of the feeding stage were characterized in terms of the modal stiffness, modal frequency and modal damping, which were assessed from vibration tests. According to the experimental measurements, the actual preload of the linear guide modules was found to deviate from the rated values as set at the factory. This may be due to assembly errors of the guide modules. For the stage with linear guides, the dynamic stiffness was changed by the preload set on the rolling balls. The variation of the dynamic stiffness at the first and second modes is 20.8% and 10.5%, respectively, when the linear guide preload is adjusted from medium to high. The modal damping ratio, however, is reduced by 8.97% and 9.65%, respectively. For the high-frequency mode, the modal stiffness increases by 171.2% and the damping ratio is reduced by 34.4%. The current results demonstrate the importance of determining the preload amount of linear guide modules in practical applications.
Keywords: contact stiffness, feeding stage, linear guides, modal characteristics, pre-load
Procedia PDF Downloads 428
527 Enhance Security in XML Databases: XLog File for Severity-Aware Trust-Based Access Control
Authors: A. Asmawi, L. S. Affendey, N. I. Udzir, R. Mahmod
Abstract:
The topic of enhancing security in XML databases is important, as it includes protecting sensitive data and providing a secure environment for users. In order to improve security and provide dynamic access control for XML databases, we present the XLog file, used to calculate user trust values by recording users' bad transactions, errors and query severities. Severity-aware trust-based access control for XML databases manages the access policy depending on users' trust values and prevents unauthorized processes, malicious transactions and insider threats. Privileges are automatically modified and adjusted over time depending on user behaviour and query severity. Logging in databases is an important process used for recovery and security purposes. In this paper, the XLog file is presented as a dynamic and temporary log file for XML databases to enhance the level of security.
Keywords: XML database, trust-based access control, severity-aware, trust values, log file
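The abstract does not specify the trust-update formula. The sketch below illustrates one plausible severity-aware scheme in which logged bad transactions lower a user's trust value, which is then checked against a policy threshold; the penalty scale and class names are assumptions, not the paper's actual rules.

```python
from dataclasses import dataclass, field

# Hypothetical severity weights; the paper derives severities from query
# analysis, and the exact scale is not given in the abstract.
SEVERITY_PENALTY = {"low": 0.02, "medium": 0.05, "high": 0.15}

@dataclass
class UserTrust:
    value: float = 1.0  # full trust initially
    history: list = field(default_factory=list)

    def record_bad_event(self, kind: str, severity: str) -> None:
        """Log a bad transaction/error and lower the trust value."""
        penalty = SEVERITY_PENALTY[severity]
        self.value = max(0.0, self.value - penalty)
        self.history.append((kind, severity, self.value))

    def allowed(self, required_trust: float) -> bool:
        """Severity-aware access check against the policy threshold."""
        return self.value >= required_trust

u = UserTrust()
u.record_bad_event("malicious transaction", "high")
print(u.value, u.allowed(0.9))  # 0.85 False
```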
Procedia PDF Downloads 299
526 A Comparison of Bias Among Relaxed Divisor Methods Using 3 Bias Measurements
Authors: Sumachaya Harnsukworapanich, Tetsuo Ichimori
Abstract:
The apportionment method is used by many countries to calculate the distribution of seats in political bodies. For example, this method is used in the United States (U.S.) to distribute House seats proportionally based on the population of the electoral district. Famous apportionment methods include the divisor methods called the Adams Method, Dean Method, Hill Method, Jefferson Method and Webster Method. Sometimes the results from the implementation of these divisor methods are unfair and include errors. Therefore, it is important to examine the optimization of these methods by using a bias measurement to obtain precise and fair results. In this research, we investigate the bias of divisor methods in the U.S. House of Representatives toward large and small states by applying the Stolarsky Mean Method. We compare the bias of the apportionment methods by using two famous bias measurements: the Balinski and Young measurement and the Ernst measurement. Both measurements have a formula for large and small states. The third measurement, however, which was created by the researchers, does not factor the distinction between large and small states into the formula. All three measurements are compared, and the results show that our measurement produces similar results to the other two famous measurements.
Keywords: apportionment, bias, divisor, fair, measurement
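A short sketch of the family of methods under comparison: a highest-averages (divisor) apportionment whose rounding signposts are Stolarsky means of n and n+1, so that p = 2 recovers the Webster method and p = -1 the Hill method. The implementation is illustrative and assumes p is not 0 or 1 (those values are limiting cases of the mean); the toy populations are made up.

```python
def stolarsky_signpost(n: int, p: float) -> float:
    """Stolarsky mean of n and n+1 (p=2 -> Webster, p=-1 -> Hill)."""
    if n == 0 and p <= 0:
        return 0.0  # limiting value; gives every state its first seat
    return (((n + 1) ** p - n ** p) / p) ** (1.0 / (p - 1.0))

def apportion(populations: dict, seats: int, p: float) -> dict:
    """Highest-averages apportionment with Stolarsky-mean signposts."""
    alloc = dict.fromkeys(populations, 0)
    for _ in range(seats):
        def priority(state):
            d = stolarsky_signpost(alloc[state], p)
            return float("inf") if d == 0 else populations[state] / d
        winner = max(populations, key=priority)
        alloc[winner] += 1
    return alloc

pops = {"A": 5310, "B": 2610, "C": 1080}
print(apportion(pops, 9, p=2))    # Webster-style rounding
print(apportion(pops, 9, p=-1))   # Hill-style rounding
```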
Procedia PDF Downloads 364
525 Lithuanian Sign Language Literature: Metaphors at the Phonological Level
Authors: Anželika Teresė
Abstract:
In order to solve issues in sign language linguistics, address matters pertaining to maintaining the high quality of sign language (SL) translation, contribute to dispelling misconceptions about SL and deaf people, and raise awareness and understanding of the deaf community heritage, this presentation discusses literature in Lithuanian Sign Language (LSL) and its inherent metaphors that are created by using the phonological parameters: handshape, location, movement, palm orientation and nonmanual features. The study covered in this presentation is twofold, involving both the micro-level analysis of metaphors in terms of phonological parameters as a sub-lexical feature and the macro-level analysis of the poetic context. Cognitive theories underlie research on metaphors in sign language literature in a range of SLs. The study follows this practice. The presentation covers the qualitative analysis of 34 pieces of LSL literature. The analysis employs the ELAN software widely used in SL research. The aim is to examine how specific types of each phonological parameter are used for the creation of metaphors in LSL literature and what metaphors are created. The results of the study show that LSL literature employs a range of metaphors created by using classifier signs and by modifying established signs. The study also reveals that LSL literature tends to create reference metaphors indicating status and power. As the study shows, LSL poets metaphorically encode status by encoding another meaning in the same sign, which results in double metaphors. A metaphor of identity has also been determined. Notably, the poetic context has revealed that the latter metaphor can also be identified as a metaphor for life. The study goes on to note that deaf poets create metaphors related to the significance of various phenomena for the lyrical subject. Notably, the study has allowed the detection of locations, nonmanual features, etc., never mentioned in previous SL research as being used for the creation of metaphors.
Keywords: Lithuanian sign language, sign language literature, sign language metaphor, metaphor at the phonological level, cognitive linguistics
Procedia PDF Downloads 134
524 Using Optimal Control Method to Investigate the Stability and Transparency of a Nonlinear Teleoperation System with Time Varying Delay
Authors: Abasali Amini, Alireza Mirbagheri, Amir Homayoun Jafari
Abstract:
In this paper, a new structure for teleoperation systems with time-varying delay has been modeled and proposed. A random time-varying delay of up to 150 ms is simulated in the teleoperation channel, both from master to slave and vice versa. The system stability and transparency have been investigated by comparing the results of a PID controller and an optimal controller on the master and slave subsystems separately. A controller has been designed in the slave subsystem to reduce position errors between master and slave, and another controller has been designed in the master subsystem to establish stability, transparency and force tracking. The results have then been compared. They showed that the PID controller is adequate for position tracking, but the force response oscillates in contact with the environment. We showed that the optimal controller established position tracking properly; force tracking is also achieved appropriately with this controller.
Keywords: optimal control, time varying delay, teleoperation systems, stability and transparency
Procedia PDF Downloads 254
523 An Image Segmentation Algorithm for Gradient Target Based on Mean-Shift and Dictionary Learning
Authors: Yanwen Li, Shuguo Xie
Abstract:
In electromagnetic imaging, because of the diffraction-limited system, the pixel values can change slowly near the edges of the image targets, and they also change with location within the same target. Using traditional digital image segmentation methods to segment electromagnetic gradient images can result in many errors because of this change in pixel values. To address this issue, this paper proposes a novel image segmentation and extraction algorithm based on Mean-Shift and dictionary learning. Firstly, the preliminary segmentation results from the adaptive-bandwidth Mean-Shift algorithm are expanded, merged and extracted. Then the overlap rate of the extracted image block is detected before determining a segmentation region with a single complete target. Last, the gradient edge of the extracted targets is recovered and reconstructed by using a dictionary-learning algorithm, and the final segmentation results, which are very close to the gradient target in the original image, are obtained. Both the experimental results and the simulated results show that the segmentation results are very accurate. The Dice coefficients are improved by 70% to 80% compared with the Mean-Shift-only method.
Keywords: gradient image, segmentation and extract, mean-shift algorithm, dictionary learning
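A condensed sketch of the first stage only (an adaptive-bandwidth Mean-Shift pre-segmentation) plus the Dice coefficient used in the evaluation, using scikit-learn. The feature construction and spatial weighting are illustrative assumptions, and the dictionary-learning refinement stage is omitted.

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

def mean_shift_segment(image):
    """Cluster pixels by (intensity, x, y) with an estimated bandwidth and
    return a label map: the preliminary segmentation that the paper then
    expands, merges, and refines with dictionary learning."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Weighting intensity vs. position is an illustrative choice here.
    features = np.column_stack([image.ravel() * 3.0,
                                xs.ravel() * 0.05, ys.ravel() * 0.05])
    bandwidth = estimate_bandwidth(features, quantile=0.2, n_samples=500)
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(features)
    return labels.reshape(h, w)

def dice(pred, truth):
    """Dice coefficient between two binary masks."""
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum())

# Tiny demo on a synthetic target.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
print("segments found:", len(np.unique(mean_shift_segment(img))))
```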
Procedia PDF Downloads 263
522 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study
Authors: K. Adu Michael, K. Alese Boniface
Abstract:
Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. The ability to develop the comprehensive problem statement needed to execute a proper requirements engineering process is missing. Describing the 'what' of a system in one document, written in a natural language, is a major step in the overall process of Software Engineering. Requirements Engineering is a process used to discover, analyze and validate system requirements. This process is needed to reduce software errors at the early stages of software development. The importance of each of the steps in Requirements Engineering is clearly explained in the context of using a detailed problem statement from the client/customer to get an overview of an existing system along with expectations from the new system. This paper identifies inadequate Requirements Engineering practice as the major cause of poor software development in developing nations, using a case study of final-year computer science students of a tertiary-education institution in Nigeria.
Keywords: client/customer, problem statement, requirements engineering, software developers
Procedia PDF Downloads 404
521 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection
Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón
Abstract:
Structural inspection activities are necessary to ensure the correct functioning of infrastructures. Unmanned Aerial Vehicle (UAV) techniques have become more popular than traditional techniques. Specifically, UAV photogrammetry allows time and cost savings. The development of this technology has permitted the use of low-cost thermal sensors in UAVs. The representation of 3D thermal models with this type of equipment is in continuous evolution. The direct processing of thermal images usually leads to errors and inaccurate results. A methodology is proposed for the generation of 3D thermal models using dual sensors, which involves the application of visible Red-Green-Blue (RGB) and thermal images in parallel. Hence, the RGB images are used as the basis for the generation of the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The mining/industrial facility representations that are obtained can be used for inspection activities.
Keywords: aerial thermography, data processing, drone, low-cost, point cloud
Procedia PDF Downloads 142
520 Authorship Attribution Using Sociolinguistic Profiling When Considering Civil and Criminal Cases
Authors: Diana A. Sokolova
Abstract:
This article is devoted to one of the possibilities for identifying the author of an oral or written text: sociolinguistic profiling. Sociolinguistic profiling is utilized as a forensic linguistics technique to identify individuals through language patterns, particularly in criminal cases. It examines how social factors influence language use. This study aims to showcase the significance of linguistic profiling for attributing authorship in texts and emphasizes the necessity of its continuous enhancement while considering its strengths and weaknesses. The study employs semantic-syntactic, lexical-semantic, linguopragmatic, logical, presupposition, authorization, and content analysis methods to investigate linguistic profiling. Data collection involves gathering oral and written texts from criminal and civil court cases to analyze language patterns for authorship attribution. The collected data are analyzed using various linguistic analysis methods to identify individual characteristics and patterns that can aid in authorship attribution. The study addresses the effectiveness of sociolinguistic profiling in identifying the authors of texts and explores the impact of social factors on language use in legal contexts, underscoring the practical application of linguistic profiling in legal settings and contributing to the field of forensic linguistics. In spite of its advantages, challenges in linguistic profiling have spurred debates and controversies in academic circles, legal environments, and the public sphere. This research therefore highlights the relevance of sociolinguistic profiling in authorship attribution and underscores the need for ongoing refinement of the technique, considering its limitations.
Keywords: authorship attribution, detection of identifying, dialect, features, forensic linguistics, social influence, sociolinguistics, unique speech characteristics
Procedia PDF Downloads 33
519 Statistical Classification, Downscaling and Uncertainty Assessment for Global Climate Model Outputs
Authors: Queen Suraajini Rajendran, Sai Hung Cheung
Abstract:
Statistical downscaling models are required to connect global climate model outputs with local weather variables for climate change impact prediction. For reliable climate change impact studies, the uncertainty associated with the model, including natural variability, uncertainty in the climate model(s) and the downscaling model, model inadequacy, and uncertainty in the predicted results, should be quantified appropriately. In this work, a new approach is developed by the authors for statistical classification, statistical downscaling and uncertainty assessment, and is applied to Singapore rainfall. It is a robust Bayesian uncertainty analysis methodology, with associated tools, based on coupling dependent modeling errors with the classification and statistical downscaling models, in such a way that the dependency among modeling errors impacts the results of both classification and statistical downscaling model calibration and the uncertainty analysis for future prediction. Singapore data are considered here, and the uncertainty and prediction results are obtained. From the results obtained, directions of research for improvement are briefly presented.
Keywords: statistical downscaling, global climate model, climate change, uncertainty
Procedia PDF Downloads 367
518 Assisted Video Colorization Using Texture Descriptors
Authors: Andre Peres Ramos, Franklin Cesar Flores
Abstract:
Colorization is the process of adding colors to a monochromatic image or video. Usually, the process involves segmenting the image into regions of interest and then applying colors to each one; for videos, this process is repeated for each frame, which makes it a tedious and time-consuming job. We propose a new assisted method for video colorization; the user only has to colorize one frame, and then the colors are propagated to the following frames. The user can intervene at any time to correct eventual errors in color assignment. The method consists of extracting intensity and texture descriptors from the frames and then performing feature matching to determine the best color for each segment. To reduce computation time and give better spatial coherence, we narrow the area of search and assign weights to each feature to emphasize the texture descriptors. To give a more natural result, we use an optimization algorithm to perform the color propagation. Experimental results on several image sequences, compared to other existing methods, demonstrate that the proposed method performs better colorization with less time and user interference.
Keywords: colorization, feature matching, texture descriptors, video segmentation
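A minimal sketch of the matching step described above, assuming simple per-segment descriptors (mean and spread of intensity and of gradient magnitude) and a user-chosen weight vector that emphasizes the texture components. The descriptors, the weights, and the omission of the optimization stage are all simplifications, not the authors' exact pipeline.

```python
import numpy as np

def segment_descriptor(gray_patch):
    """Intensity + simple texture statistics for one segment (illustrative)."""
    gy, gx = np.gradient(gray_patch.astype(float))
    mag = np.hypot(gx, gy)
    return np.array([gray_patch.mean(), gray_patch.std(),
                     mag.mean(), mag.std()])

def propagate_colors(ref_descs, ref_colors, new_descs, weights):
    """Give each new segment the color of its best-matching reference segment."""
    w = np.asarray(weights, dtype=float)
    out = []
    for d in new_descs:
        dists = [np.sum(w * (d - r) ** 2) for r in ref_descs]
        out.append(ref_colors[int(np.argmin(dists))])
    return out

# Weights emphasizing the two texture components over raw intensity.
weights = [1.0, 1.0, 2.0, 2.0]
```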
Procedia PDF Downloads 160
517 Medicompills Architecture: A Mathematically Precise Tool to Reduce the Risk of Diagnosis Errors in Precise Medicine
Authors: Adriana Haulica
Abstract:
Powered by Machine Learning, Precise medicine is by now tailored to use genetic and molecular profiling, with the aim of optimizing the therapeutic benefits for cohorts of patients. As the majority of Machine Learning algorithms come from heuristics, the outputs have contextual validity. This is not very restrictive, in the sense that medicine itself is not an exact science. Meanwhile, the progress made in Molecular Biology, Bioinformatics, Computational Biology, and Precise Medicine, correlated with the huge amount of human biology data and the increase in computational power, opens new healthcare challenges. A more accurate diagnosis is needed, along with real-time treatments, by processing as much as possible of the available information. The purpose of this paper is to present a deeper vision for the future of Artificial Intelligence in Precise medicine. In fact, current Machine Learning algorithms use standard mathematical knowledge, mostly Euclidean metrics and standard computation rules. The loss of information arising from the classical methods prevents obtaining 100% evidence in the diagnosis process. To overcome these problems, we introduce MEDICOMPILLS, a new architectural concept for information processing in Precise medicine that delivers diagnosis and therapy advice. This tool processes poly-field digital resources: global knowledge related to biomedicine in a direct or indirect manner, but also technical databases, Natural Language Processing algorithms, and strong class optimization functions. As the name suggests, the heart of this tool is a compiler. The approach is completely new, tailored for omics and clinical data. Firstly, the intrinsic biological intuition is different from the well-known 'needle in a haystack' approach usually used when Machine Learning algorithms have to process differential genomic or molecular data to find biomarkers. Also, even if the input is drawn from various types of data, the working engine inside MEDICOMPILLS does not search for patterns as an integrative tool. This approach deciphers the biological meaning of input data up to the metabolic and physiologic mechanisms, based on a compiler with grammars derived from bio-algebra-inspired mathematics. It translates input data into bio-semantic units with the help of contextual information, iteratively, until Bio-Logical operations can be performed on the basis of the 'common denominator' rule. The rigorousness of MEDICOMPILLS comes from the structure of the contextual information on functions, built to be analogous to mathematical 'proofs'. The major impact of this architecture is expressed by the high accuracy of the diagnosis. Formulated as a multiple-condition diagnosis, constituted by some main diseases along with unhealthy biological states, this format is highly suitable for therapy proposals and disease prevention. The use of the MEDICOMPILLS architecture is highly beneficial for the healthcare industry. The expectation is to generate a strategic trend in Precise medicine, making medicine more like an exact science and reducing the considerable risk of errors in diagnostics and therapies. The tool can be used by pharmaceutical laboratories for the discovery of new cures. It will also contribute to the better design of clinical trials and speed them up.
Keywords: bio-semantic units, multiple conditions diagnosis, NLP, omics
Procedia PDF Downloads 69
516 Lexical Knowledge of Verb Particle Constructions with the Particle on by Mexican English Learners
Authors: Sarai Alvarado Pineda, Ricardo Maldonado Soto
Abstract:
The acquisition of Verb Particle Constructions is a challenge for Spanish speakers learning English. The acquisition is particularly difficult for speakers of languages with no verb particle constructions. The purpose of the current study is to define the procedural steps in the acquisition of constructions with the particle on. There are three outstanding meanings for the particle on: Surface ('The movie is based on a true story'), Activation ('John turned on the light'), and Continuity ('The band played on all night'). The central aim of this study is to measure how Mexican Spanish participants respond both to the three meanings mentioned above and to the degree of meaning transparency/opacity of on verb particle constructions. Forty Mexican Spanish learners of English (20 basic and 20 advanced) are compared against a control group of 20 American native English speakers through a reaction time test (PsychoPy2 2015). The participants were asked to discriminate 90 items based on their knowledge of these constructions. There are 30 items per meaning, divided into two groups of transparent and opaque meaning. Results revealed three major findings. Advanced students have a reaction time similar to that of native speakers (advanced 4.5 s versus native 3.7 s), while students with a lower level of English proficiency show a high reaction time (7 s). Likewise, there is a shorter reaction time for constructions with lower opacity in all three groups of participants, with differences between each level (basic 6.7 s, advanced 4.3 s, and native 3.4 s). Finally, a difference in reaction time can be identified according to the meaning provided by the construction: the reaction time for the activation category (5.27 s) is greater than for continuity (5.04 s), which in turn is slower than for surface (4.94 s). The study shows that the level of sensitivity of English learners increases significantly toward native speaker patterns, as determined by the level of transparency of meaning of each construction as well as the degree of entrenchment of each constructional meaning.
Keywords: meaning of the particle, opacity, reaction time, verb particle constructions
Procedia PDF Downloads 263
515 Measurement of Convective Heat Transfer from a Vertical Flat Plate Using Mach-Zehnder Interferometer with Wedge Fringe Setting
Authors: Divya Haridas, C. B. Sobhan
Abstract:
In the investigation presented here, laser interferometric methods have been utilized for the measurement of natural convection heat transfer from a heated vertical flat plate. The study mainly aims at comparing two different fringe orientations in the wedge fringe setting of the Mach-Zehnder interferometer (MZI) used for the measurements. The interference fringes are set in horizontal and vertical orientations with respect to the heated surface, and two different fringe analysis methods, namely the stepping method and the method proposed by Naylor and Duarte, are used to obtain the heat transfer coefficients. The experimental system is benchmarked against theoretical results, thus validating its reliability in heat transfer measurements. The interference fringe patterns are analyzed digitally using the MATLAB 7 and MOTIC Plus software, which ensures improved efficiency in fringe analysis, hence reducing the errors associated with conventional fringe tracing. The work also discusses the relative merits and limitations of the two methods used.
Keywords: Mach-Zehnder interferometer (MZI), natural convection, Naylor method, Vertical Flat Plate
Procedia PDF Downloads 363
514 Presenting a Model for Predicting the State of Being Accident-Prone of Passages According to Neural Network and Spatial Data Analysis
Authors: Hamd Rezaeifar, Hamid Reza Sahriari
Abstract:
Accidents are considered to be one of the challenges of modern life. Since the number of victims and the volume of domestic transportation are increasing day by day in Iran, studying the effective factors of accidents and identifying suitable models and parameters for this issue are absolutely essential. The main purpose of this research has been to study the factors and spatial data affecting accidents in Mashhad during 2007-2008. In this paper, spatial layers were matched with one another and related to accident locations. In the first step, the existing accident databases were completed by adding accident landmarks and special fields recording the existence or non-existence of phenomena that influence accidents; in the next step, data mining tools and neural network analysis were used to evaluate the relationships among these data and to design a logical model for predicting accident-prone spots with minimum error. The model of this article predicts low-accident spots very accurately, yet it has more errors in accident-prone regions due to a lack of primary data.
Keywords: accident, data mining, neural network, GIS
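As a hedged illustration of the modeling stage (not the authors' exact network), a small scikit-learn multilayer perceptron can be trained on per-location spatial features to flag accident-prone spots. The feature set and synthetic data below are placeholders for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Hypothetical spatial features per road segment: distance to intersection,
# traffic volume, speed limit, signal presence, land-use code.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 1000) > 0.8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
scaler = StandardScaler().fit(X_tr)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
clf.fit(scaler.transform(X_tr), y_tr)
print("accuracy:", clf.score(scaler.transform(X_te), y_te))
```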
Procedia PDF Downloads 45
513 An Analysis on Aid for Migrants: A Descriptive Analysis on Official Development Assistance During the Migration Crisis
Authors: Elena Masi, Adolfo Morrone
Abstract:
Migration has recently become a mainstream development sector and is currently at the forefront in institutional and civil society contexts. However, no consensus exists on how the link between migration and development operates, that is, how development is related to migration and how migration can promote development. On one hand, Official Development Assistance is recognized to be one of the levers of development. On the other hand, the debate is focusing on what should be the scope of aid programs targeting migrant groups and, in general, the migration process. This paper provides a descriptive analysis of how development aid for migration was allocated in the recent past, focusing on the actions that were funded and implemented by the international donor community. In the absence of an internationally shared methodology for defining the boundaries of development aid on migration, the analysis is based on lexical hypotheses applied to the titles or short descriptions of initiatives funded by several Organisation for Economic Co-operation and Development (OECD) countries. Moreover, the research describes and quantifies aid flows for each country according to different criteria. The terms migrant and refugee are used to identify the projects in accordance with the most internationally agreed definitions, and only actions in countries of transit or origin are considered eligible, thus excluding the amounts spent on refugees in donor countries. The results show that the percentage of projects targeting migrants, in terms of amount, has followed a growing trend from 2009 to 2016 in several European countries and is positively correlated with the flows of migrants. Distinguishing between programs targeting migrants and programs targeting refugees, some specific national features emerge more clearly. A focus is devoted to actions targeting the root causes of migration, showing an inter-sectoral approach in international aid allocation. The analysis offers some tentative solutions to the lack of consensus on the language of migration and development aid, and emphasizes the need for an internationally agreed criterion for identifying programs targeting migrants and refugees, in order to make action more transparent and to develop effective strategies at the global level.
Keywords: migration, official development assistance, ODA, refugees, time series
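A minimal sketch of the lexical-hypothesis step: classifying aid records as migrant-targeted, refugee-targeted, or both from their titles and short descriptions. The keyword lists are illustrative assumptions, not the study's actual vocabulary.

```python
import re

# Illustrative keyword lists; the study's actual lexical hypotheses differ.
MIGRANT_TERMS = re.compile(r"\b(migrant|migration|diaspora|remittance)", re.I)
REFUGEE_TERMS = re.compile(r"\b(refugee|asylum|displaced|IDP)", re.I)

def classify_project(title: str, short_description: str = "") -> str:
    """Tag one aid record by the target group its wording suggests."""
    text = f"{title} {short_description}"
    is_migrant = bool(MIGRANT_TERMS.search(text))
    is_refugee = bool(REFUGEE_TERMS.search(text))
    if is_migrant and is_refugee:
        return "both"
    return "migrants" if is_migrant else "refugees" if is_refugee else "other"

for title, desc in [
    ("Support to returning migrants in transit countries", ""),
    ("Shelter rehabilitation", "assistance to displaced persons in camps"),
]:
    print(classify_project(title, desc), "-", title)
```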
Procedia PDF Downloads 130
512 Electrical Design Review Based on BIM-MEP Model
Authors: Michael Liu, Sen-Chou Tsai, Yu-Tang Huang, Tai-Chun Lin, Guan-Chyun Hsieh
Abstract:
This study proposes an electrical review method for mechanical, electrical, and plumbing (MEP) systems using building information modeling (BIM). The purpose is to reliably simplify the review work, directly evaluate the layout of electrical equipment and wiring, and calculate short-circuit currents and line voltage drops based on BIM-MEP models. The study was carried out by MIEtech Company in collaboration with Taiwan Power Company (TPC), which is the unit responsible for reviewing electrical designs. This study aims to simplify the review process, reduce manual review errors, and improve the timeliness and reliability of reviews. In addition, the review system provides insight into the process and correctness of the precise integration of wiring, plumbing, and electrical equipment into the building structure, improving the safety and reliability of building electricity. It can also assist electrical engineers in using BIM to enhance the accuracy and self-checking capabilities of circuit design and to improve the timeliness of the design process.
Keywords: mechanical, electrical and plumbing, building information modeling, electrical review method
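For context on the voltage-drop check mentioned above, a common approximation (an illustrative textbook formula, not necessarily the one TPC or the proposed system uses) is dV = k * I * L * (R*cos(phi) + X*sin(phi)), with k = sqrt(3) for three-phase circuits or 2 for single-phase. The cable parameters below are made-up example values.

```python
import math

def line_voltage_drop(i_amp, length_m, r_ohm_per_km, x_ohm_per_km,
                      power_factor=0.85, three_phase=True):
    """Approximate line voltage drop in volts (illustrative formula)."""
    k = math.sqrt(3) if three_phase else 2.0
    phi = math.acos(power_factor)
    per_km = r_ohm_per_km * power_factor + x_ohm_per_km * math.sin(phi)
    return k * i_amp * (length_m / 1000.0) * per_km

# e.g. 100 A over 80 m of cable with R = 0.39, X = 0.08 ohm/km
dv = line_voltage_drop(100, 80, 0.39, 0.08)
print(f"drop = {dv:.2f} V ({dv / 380 * 100:.2f}% of a nominal 380 V)")
```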
Procedia PDF Downloads 7
511 The Use of Modern Technologies and Computers in the Archaeological Surveys of Sistan in Eastern Iran
Authors: Mahyar MehrAfarin
Abstract:
The Sistan region in eastern Iran is a significant archaeological area in Iran and the Middle East, encompassing 10,000 square kilometers. Previous archaeological field surveys have identified 1662 ancient sites dating from prehistoric periods to the Islamic period. Research Aim: This article aims to explore the utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, and the benefits derived from their implementation. Methodology: The research employs a descriptive-analytical approach combined with field methods. New technologies and software, such as GPS, drones, magnetometers, equipped cameras, satellite images, and software programs like GIS, MapSource, and Excel, were utilized to collect information and analyze data. Findings: The use of modern technologies and computers in archaeological field surveys proved to be essential. Traditional archaeological activities, such as excavation and field surveys, are time-consuming and costly. Employing modern technologies helps in preserving ancient sites, accurately recording archaeological data, reducing errors and mistakes, and facilitating correct and accurate analysis. Creating a comprehensive and accessible database, generating statistics, and producing graphic designs and diagrams are additional advantages derived from the use of efficient technologies in archaeology. Theoretical Importance: The integration of computers and modern technologies in archaeology contributes to interdisciplinary collaborations and facilitates the involvement of specialists from various fields, such as geography, history, art history, anthropology, laboratory sciences, and computer engineering. The utilization of computers in archaeology spans diverse areas, including database creation, statistical analysis, graphics implementation, laboratory and engineering applications, and even artificial intelligence, which remains an unexplored area in Iranian archaeology. Data Collection and Analysis Procedures: Information was collected using modern technologies and software, capturing geographic coordinates, aerial images, archaeogeophysical data, and satellite images. These data were then input into various software programs for analysis, including GIS, MapSource, and Excel. The research employed both descriptive and analytical methods to present findings effectively. Question Addressed: The primary question addressed in this research is how the use of modern technologies and computers in archaeological field surveys in Sistan, Iran, can enhance archaeological data collection, preservation, analysis, and accessibility. Conclusion: The utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, has proven to be necessary and beneficial. These technologies aid in preserving ancient sites, accurately recording archaeological data, reducing errors, and facilitating comprehensive analysis. The creation of accessible databases, the generation of statistics, graphic designs, and interdisciplinary collaborations are further advantages observed. It is recommended to explore the potential of artificial intelligence in Iranian archaeology as an unexplored area. The research has implications for cultural heritage organizations, archaeology students, and universities involved in archaeological field surveys in Sistan and Baluchistan province. Additionally, it contributes to enhancing the understanding and preservation of Iran's archaeological heritage.
Keywords: Iran, Sistan, archaeological surveys, computer use, modern technologies
Procedia PDF Downloads 79
510 Fall Avoidance Control of Wheeled Inverted Pendulum Type Robotic Wheelchair While Climbing Stairs
Authors: Nan Ding, Motoki Shino, Nobuyasu Tomokuni, Genki Murata
Abstract:
The wheelchair is the major means of transport for physically disabled people. However, it cannot overcome architectural barriers such as curbs and stairs. In this paper, the authors propose a method to prevent a wheeled inverted pendulum type robotic wheelchair from falling while climbing stairs. The problem with this system is that the feedback gain of the wheels cannot be set high due to modeling errors and gear backlash, which results in unwanted wheel movement. As a consequence, the wheels slide down the stairs or collide with the sides of the stairs, and the wheelchair finally falls down. To avoid falling, the authors propose a slider control strategy based on the skyhook model in order to decrease the movement of the wheels, and a rotary link control strategy based on the staircase dimensions in order to avoid collisions or sliding down. The effectiveness of the proposed fall avoidance control strategy was validated by ODE simulations and the prototype wheelchair.
Keywords: EPW, fall avoidance control, skyhook, wheeled inverted pendulum
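The abstract does not give the control law. As a rough sketch of the skyhook idea, the slider command damps the absolute body velocity as if the body were tethered to an inertial 'sky', which suppresses wheel motion; the gains and the passive term below are illustrative assumptions, not the authors' design.

```python
def skyhook_slider_force(v_body: float, v_slider_rel: float,
                         c_sky: float = 50.0, c_passive: float = 5.0) -> float:
    """Skyhook-style damping command for the slider (illustrative).

    v_body: absolute velocity of the wheelchair body (m/s)
    v_slider_rel: slider velocity relative to the body (m/s)
    """
    return -c_sky * v_body - c_passive * v_slider_rel

print(skyhook_slider_force(v_body=0.12, v_slider_rel=-0.03))  # -5.85 N
```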
Procedia PDF Downloads 331
509 Deep-Learning to Generation of Weights for Image Captioning Using Part-of-Speech Approach
Authors: Tiago do Carmo Nogueira, Cássio Dener Noronha Vinhal, Gélson da Cruz Júnior, Matheus Rudolfo Diedrich Ullmann
Abstract:
Generating automatic image descriptions in natural language is a challenging task. Image captioning is a task that consistently describes an image by combining computer vision and natural language processing techniques. To accomplish this task, cutting-edge models use encoder-decoder structures: Convolutional Neural Networks (CNN) are used to extract the characteristics of the images, and Recurrent Neural Networks (RNN) generate the descriptive sentences for the images. However, cutting-edge approaches still suffer from generating incorrect captions and from accumulating errors in the decoders. To solve this problem, we propose a model based on the encoder-decoder structure, introducing a module that generates weights according to the importance of each word in forming the sentence, using part-of-speech (PoS) tags. The results demonstrate that our model surpasses state-of-the-art models.
Keywords: gated recurrent units, caption generation, convolutional neural network, part-of-speech
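One way to realize such PoS-dependent weighting (a hedged sketch, not necessarily the authors' module, which generates the weights itself) is to scale the per-token cross-entropy by a weight derived from each target word's PoS tag; the weight table here is a made-up illustration.

```python
import torch
import torch.nn.functional as F

# Hypothetical PoS weights: content words matter more for a faithful caption.
POS_WEIGHT = {"NOUN": 1.5, "VERB": 1.3, "ADJ": 1.2, "DET": 0.7, "OTHER": 1.0}

def pos_weighted_loss(logits, targets, pos_tags):
    """Per-token cross-entropy scaled by a PoS-derived weight."""
    per_token = F.cross_entropy(logits, targets, reduction="none")
    w = torch.tensor([POS_WEIGHT.get(t, 1.0) for t in pos_tags])
    return (w * per_token).sum() / w.sum()

logits = torch.randn(4, 1000)           # 4 decoding steps, vocab of 1000
targets = torch.tensor([12, 7, 404, 3])
print(pos_weighted_loss(logits, targets, ["DET", "NOUN", "VERB", "DET"]))
```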
Procedia PDF Downloads 101
508 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program
Authors: Ming Wen, Nasim Nezamoddini
Abstract:
Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process, from model setup to post-processing tasks, that requires replication of time-consuming steps. Utilizing an FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis setup, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework to design a Python-based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. The analytical hierarchy process (AHP) and the technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives, considering the feedback received from experts and program users.
Keywords: finite element analysis, FEA, random vibration fatigue, process automation, analytical hierarchy process, AHP, TOPSIS, multiple-criteria decision-making, MCDM
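For reference, minimal textbook implementations of the two decision methods named above: AHP criteria weights from the principal eigenvector of a pairwise-comparison matrix, and TOPSIS closeness scores for ranking alternatives. The pairwise judgments and alternative scores are made-up illustrations, not values from the paper.

```python
import numpy as np

def ahp_weights(pairwise):
    """Criteria weights = normalized principal eigenvector of the pairwise matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def topsis(decision, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution."""
    norm = decision / np.linalg.norm(decision, axis=0)  # vector normalization
    v = norm * weights                                   # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)  # higher = better

# Three design alternatives scored on three criteria (higher is better for
# the first two; lower is better for the third, e.g. runtime).
pairwise = np.array([[1, 3, 5], [1/3, 1, 3], [1/5, 1/3, 1]], float)
w = ahp_weights(pairwise)
scores = np.array([[8, 7, 120], [6, 9, 90], [9, 5, 150]], float)
print(topsis(scores, w, benefit=np.array([True, True, False])))
```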
Procedia PDF Downloads 110