Search results for: modern analytical methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18944

16964 Optimization and Automation of Functional Testing with White-Box Testing Method

Authors: Reyhaneh Soltanshah, Hamid R. Zarandi

Abstract:

Software testing is necessary in computer-related industries even though it consumes time and money, and it raises the price of the final product. In embedded-system software testing, complete knowledge of the embedded system architecture is necessary to avoid significant costs and damage. The aim of this article is to provide a method that reduces the time and cost of tests based on program structure. First, a complete review of eleven white-box test methods was carried out against the 2015 and 2021 versions of ISO/IEC/IEEE 29119. The proposed algorithm was designed using both versions of the standard, and white-box testing methods that are expensive or provide little coverage were removed. The white-box test methods prescribed by the 29119 standard were applied to each function, and the proposed algorithm was then run on the same functions. To speed up the implementation of the proposed method, the Unity framework was used with some changes; being open source and able to implement white-box test methods, Unity is well suited to embedded software testing. The test items obtained from the two approaches were evaluated using a mathematical ratio, which across various software projects reduced the test cost by between 50% and 80% while reaching the desired result with the minimum number of test items.
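The cost comparison described above can be made concrete with a small sketch; the ratio below (items removed relative to the full standard-derived set) is one plausible reading of the abstract's "mathematical ratio", and the item counts are invented for illustration.

```python
def cost_reduction_ratio(standard_items, reduced_items):
    """Fraction of test items eliminated by the pruned selection,
    relative to the full standard-derived test set (assumed form)."""
    if standard_items == 0:
        raise ValueError("standard test set must be non-empty")
    return (standard_items - reduced_items) / standard_items

# e.g. 40 items from the full white-box catalogue vs. 12 after pruning
print(cost_reduction_ratio(40, 12))  # 0.7, i.e. a 70% reduction
```

A value between 0.5 and 0.8 would correspond to the 50-80% cost reduction reported in the abstract.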

Keywords: embedded software, reduce costs, software testing, white-box testing

Procedia PDF Downloads 47
16963 Pedagogical Tools in the 21st Century

Authors: M. Aherrahrou

Abstract:

Moroccan education currently faces many difficulties and problems due to traditional methods of teaching. Neuro-Linguistic Programming (NLP) appears to hold much potential for education at all levels. The major aim of this paper is to explore the effect of certain Neuro-Linguistic Programming techniques in one educational institution in Morocco. Quantitative and qualitative methods are used. The findings prove the effectiveness of this new approach in Moroccan education, and it is a promising tool for improving the quality of learning.

Keywords: learning and teaching environment, Neuro-Linguistic Programming, education, quality of learning

Procedia PDF Downloads 350
16962 Thick Data Techniques for Identifying Abnormality in Video Frames for Wireless Capsule Endoscopy

Authors: Jinan Fiaidhi, Sabah Mohammed, Petros Zezos

Abstract:

Capsule endoscopy (CE) is an established noninvasive diagnostic modality for investigating small bowel disease. CE has a pivotal role in assessing patients with suspected bleeding or identifying evidence of active Crohn's disease in the small bowel. However, CE produces lengthy videos of at least eighty thousand frames, captured at a rate of 2 frames per second. Gastroenterologists cannot dedicate 8 to 15 hours to reading the CE video frames to arrive at a diagnosis, which is why analyzing CE videos with modern artificial intelligence techniques becomes a necessity. However, machine learning, including deep learning, has failed to report robust results because of the lack of large samples to train its neural nets. In this paper, we describe a thick data approach that learns from a few anchor images. We use well-established datasets like KVASIR and CrohnIPI to filter candidate frames that include interesting anomalies in any CE video. We identify candidate frames based on feature extraction, providing representative measures of the anomaly, such as its size and its color contrast against the image background, and later feed these features to a decision tree that can classify the candidate frames as showing a condition such as Crohn's disease. Our thick data approach, which detects Crohn's disease from the ulcer areas present in the candidate frames, reached an accuracy of 89.9% on KVASIR and 83.3% on CrohnIPI. We are continuing our research to fine-tune the approach by adding more thick data methods to enhance diagnosis accuracy.
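The classification step, feeding hand-extracted anomaly features to a decision tree, can be sketched as a toy two-level rule in plain Python; the feature names and thresholds here are illustrative assumptions, not the paper's trained tree.

```python
def classify_frame(anomaly_area_px, contrast):
    """Toy two-level decision tree over the two features named in the
    abstract (anomaly size, color contrast vs. background).
    Thresholds are invented for illustration."""
    if anomaly_area_px < 50:    # too small to be a candidate ulcer region
        return "normal"
    if contrast >= 0.3:         # pronounced lesion against the background
        return "suspected Crohn's"
    return "normal"

print(classify_frame(120, 0.45))  # suspected Crohn's
print(classify_frame(10, 0.9))    # normal
```

In practice such a tree would be fitted on labeled anchor frames rather than hand-set, but the fitted result has exactly this nested-threshold structure.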

Keywords: thick data analytics, capsule endoscopy, Crohn’s disease, siamese neural network, decision tree

Procedia PDF Downloads 150
16961 21st-Century Middlebrow Film: A Critical Examination of the Spectator Experience in Malayalam Film

Authors: Anupama A. P.

Abstract:

The Malayalam film industry, known as Mollywood, has a rich tradition of storytelling and cultural significance within Indian cinema. Middlebrow films emerged as a distinct and influential category, particularly in the 1980s, with directors like K.G. George, who engaged with female subjectivity and drew inspiration from the ‘women’s cinema’ of the 1950s and 1960s. In recent decades, particularly post-2010, the industry has transformed significantly, with a new generation of filmmakers diverging from the melodrama and new wave of the past and incorporating advanced technology and modern content. This study examines the evolution and impact of Malayalam middlebrow cinema in the 21st century, focusing on post-2000 films and their influence on contemporary spectator experiences. These films appeal to a wide range of audiences without compromising their artistic integrity, tackling social issues and personal dramas with thematic and narrative complexity. Historically, middlebrow films in Malayalam cinema have portrayed realism and addressed the socio-political climate of Kerala, blending realism with reflexivity and moving away from traditional sentimentality. This shift is evident in the new generation of Malayalam films, which present a global representation of characters and a modern treatment of individuals. To provide a comprehensive understanding of this evolution, the study analyzes a diverse selection of films, such as Kerala Varma Pazhassi Raja (2009), Drishyam (2013), Maheshinte Prathikaaram (2016), Take Off (2017), Thondimuthalum Driksakshiyum (2017), and Virus (2019), illustrating the broad thematic range and innovative narrative techniques characteristic of this genre. These films exemplify how middlebrow cinema continues to evolve, adapting to changing societal contexts and audience expectations.
This research employs a theoretical methodology, drawing on cultural studies and audience reception theory and utilizing frameworks such as Bordwell’s narrative theory, Deleuze’s concept of deterritorialization, and Hall’s encoding/decoding model, to analyze the changes in Malayalam middlebrow cinema and interpret the storytelling methods, spectator experience, and audience reception of these films. The findings indicate that post-2010 Malayalam middlebrow cinema offers a spectator experience that is both intellectually stimulating and broadly appealing. This study highlights the critical role of middlebrow cinema in reflecting and shaping societal values, making it a significant cultural artefact within the broader context of Indian and global cinema. By bridging entertainment with thought-provoking narratives, these films engage audiences and contribute to wider cultural discourse, making them pivotal in contemporary cinematic landscapes. To conclude, this study highlights the importance of Malayalam middlebrow cinema in influencing contemporary cinematic tastes. The nuanced and approachable narratives of post-2010 films are posited to assume an increasingly pivotal role in the future of Malayalam cinema. By providing a deeper understanding of Malayalam middlebrow cinema and its societal implications, this study enriches theoretical discourse, promotes regional cinema, and offers valuable insights into contemporary spectator experiences and the future trajectory of Malayalam cinema.

Keywords: Malayalam cinema, middlebrow cinema, spectator experience, audience reception, deterritorialization

Procedia PDF Downloads 27
16960 Economics of Oil and Its Stability in the Gulf Region

Authors: Al Mutawa A. Amir, Liaqat Ali, Faisal Ali

Abstract:

After World War II, the world economy was disrupted and changed by oil and its prices. This paper presents the basic statistical features and economic characteristics of the Gulf economy. The main features of the Gulf economies are emphasized: heavy dependence on oil exports, the dualism between modern and traditional sectors, and rapidly increasing affluence. In this context, the paper discusses the problem of growth versus development and attempts to draw implications for the future economic development of the area.

Keywords: oil prices, GCC, economic growth, gulf oil

Procedia PDF Downloads 326
16959 The Role of Critical Thinking in Disease Diagnosis: A Comprehensive Review

Authors: Mohammad Al-Mousawi

Abstract:

This academic article explores the indispensable role of critical thinking in the process of diagnosing diseases. Employing a multidisciplinary approach, we delve into the cognitive skills and analytical mindset that clinicians, researchers, and healthcare professionals must employ to navigate the complexities of disease identification. By examining the integration of critical thinking within the realms of medical education, diagnostic decision-making, and technological advancements, this article aims to underscore the significance of cultivating and applying critical thinking skills in the ever-evolving landscape of healthcare.

Keywords: critical thinking, medical education, diagnostic decision-making, fostering critical thinking

Procedia PDF Downloads 65
16958 Financial Ethics: A Review of 2010 Flash Crash

Authors: Omer Farooq, Salman Ahmed Khan, Sadaf Khalid

Abstract:

Modern-day stock markets have become almost entirely automated. Although this means increased profits for investors, with algorithms acting on the slightest price change within microseconds, it has also given rise to many ethical dilemmas, in the sense that the slightest mistake can cause people to lose their livelihoods. This paper reviews one such event, on May 6, 2010, in which $1 trillion disappeared from the Dow Jones Industrial Average. We discuss its various aspects and the ethical dilemmas that have arisen from it.

Keywords: flash crash, market crash, stock market, stock market crash

Procedia PDF Downloads 512
16957 Analytical and Numerical Study of the Formation of the Sporadic E Layer, Taking into Account Horizontal and Vertical Inhomogeneity of the Horizontal Wind

Authors: Giorgi Dalakishvili, Goderdzi G. Didebulidze, Maya Todua

Abstract:

The possibility of sporadic E (Es) layer formation in the mid-latitude nighttime lower thermosphere by horizontally homogeneous and inhomogeneous (vertically and horizontally varying) winds is investigated in 3D through analytical and numerical solutions of the continuity equation for the dominant heavy metallic ions Fe+. A theory of how the wind's direction, magnitude, and shear influence sporadic E formation is developed for the case of a horizontally varying wind (the effect of horizontal convergence). In this case, a horizontal wind with horizontal shear, characterized by compressibility and/or vorticity, can exert an additional influence on the horizontal convergence of the Fe+ ions and on the density of Es layers, which can form through vertical ion convergence driven both by the wind's direction and magnitude and by its horizontal shear. The magnitude and direction of the horizontal wind strongly influence the ion vertical drift velocity and the minimal negative values of its divergence necessary for vertical ion convergence to develop into a sporadic-E-type layer. The horizontal shear of the wind, in addition to its vertical shear, also affects the value and vertical variation of the ion drift velocity and, correspondingly, the formation and density of the sporadic E layer. Atmospheric gravity waves (AGWs), whose horizontal wavelengths are smaller than those of planetary waves and tidal motions, can significantly shift the nodes of the ion vertical drift velocity (where Es layer formation is expected) and modify its vertical and horizontal shear, driving vertical ion convergence into a thin layer. Horizontal shear can change the Es layer density beyond what wind magnitude and vertical shear alone produce. Depending on the wind's direction and magnitude, heavy metallic Fe+ ions converge vertically into a thin sporadic-E-type layer in the lower thermosphere at heights of about 90-150 km.
The horizontal shear of the wind also influences the horizontal convergence of the ions and the density and location of Es layers. AGWs modulate the direction and magnitude of the horizontal wind and cause additional horizontal ion convergence, while the vertical changes (shear) cause additional vertical convergence compared with the shear-free case. The influence of horizontal shear on sporadic E density, and the importance of the vertical compressibility of the lower thermosphere, which AGWs can also affect, are demonstrated numerically. For a given wavelength and background wind, the predictability of Es layer formation and the regions where layers may be located are shown. Acknowledgements: This study was funded by Shota Rustaveli National Science Foundation of Georgia, grant no. FR17-357.

Keywords: inhomogeneity, sporadic E, thermosphere, wind

Procedia PDF Downloads 149
16956 Reliability and Validity of Determining Ventilatory Threshold and Respiratory Compensation Point by Near-Infrared Spectroscopy

Authors: Tso-Yen Mao, De-Yen Liu, Chun-Feng Huang

Abstract:

Purpose: This study investigates the reliability and validity of the ventilatory threshold (VT) and respiratory compensation point (RCP) determined from skeletal muscle hemodynamic status. Methods: One hundred healthy males (age: 22±3 yrs; height: 173.1±6.0 cm; weight: 67.1±10.5 kg) performed a graded cycling exercise test during which ventilatory and skeletal muscle hemodynamic data were collected simultaneously. VT and RCP were determined by the combined V-slope (VE vs. VCO2) and ventilatory efficiency (VE/VO2 vs. VE/VCO2) methods. Pearson correlation, paired t-tests, and Bland-Altman plots were used to analyze reliability, validity, and agreement. Statistical significance was set at α = .05. Results: Test-retest correlations of VT and RCP were high for both the ventilatory and near-infrared spectroscopy (NIRS) methods (VT vs. VTNIRS: 0.95 vs. 0.94; RCP vs. RCPNIRS: 0.93 vs. 0.93, p < .05). The coefficient of determination was high between VT and the first time point at which O2Hb decreased (R² = 0.88, p < .05), and between RCP and the second time point at which O2Hb declined (R² = 0.89, p < .05). VO2 at VT and RCP did not differ significantly between the ventilatory and NIRS methods (p > .05). Conclusion: Using the NIRS method to determine VT and RCP is reliable and valid in male individuals during graded exercise. Non-invasive monitoring of skeletal muscle hemodynamics could also be used to control training intensity in the future.
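The statistics named in the Methods section are straightforward to reproduce; the sketch below computes a Pearson correlation and Bland-Altman limits of agreement in plain Python on invented VO2 pairs, not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def bland_altman_limits(xs, ys):
    """Mean difference +/- 1.96 SD: the agreement band of a Bland-Altman plot."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean - 1.96 * sd, mean + 1.96 * sd

vo2_vent = [1.9, 2.1, 2.4, 2.6, 3.0]  # hypothetical VT VO2, ventilatory method
vo2_nirs = [1.8, 2.2, 2.3, 2.7, 2.9]  # hypothetical VT VO2, NIRS method
print(round(pearson_r(vo2_vent, vo2_nirs), 2))
print(bland_altman_limits(vo2_vent, vo2_nirs))
```

Agreement holds when the method differences sit inside the Bland-Altman band and the correlation is high, which is the pattern the abstract reports.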

Keywords: anaerobic threshold, exercise intensity, hemodynamic, NIRS

Procedia PDF Downloads 307
16955 Determining the Materiality of an Undisclosed Fact: An Onerous Duty on the Assured

Authors: Adekemi Adebowale

Abstract:

The duty of disclosure in Nigerian insurance law is in need of reform. The materiality of an undisclosed fact (notwithstanding that it was an honest and innocent non-disclosure) currently entitles insurers to avoid insurance policies, leaving the insured with an uncovered loss. While the test of materiality requires an insured to voluntarily disclose facts that will influence an insurer's decision, without proper guidelines from the insurer, the insurer is only expected to prove that the undisclosed fact influenced its judgment in fixing the premium or determining whether to accept the risk. This problem places an onerous duty on the assured to volunteer every material fact to the insurer, even though the insured has only a slight idea of the mind of a hypothetical prudent insurer. This paper explores a modern approach to the problem of an insured's pre-contractual obligation to determine material facts in Nigerian insurance law. The aim is to build upon the change in the structure of insurance contract obligations in other common law jurisdictions such as the United Kingdom. The doctrinal and comparative methodology captures the burden imposed on the insured under existing Nigerian insurance law. It finds that the continued application of the law leaves the insured in the weakest position: he stands to lose in a contract supposedly created for his benefit. If this problem remains unresolved, the overall consequence will be a significant decline in insurance contracting, which may affect the Nigerian economy. The paper evaluates the risks of continuing to apply the traditional law, which does not keep pace with modern insurance practice. It will ultimately produce a legally compliant reform, with a significant deviation from the archaic structure of existing Nigerian insurance law.
This paper forms part of ongoing PhD research on "The insured's pre-contractual duty of utmost good faith". The research to date finds that the insured bears the burden of the obligation to act in utmost good faith where disclosure of material facts is concerned.

Keywords: disclosure, materiality, Nigeria, United Kingdom, utmost good faith

Procedia PDF Downloads 116
16954 A Comparative Study on ANN, ANFIS and SVM Methods for Computing Resonant Frequency of A-Shaped Compact Microstrip Antennas

Authors: Ahmet Kayabasi, Ali Akdagli

Abstract:

In this study, three robust predictive methods, namely the artificial neural network (ANN), the adaptive neuro-fuzzy inference system (ANFIS) and the support vector machine (SVM), were used to compute the resonant frequency of A-shaped compact microstrip antennas (ACMAs) operating in the UHF band. First, the resonant frequencies of 144 ACMAs with various dimensions and electrical parameters were simulated with IE3D™, which is based on the method of moments (MoM). The ANN, ANFIS and SVM models for computing the resonant frequency were then built from the simulation data: 124 simulated ACMAs were used for training and the remaining 20 for testing. The performance of the models was compared across the training and test processes. The average percentage errors (APE) in the computed resonant frequencies during training were 0.457%, 0.399% and 0.600% for the ANN, ANFIS and SVM, respectively. When the constructed models were tested, APE values of 0.601% for the ANN, 0.744% for the ANFIS and 0.623% for the SVM were achieved. These results show that the ANN, ANFIS and SVM methods can be successfully applied to compute the resonant frequency of ACMAs, since they are useful and versatile methods that yield accurate results.
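The abstract does not print the APE formula, but the usual definition, the mean absolute error relative to the simulated reference value in percent, can be sketched as follows; the frequency values are invented for illustration.

```python
def average_percentage_error(simulated, predicted):
    """APE = mean over the antenna set of |f_sim - f_pred| / f_sim * 100.
    Assumed form of the metric; the paper's abstract does not state it."""
    errs = [abs(s - p) / s * 100 for s, p in zip(simulated, predicted)]
    return sum(errs) / len(errs)

f_sim  = [450.0, 620.0, 810.0]   # hypothetical MoM-simulated resonant freqs, MHz
f_pred = [452.0, 617.5, 813.0]   # hypothetical model outputs, MHz
print(round(average_percentage_error(f_sim, f_pred), 3))
```

APE values below 1%, as reported for all three models, indicate predictions within roughly a percent of the MoM reference.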

Keywords: a-shaped compact microstrip antenna, artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS), support vector machine (SVM)

Procedia PDF Downloads 436
16953 The Achievement Model of University Social Responsibility

Authors: Le Kang

Abstract:

On the research question of 'how to achieve USR', this contribution reflects on the concept of university social responsibility, identifies three achievement models of USR (the society-diversified model, the university-cooperation model, and the government-compound model), and conducts a case study to explore the characteristics of the Chinese achievement model of USR. The contribution concludes with a discussion of how the university, government and society balance demands and roles, and of the strategic adjustments and innovative approaches necessary to repair the shortcomings of each achievement model.

Keywords: modern university, USR, achievement model, compound model

Procedia PDF Downloads 750
16952 Theoretical and Experimental Electrostatic Potential around the M-Nitrophenol Compound

Authors: Drissi Mokhtaria, Chouaih Abdelkader, Fodil Hamzaoui

Abstract:

This work compares experimental and theoretical results for the electron charge density distribution and the electrostatic potential around the m-nitrophenol molecule (m-NPH), known for its interesting physical characteristics. The experimental results were obtained from a high-resolution X-ray diffraction study. Theoretical investigations were performed with the Gaussian program using density functional theory (DFT) at the B3LYP/6-31G* level. The multipolar model of Hansen and Coppens was used for the experimental electron charge density distribution around the molecule, while DFT methods were used for the theoretical calculations. The electron charge density obtained by both methods allowed us to derive molecular properties such as the electrostatic potential and the dipole moment, and a comparison shows good agreement between the two approaches.

Keywords: electron charge density, m-nitrophenol, nonlinear optical compound, electrostatic potential, optimized geometry

Procedia PDF Downloads 265
16951 The Evolution of Domestic Terrorism: Global Contemporary Models

Authors: Bret Brooks

Abstract:

As the international community has focused its attention in recent times on international and transnational terrorism, many nations have ignored their own domestic terrorist groups. Domestic terrorism has evolved significantly over the last 15 years, and nation states must adequately understand both their own individual issues and the broader worldwide perspective. Contemporary models show that peace with domestic groups is not only the end goal but also very attainable. By evaluating modern examples and incorporating successful strategies, countries around the world can bring about diplomatic resolutions to domestic extremism and domestic terrorism.

Keywords: domestic, evolution, peace, terrorism

Procedia PDF Downloads 513
16950 Choral Singers' Preference for Expressive Priming Techniques

Authors: Shawn Michael Condon

Abstract:

Current research on teaching expressivity mainly involves instrumentalists. This study focuses on choral singers' preferences among priming techniques based on four methods for teaching expressivity. 112 choral singers answered a survey about their preferred methods for priming expressivity (vocal modelling, using metaphor, tapping into felt emotions, and drawing on past experiences) in three conditions (active, passive, and instructor). Analysis revealed a higher preference for drawing on past experiences among more experienced singers. The most preferred technique in the passive and instructor roles was vocal modelling, while metaphors and tapping into felt emotions were favoured in the active role. Priming techniques are often used in combination with other methods to enhance singing technique or expressivity and depend upon the situation, the repertoire, and the preferences of the instructor and performer.

Keywords: emotion, expressivity, performance, singing, teaching

Procedia PDF Downloads 150
16949 An Approach to Correlate the Statistical-Based Lorenz Method, as a Way of Measuring Heterogeneity, with Kozeny-Carman Equation

Authors: H. Khanfari, M. Johari Fard

Abstract:

Dealing with carbonate reservoirs can be mind-boggling for reservoir engineers due to the various diagenetic processes that produce a variety of properties throughout the reservoir. A good estimation of reservoir heterogeneity, defined as the degree of variation in rock properties with location in a reservoir or formation, can help in modeling the reservoir and thus offer a better understanding of its behavior. Most reservoirs are heterogeneous formations whose mineralogy, organic content, natural fractures, and other properties vary from place to place. Over the years, reservoir engineers have tried to establish methods to describe this heterogeneity, because heterogeneity is important in modeling reservoir flow and in well testing. Geological methods describe the variations in rock properties on the basis of the similarities of the environments in which different beds were deposited. To quantify the vertical heterogeneity of a reservoir, two methods are generally used in petroleum work: the Dykstra-Parsons permeability variation (V) and the Lorenz coefficient (L), both of which are reviewed briefly in this paper. The Lorenz concept is rooted in statistics and has been used in petroleum engineering from that point of view. In this paper, we correlated the statistics-based Lorenz method with a petroleum concept, the Kozeny-Carman equation, and derived the straight-line Lorenz plot for a homogeneous system. Finally, we applied the two methods to a heterogeneous field in southern Iran and discussed each separately, with numbers and figures. As expected, these methods show great departure from homogeneity; therefore, for future investment, the reservoir needs to be treated carefully.
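For readers unfamiliar with the Lorenz coefficient, a minimal sketch is shown below: layers are ordered by k/φ, cumulative flow capacity (k·h) is plotted against cumulative storage capacity (φ·h), and L is twice the area between that curve and the unit-slope line of a homogeneous system. The three-layer data are invented for illustration.

```python
def lorenz_coefficient(perm, poro, thickness):
    """Lorenz coefficient from layered permeability k, porosity phi,
    and thickness h. L = 0 for a homogeneous system, L -> 1 for extreme
    heterogeneity."""
    layers = sorted(zip(perm, poro, thickness),
                    key=lambda l: l[0] / l[1], reverse=True)
    kh_tot = sum(k * h for k, p, h in layers)
    ph_tot = sum(p * h for k, p, h in layers)
    pts = [(0.0, 0.0)]
    ckh = cph = 0.0
    for k, p, h in layers:
        ckh += k * h
        cph += p * h
        pts.append((cph / ph_tot, ckh / kh_tot))
    # trapezoidal area under the Lorenz curve, then twice the excess over 0.5
    area = sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
    return 2 * (area - 0.5)

# hypothetical 3-layer column: permeability (md), porosity (frac), thickness (m)
print(round(lorenz_coefficient([200, 50, 5], [0.20, 0.15, 0.10], [1, 1, 1]), 3))
```

A uniform k/φ column traces the diagonal and returns L = 0, the straight-line homogeneous case derived in the paper.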

Keywords: carbonate reservoirs, heterogeneity, homogeneous system, Dykstra-Parsons permeability variations (V), Lorenz coefficient (L)

Procedia PDF Downloads 214
16948 Analysis of the Effect of Farmers’ Socio-Economic Factors on Net Farm Income of Catfish Farmers in Kwara State, Nigeria

Authors: Olanike A. Ojo, Akindele M. Ojo, Jacob H. Tsado, Ramatu U. Kutigi

Abstract:

The study analyzed the effect of farmers' socio-economic factors on the net farm income of catfish farmers in Kwara State, Nigeria. Primary data were collected from selected catfish farmers with the aid of a well-structured questionnaire, and a multistage sampling technique was used to select 102 catfish farmers in the area. The analytical techniques involved descriptive statistics and multiple regression analysis. The analysis of the farmers' socio-economic characteristics revealed that 60% of the catfish farmers in the study area were male, which implies the existence of gender inequality in the area. The mean age of 47 years indicated that the farmers were at an economically productive age and could contribute positively to increased catfish production. The mean household size was five, as was the mean number of years of experience. The latter implied that the farmers were experienced in fishing techniques, breeding and fish culture, which would help generate more revenue, reduce the cost of production and eventually increase the farmers' profit levels. The results also revealed that stock capacity (X3), accessibility to credit (X7) and labour (X4) were the main determinants of catfish production in the area. In addition, the farmer's sex, household size, number of ponds, distance of the farm from the market, and access to credit were the main socio-economic factors influencing the net farm income of the catfish farmers. The most serious constraints militating against catfish production in the study area were a high mortality rate, an insufficient market, inadequate credit facilities and finance, and inadequate skilled labour for the daily production routine. Based on these findings, it is recommended that, to reduce the catfish mortality rate, extension agents should organize training workshops on improved methods and techniques of raising catfish from juvenile to market size.
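The regression step can be illustrated with a deliberately simplified one-variable least-squares fit (the study itself used multiple regression over several factors); the stock and income figures below are invented, not the survey data.

```python
def simple_ols(xs, ys):
    """Intercept a and slope b of y = a + b*x by ordinary least squares;
    a one-variable stand-in for the study's multiple regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# hypothetical farms: stock capacity (fish) vs net farm income ('000 naira)
stock  = [500, 800, 1200, 1500, 2000]
income = [150, 260, 390, 470, 640]
a, b = simple_ols(stock, income)
print(round(b, 3))  # estimated income gained per extra fish stocked
```

A positive, significant slope on stock capacity is exactly the kind of evidence behind the abstract's claim that stock capacity is a main determinant of income.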

Keywords: credit, income, stock, mortality

Procedia PDF Downloads 325
16947 Motivational Orientation of the Methodical System of Teaching Mathematics in Secondary Schools

Authors: M. Rodionov, Z. Dedovets

Abstract:

The article analyses the composition and structure of a motivationally oriented methodological system for teaching mathematics (purpose, content, methods, forms, and means of teaching), viewed through the prism of the student as the subject of the learning process. Particular attention is paid to the problem of methods of teaching mathematics, which are represented in the form of an ordered triad of attributes corresponding to the selected characteristics. A systematic analysis of the possible options and their methodological interpretation enriches existing ideas about known methods and technologies of training and significantly expands their nomenclature by including previously unstudied combinations of characteristics. In addition, the examples outlined in this article illustrate the possibility of enhancing the motivational capacity of a particular method or technology in the real practice of teaching mathematics through freer goal-setting and by varying the conditions of problem situations. The authors recommend implementing different strategies according to their characteristics in teaching and learning mathematics in secondary schools.

Keywords: education, methodological system, teaching of mathematics, student motivation

Procedia PDF Downloads 351
16946 Miniaturizing the Volumetric Titration of Free Nitric Acid in U(VI) Solutions: On the Lookout for a More Sustainable Radioanalytical Chemistry Process through Titration-on-a-Chip

Authors: Jose Neri, Fabrice Canto, Alastair Magnaldo, Laurent Guillerme, Vincent Dugas

Abstract:

A miniaturized and automated approach to the volumetric titration of free nitric acid in U(VI) solutions is presented. Free acidity measurement refers to the quantification of acidity in solutions containing hydrolysable heavy metal ions such as U(VI), U(IV) or Pu(IV), without taking into account the acidity contribution from the hydrolysis of those metal ions. It is, in fact, an operation with an essential role in the control of the nuclear fuel recycling process. The main objectives behind the technical optimization of the current 'beaker' method were to reduce the amount of radioactive substance handled by laboratory personnel, to ease the adjustment of the instrumentation within a glove-box environment, and to allow high-throughput analysis for more cost-effective operations. The measurement technique is based on Taylor-Aris dispersion, which creates a linear concentration gradient inside a 200 μm × 5 cm circular cylindrical microchannel in less than a second. The proposed analytical methodology relies on actinide complexation with a pH 5.6 sodium oxalate solution and subsequent alkalimetric titration of the nitric acid with sodium hydroxide. The titration process is followed with a CCD camera for fluorescence detection; thanks to the addition of a pH-sensitive fluorophore, the neutralization boundary can be visualized in a detection range of 500-600 nm. The operating principle of the device allows the active generation of linear concentration gradients in a single cylindrical microchannel. This feature simplifies the fabrication and ease of use of the microdevice, as it needs neither a complex microchannel network nor passive mixers to generate the chemical gradient.
Moreover, since the linear gradient is determined by the input pressure of the liquid reagents, it can be generated in well under a second, a more time-efficient process than other source-sink passive diffusion devices. The resulting linear gradient generator was therefore adapted to perform, for the first time, a volumetric titration on a chip in which the amount of reagents used is fixed by the total volume of the microchannel, avoiding the substantial waste generated by other flow-based titration techniques. The associated analytical method is automated, and its linearity has been proven for the free acidity determination of U(VI) samples containing up to 0.5 M of the actinide ion and nitric acid in the concentration range 0.5 M to 3 M. In addition to automation, the developed methodology and technique greatly improve on the standard off-line oxalate complexation and alkalimetric titration method by reducing the required sample volume a thousand-fold, the nuclear waste per analysis forty-fold, and the analysis time eight-fold. The device therefore represents a great step towards an easy-to-handle nuclear-related application, which in the short term could improve laboratory safety as much as it reduces the environmental impact of the radioanalytical chain.
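As a rough illustration of how a detected neutralization boundary maps back to free acidity under a linear-gradient principle, the sketch below assumes an idealized 1-D channel in which the acid concentration falls linearly to zero along its length against a uniform titrant; the function and numbers are invented for illustration and are not the device's calibration law.

```python
def free_acidity_from_boundary(x_boundary, length, c_base):
    """Idealized 1-D model: acid concentration falls linearly from c_acid
    at x = 0 to 0 at x = length, titrated against a uniform base c_base.
    Neutrality at x_b implies c_acid * (1 - x_b/length) = c_base, so
    c_acid = c_base / (1 - x_b/length)."""
    if not 0 <= x_boundary < length:
        raise ValueError("boundary must lie inside the channel")
    return c_base / (1.0 - x_boundary / length)

# fluorescence boundary seen at 2.5 cm along a 5 cm channel with 1 M NaOH
print(free_acidity_from_boundary(2.5, 5.0, 1.0))  # 2.0 (mol/L)
```

The farther the boundary travels along the channel, the higher the inferred free acidity, which is the qualitative behaviour a calibration of such a device would quantify.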

Keywords: free acidity, lab-on-a-chip, linear concentration gradient, Taylor-Aris dispersion, volumetric titration

Procedia PDF Downloads 384
16945 Translation Training in the AI Era

Authors: Min Gao

Abstract:

In the past year, the advent of large language models (LLMs) has brought about a revolution in the language service industry, making it possible to produce more satisfactory, higher-quality translations efficiently. This is groundbreaking news for commercial companies involved in language services, since much of a translator's work can now be completed by machines. However, it may be bad news for universities that provide translation training programs: they need to confront the challenges posed by AI in education by reconsidering issues such as the reform of traditional teaching methods, the translation ethics of students, and the new demands of the job market for their graduates. This article is an exploratory study of these issues based on the author's experiences in translation teaching. The research combines questionnaires and interviews. The findings include: (1) students may lose their motivation to learn in the AI era, but this can be compensated for by encouragement from the lecturer; (2) translation ethics are not a serious problem in schools, considering the strict policies and regulations in place; (3) the role of translators has evolved in the new era, necessitating a reform of traditional teaching methods.

Keywords: job market of translation, large language model, translation ethics, translation training

Procedia PDF Downloads 63
16944 Normal Weight Obesity among Female Students: BMI as a Non-Sufficient Tool for Obesity Assessment

Authors: Krzysztof Plesiewicz, Izabela Plesiewicz, Krzysztof Chiżyński, Marzenna Zielińska

Abstract:

Background: Obesity is an independent risk factor for cardiovascular diseases. Several anthropometric parameters have been proposed to estimate the level of obesity, but there is still no agreement on which one best predicts cardiometabolic risk. Scientists have defined metabolically obese normal weight individuals, who suffer from the same metabolic abnormalities as obese individuals, and termed this syndrome normal weight obesity (NWO). Aim of the study: The aim of our study was to determine the occurrence of overweight and obesity in a cohort of young adult women, using standard and complementary methods of obesity assessment, and to identify those at risk of obesity. The second aim was to test additional methods of obesity assessment and show that body mass index (BMI) used alone is not a sufficient parameter for obesity assessment. Materials and methods: 384 young women, aged 18-32, were enrolled into the study. Standard anthropometric parameters (waist to hips ratio (WTH), waist to height ratio (WTHR)) and two other methods of body fat percentage measurement (BFPM) were used: electrical bioimpedance analysis (BIA) and a skinfold measurement test with a digital body fat caliper (SFM). Results: In the study group, 5% and 7% of participants had waist to hips ratio and waist to height ratio values, respectively, associated with visceral obesity. According to BMI, 14% of participants were overweight or obese. Using the additional methods of body fat assessment, 54% and 43% of participants were obese by the BIA and SFM methods, respectively. Among participants with normal or low BMI (not overweight, n = 340), there were individuals with BFPM above the upper limit: 49% (n = 164) by BIA and 36% (n = 125) by SFM. Statistical analysis revealed a strong correlation between the BIA and SFM methods. Conclusion: BMI used alone is not a sufficient parameter for obesity assessment. A high percentage of young women with normal BMI values appear to be normal weight obese.
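The abstract's central point, that a normal BMI can hide excess body fat, can be sketched in a few lines. The BMI cut-off of 25 is the standard WHO overweight threshold, while the 30% body-fat cut-off for women is an assumed illustrative value, not the study's exact criterion.

```python
# Sketch: why BMI alone can miss "normal weight obesity" (NWO).
# The 30% body-fat cut-off is an assumed illustrative threshold,
# not the criterion used in the study.

def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

def is_normal_weight_obese(weight_kg, height_m, body_fat_pct, bf_cutoff=30.0):
    """Normal or low BMI (< 25) but body-fat percentage above the cut-off."""
    return bmi(weight_kg, height_m) < 25.0 and body_fat_pct > bf_cutoff

# A subject with normal BMI but high measured body fat is flagged as NWO:
print(round(bmi(60.0, 1.68), 1))                 # BMI ~21.3, "normal"
print(is_normal_weight_obese(60.0, 1.68, 34.0))  # True: NWO despite normal BMI
print(is_normal_weight_obese(60.0, 1.68, 24.0))  # False
```

The same BMI value thus maps to opposite classifications depending on the measured body-fat percentage, which is exactly why the study argues for complementary methods such as BIA or skinfold measurement.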

Keywords: electrical bioimpedance, normal weight obesity, skin-fold measurement test, women

Procedia PDF Downloads 268
16943 Research of Database Curriculum Construction under the Environment of Massive Open Online Courses

Authors: Wang Zhanquan, Yang Zeping, Gu Chunhua, Zhu Fazhi, Guo Weibin

Abstract:

Recently, Massive Open Online Courses (MOOCs) have become the new trend in education. Teaching the Database Principle curriculum in the MOOCs environment raises many problems, such as teaching ideas and theories that are out of touch with reality and the question of how to carry out technical teaching and interactive practice in the MOOCs environment; methods for the database course in the MOOCs environment are therefore proposed. The research addresses problem solving in three stages: posing the problems, solving the problems, and inductive analysis. The present research covers the design of teaching contents, classroom teaching methods, a flipped classroom teaching mode in the MOOCs environment, the learning flow method, and large practice homework. Database design ability is systematically improved through these research methods.

Keywords: problem solving-driven, MOOCs, teaching art, learning flow

Procedia PDF Downloads 362
16942 Drone Classification Using a Conventional Model with Embedded Audio-Visual Features

Authors: Hrishi Rakshit, Pooneh Bagheri Zadeh

Abstract:

This paper investigates the performance of drone classification methods using a conventional DCNN with different hyperparameters, when additional drone audio data is embedded in the dataset for training and classification. First, a custom dataset is created using different drone images from the University of Southern California (USC) and Leeds Beckett University datasets, with embedded drone audio signals. Three well-known DCNN architectures, namely ResNet50, Darknet53 and ShuffleNet, are employed on the created dataset, tuning hyperparameters such as the learning rate, maximum number of epochs, and mini-batch size with different optimizers. Precision-recall curves and F1 score versus threshold curves are used to evaluate the performance of the named classification algorithms. Experimental results show that ResNet50 has the highest efficiency compared to the other DCNN methods.
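The evaluation described above can be reproduced in miniature: given raw classifier scores and ground-truth labels, precision, recall, and F1 are computed at each threshold to trace the two curves. The scores and labels below are toy values, not the paper's data.

```python
# Sketch: points on a precision-recall curve and an F1-vs-threshold curve,
# computed from raw classifier scores. Toy data, not the paper's results.

def pr_f1_at_threshold(scores, labels, threshold):
    preds = [s >= threshold for s in scores]
    tp = sum(p and l for p, l in zip(preds, labels))
    fp = sum(p and not l for p, l in zip(preds, labels))
    fn = sum((not p) and l for p, l in zip(preds, labels))
    precision = tp / (tp + fp) if tp + fp else 1.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]          # classifier confidences
labels = [True, True, False, True, False, False]  # ground truth ("drone"?)
for t in (0.2, 0.5, 0.8):
    p, r, f1 = pr_f1_at_threshold(scores, labels, t)
    print(f"t={t:.1f}  precision={p:.2f}  recall={r:.2f}  F1={f1:.2f}")
```

Sweeping the threshold over all distinct score values yields the full curves used in the paper to compare the three architectures.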

Keywords: drone classifications, deep convolutional neural network, hyperparameters, drone audio signal

Procedia PDF Downloads 99
16941 Communicating Meaning through Translanguaging: The Case of Multilingual Interactions of Algerians on Facebook

Authors: F. Abdelhamid

Abstract:

Algeria is a multilingual speech community where individuals constantly mix between codes in spoken discourse. Code is used as a cover term for the existing languages and language varieties, which include, among others, the mother tongue of the majority, Algerian Arabic, the official language, Modern Standard Arabic, and the foreign languages French and English. The present study explores whether Algerians mix between these codes in online communication as well. Facebook is the platform from which data is collected because it is the preferred and most used social media site among Algerians. Adopting the notion of translanguaging, this study attempts to explain how Facebook users employ multilingual messages to communicate meaning. Accordingly, multilingual interactions are approached not from a pejorative perspective but as a creative linguistic behavior that multilinguals utilize to achieve intended meanings. The study is intended as a contribution to research on multilingualism online because, although an extensive literature has investigated multilingualism in spoken discourse, limited research has investigated it online. Its aim is two-fold. First, it aims at ensuring that the selected platform, namely Facebook, can serve as a source of multilingual data for the qualitative analysis; this is done by measuring frequency rates of multilingual instances. Second, once enough multilingual instances are encountered, it aims at describing and interpreting selected ones. 120 posts and 16335 comments were collected from two Facebook pages. Analysis revealed that a third of the collected data are multilingual messages. Facebook users mixed between the four mentioned codes in writing their messages; the most frequent cases are mixing between Algerian Arabic and French and between Algerian Arabic and Modern Standard Arabic. A focused qualitative analysis followed, in which selected examples are interpreted and explained. It seems that Algerians mix between codes when communicating online even though writing is a conscious type of communication. This suggests that such behavior is not a random and corrupted way of communicating but rather an intentional and natural one.

Keywords: Algerian speech community, computer mediated communication, languages in contact, multilingualism, translanguaging

Procedia PDF Downloads 127
16940 Unsupervised Domain Adaptive Text Retrieval with Query Generation

Authors: Rui Yin, Haojie Wang, Xun Li

Abstract:

Recently, mainstream dense retrieval methods have obtained state-of-the-art results on some datasets and tasks. However, they require large amounts of training data, which are not available in most domains. The severe performance degradation of dense retrievers on new data domains has limited the use of dense retrieval methods to the few domains with large training datasets. In this paper, we propose an unsupervised domain-adaptive approach based on query generation. First, a generative model is used to generate relevant queries for each passage in the target corpus; the generated queries are then used for mining negative passages. Finally, the query-passage pairs are labeled with a cross-encoder and used to train a domain-adapted dense retriever. Experiments show that our approach is more robust than previous methods in target domains, while requiring only unlabeled data.
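The three stages above (query generation, negative mining, pseudo-labeling) can be sketched with toy stand-ins. `generate_query`, `mine_negative`, and the word-overlap "cross-encoder" below are hypothetical placeholders for the actual seq2seq generator, retriever, and trained cross-encoder, not real model calls.

```python
# Sketch of the unsupervised adaptation pipeline with toy stand-ins.
# Every function here is a hypothetical placeholder for a learned model.

def generate_query(passage):
    # stand-in for a generative model: use the first few words as a query
    return " ".join(passage.lower().split()[:4])

def overlap_score(query, passage):
    # stand-in for a cross-encoder: fraction of query tokens in the passage
    q, p = set(query.lower().split()), set(passage.lower().split())
    return len(q & p) / max(len(q), 1)

def mine_negative(query, corpus, positive):
    # hard negative = highest-scoring passage that is not the source passage
    candidates = [p for p in corpus if p != positive]
    return max(candidates, key=lambda p: overlap_score(query, p))

def build_training_pairs(corpus):
    pairs = []
    for passage in corpus:
        q = generate_query(passage)
        neg = mine_negative(q, corpus, passage)
        # the overlap score plays the cross-encoder's labeling role here
        pairs.append((q, passage, neg, overlap_score(q, neg)))
    return pairs

corpus = [
    "dense retrieval maps queries and passages to vectors",
    "query generation creates synthetic training data",
    "cross encoders rescore query passage pairs",
]
for q, pos, neg, label in build_training_pairs(corpus):
    print(q, "| neg:", neg[:30], "| label:", round(label, 2))
```

The resulting (query, positive, negative, label) tuples are what a dense retriever would be fine-tuned on; only the unlabeled target corpus is needed as input.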

Keywords: dense retrieval, query generation, unsupervised training, text retrieval

Procedia PDF Downloads 64
16939 Determination of Mechanical Properties of Adhesives via Digital Image Correlation (DIC) Method

Authors: Murat Demir Aydin, Elanur Celebi

Abstract:

Adhesively bonded joints are used as an alternative to traditional joining methods due to the important advantages they provide. The most important consideration in the use of adhesively bonded joints is that they meet the appropriate safety requirements. To ensure this, damage analysis of adhesively bonded joints should be performed by determining the mechanical properties of the adhesives. In the literature, the mechanical properties of adhesives are generally determined by traditional measurement methods. In this study, the Digital Image Correlation (DIC) method, which can be an alternative to traditional measurement methods, has been used to determine the mechanical properties of adhesives. The DIC method is a relatively new optical measurement method used to determine displacement and strain conveniently and accurately. In this study, tensile tests were performed on Thick Adherend Shear Test (TAST) samples formed from DP410 liquid structural adhesive and steel adherends, and on bulk tensile specimens formed from DP410 liquid structural adhesive. The displacement and strain values of the samples were determined by the DIC method, and the shear stress-strain curves of the adhesive for the TAST specimens and the tensile stress-strain curves of the bulk adhesive specimens were obtained. Because conventional measurement methods (strain gauges, mechanical extensometers, etc.) are not sufficient for determining the strain and displacement values of a very thin adhesive layer such as that in TAST samples, additional approaches such as numerical methods are normally required. The DIC method removes these requirements and easily achieves displacement measurements with sufficient accuracy.
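The core of DIC, locating the displacement that best aligns a deformed image with a reference, can be illustrated with a brute-force normalized cross-correlation search. A real DIC system does this per subset with sub-pixel interpolation; this is only a minimal numpy sketch on a synthetic speckle pattern.

```python
# Sketch: recovering a rigid integer-pixel shift between two images by
# maximising normalised cross-correlation, the basic operation behind DIC.
import numpy as np

def ncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def find_shift(ref, deformed, max_shift=5):
    """Brute-force search for the (dy, dx) that best aligns the images."""
    best, best_shift = -2.0, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = ncc(ref, np.roll(np.roll(deformed, -dy, 0), -dx, 1))
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift

rng = np.random.default_rng(0)
speckle = rng.random((64, 64))                     # synthetic speckle pattern
shifted = np.roll(np.roll(speckle, 3, 0), -2, 1)   # displace by (3, -2)
print(find_shift(speckle, shifted))                # recovers (3, -2)
```

Repeating this search for many small subsets across the specimen surface yields the full displacement field, from which the strain field of the thin adhesive layer is differentiated.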

Keywords: structural adhesive, adhesively bonded joints, digital image correlation, thick adherend shear test (TAST)

Procedia PDF Downloads 312
16938 Solving Linear Systems Involved in Convex Programming Problems

Authors: Yixun Shi

Abstract:

Many interior point methods for convex programming solve an (n+m)×(n+m) linear system in each iteration. Many implementations solve this system by considering an equivalent m×m system (4), as listed in the paper, so the job reduces to solving system (4). However, system (4) has to be solved exactly, since otherwise the error would be passed entirely onto the last m equations of the original system. Often a Cholesky factorization is computed to obtain the exact solution of (4). With one Cholesky factorization per iteration, the computational cost is high. In this paper, two iterative methods for solving linear systems using vector division are combined and embedded into interior point methods. Instead of computing one Cholesky factorization in each iteration, the approach requires only one Cholesky factorization in the entire procedure, significantly reducing the amount of computation needed for solving the problem. Based on this, a hybrid algorithm for solving convex programming problems is proposed.
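The saving the paper targets, one expensive factorization followed by many cheap triangular solves, can be sketched as follows. The SPD matrix and right-hand sides are random toys standing in for system (4), not the paper's actual system.

```python
# Sketch: factor an SPD m x m system ONCE with Cholesky (O(m^3)), then
# reuse the factor so each subsequent "iteration" costs only O(m^2).
import numpy as np

def forward_sub(L, b):
    """Solve L x = b for lower-triangular L."""
    x = np.zeros_like(b)
    for i in range(len(b)):
        x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
    return x

def back_sub(U, b):
    """Solve U x = b for upper-triangular U."""
    n = len(b)
    x = np.zeros_like(b)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

m = 5
rng = np.random.default_rng(1)
A = rng.random((m, m))
M = A @ A.T + m * np.eye(m)   # symmetric positive definite toy system matrix

L = np.linalg.cholesky(M)     # the single O(m^3) factorization
for k in range(3):            # each solve reuses L: M x = b via L L^T x = b
    b = rng.random(m)
    x = back_sub(L.T, forward_sub(L, b))
    print(f"iter {k}: residual = {np.linalg.norm(M @ x - b):.1e}")
```

In an interior point method the matrix changes between iterations, which is why a reusable factorization normally isn't possible; the paper's contribution is an iterative scheme that sidesteps the per-iteration refactorization.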

Keywords: convex programming, interior point method, linear systems, vector division

Procedia PDF Downloads 396
16937 Tool for Maxillary Sinus Quantification in Computed Tomography Exams

Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina

Abstract:

The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, heat or humidify inspired air, aid thermoregulation, impart resonance to the voice, among other roles. Thus, the real function of the MS is still uncertain. Furthermore, MS anatomy is complex and varies from person to person. Many diseases may affect the development of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables quantitative analysis. However, this is not always possible in the clinical routine, and when possible, it involves much effort and/or time. Therefore, a convenient, robust, and practical tool correlated with the MS volume is needed to allow clinical applicability. Currently, the available methods for MS segmentation are manual or semi-automatic; additionally, manual methods present inter- and intraindividual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors’ institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) based on features such as pixel value, spatial distribution, and shape.
The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, obtaining the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. In the statistical comparison between the two methods, linear regression showed a strong association and low dispersion between variables, the Bland-Altman analyses showed no significant differences between the methods, and the Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases.
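The region-growing step of the pipeline can be illustrated on a toy 2-D "slice": starting from a seed (standing in for the SVM detection), neighbouring pixels join the region while their intensity stays within a tolerance of the seed value. The image, intensities, and tolerance below are invented for illustration; the real tool works on 3-D CT data with morphological clean-up.

```python
# Sketch: region growing from a seed pixel on a synthetic 2-D "slice".
# The dark square stands in for the air-filled sinus on a CT image.
import numpy as np

def region_grow(img, seed, tol=0.2):
    grown = np.zeros(img.shape, dtype=bool)
    stack, seed_val = [seed], img[seed]
    while stack:
        y, x = stack.pop()
        if grown[y, x] or abs(img[y, x] - seed_val) > tol:
            continue
        grown[y, x] = True
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]:
                stack.append((ny, nx))
    return grown

slice_ = np.full((8, 8), 0.9)   # bright background ("bone/tissue")
slice_[2:6, 2:6] = 0.1          # dark 4x4 square ("air-filled sinus")
mask = region_grow(slice_, seed=(3, 3))
print(mask.sum(), "pixels in region")   # 16: exactly the dark square
```

Summing the segmented pixels over all slices and multiplying by the voxel volume is what yields the MS volume reported by the tool.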

Keywords: maxillary sinus, support vector machine, region growing, volume quantification

Procedia PDF Downloads 501
16936 Bi-Dimensional Spectral Basis

Authors: Abdelhamid Zerroug, Mlle Ismahene Sehili

Abstract:

Spectral methods are usually applied to solve uni-dimensional boundary value problems. Taking advantage of the creation of multidimensional bases, we propose a new spectral method for bi-dimensional problems. In this article, we start by creating bi-spectral bases in different ways; we also develop new relations to determine the expressions of the spectral coefficients in expansions of different partial derivatives. Finally, we propose the principle of a new bi-spectral method for bi-dimensional problems.
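A minimal sketch of one way such a basis can be built, assuming tensor products of 1-D Legendre polynomials (the keywords mention a bi-dimensional Legendre basis): φᵢⱼ(x, y) = Pᵢ(x)·Pⱼ(y) on [-1, 1]², with orthogonality checked by Gauss-Legendre quadrature. This illustrates the construction only, not the authors' full method.

```python
# Sketch: a bi-dimensional spectral basis from tensor products of 1-D
# Legendre polynomials, with an orthogonality check via 2-D quadrature.
import numpy as np
from numpy.polynomial import legendre

def phi(i, j, x, y):
    """Tensor-product basis function P_i(x) * P_j(y) on [-1, 1]^2."""
    ci = [0] * i + [1]
    cj = [0] * j + [1]
    return legendre.legval(x, ci) * legendre.legval(y, cj)

# 2-D Gauss-Legendre quadrature nodes and weights
nodes, weights = legendre.leggauss(8)
X, Y = np.meshgrid(nodes, nodes)
W = np.outer(weights, weights)

def inner(i, j, k, l):
    """Quadrature approximation of the L2 inner product <phi_ij, phi_kl>."""
    return float((W * phi(i, j, X, Y) * phi(k, l, X, Y)).sum())

print(round(inner(2, 3, 2, 3), 4))   # nonzero: squared norm of phi_23
print(round(inner(2, 3, 1, 3), 4))   # ~0: distinct basis functions
```

The norm terms follow the 1-D values, ‖φᵢⱼ‖² = (2/(2i+1))·(2/(2j+1)), which is the property that makes expansion coefficients cheap to extract in a spectral method.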

Keywords: boundary value problems, bi-spectral methods, bi-dimensional Legendre basis, spectral method

Procedia PDF Downloads 385
16935 The Use of Religious Symbols in the Workplace: Remarks on the Latest Case Law

Authors: Susana Sousa Machado

Abstract:

The debate on the use of religious symbols has been highlighted in modern societies, especially in the field of labour relationships. As litigiousness appears to be growing, the matter requires a careful study from a legal perspective. In this context, a description and critical analysis of the most recent case law is conducted regarding the use of symbols by the employee in the workplace, delivered both by the European Court of Human Rights and by the Court of Justice of the European Union. From this comparative analysis we highlight the most relevant aspects in order to seek a common core regarding the juridical-argumentative approach of case law.

Keywords: religion, religious symbols, workplace, discrimination

Procedia PDF Downloads 413