Search results for: safety performance
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15640

2770 A Cost Effective Approach to Develop Mid-Size Enterprise Software Adopting the Waterfall Model

Authors: Mohammad Nehal Hasnine, Md Kamrul Hasan Chayon, Md Mobasswer Rahman

Abstract:

Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under budget constraints and strict deadlines is always challenging for software engineers. Therefore, we introduce a cost-effective approach to designing mid-size enterprise software using the Waterfall model, one of the SDLC (Software Development Life Cycle) models. To fulfill the research objectives, we developed mid-size enterprise software named “BSK Management System”, which assists enterprise software clients with information resource management and performs complex organizational tasks. Waterfall model phases were applied to ensure that all functions, user requirements, strategic goals, and objectives were met. In addition, Rich Picture, Structured English, and a Data Dictionary were implemented and investigated in a rigorous engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system featured simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.

Keywords: end-user application development, enterprise software design, information resource management, usability

Procedia PDF Downloads 436
2769 Bilingualism Contributes to Cognitive Reserve in Parkinson's Disease

Authors: Arrate Barrenechea Garro

Abstract:

Background: Bilingualism has been shown to enhance cognitive reserve and potentially delay the onset of dementia symptoms. This study investigates the impact of bilingualism on cognitive reserve and the age of diagnosis in Parkinson's Disease (PD). Methodology: The study involves 16 non-demented monolingual PD patients and 12 non-demented bilingual PD patients, matched for age, sex, and years of education. All participants are native Spanish speakers, with Spanish as their first language (L1). Cognitive performance is assessed through a neuropsychological examination covering all cognitive domains. Cognitive reserve is measured using the Cognitive Reserve Index Questionnaire (CRIq), while language proficiency is evaluated using the Bilingual Language Profile (BLP). The age at diagnosis is recorded for both monolingual and bilingual patients. Results: Bilingual PD patients demonstrate higher scores on the CRIq compared to monolingual PD patients, with significant differences between the groups. Furthermore, there is a positive correlation between cognitive reserve (CRIq) and the utilization of the second language (L2) as indicated by the BLP. Bilingual PD patients are diagnosed, on average, three years later than monolingual PD patients. Conclusion: Bilingual PD patients exhibit higher levels of cognitive reserve compared to monolingual PD patients, as indicated by the CRIq scores. The utilization of the second language (L2) is positively correlated with cognitive reserve. Bilingual PD patients are diagnosed with PD, on average, three years later than monolingual PD patients. These findings suggest that bilingualism may contribute to cognitive reserve and potentially delay the onset of clinical symptoms associated with PD. This study adds to the existing literature supporting the relationship between bilingualism and cognitive reserve. 
Further research in this area could provide valuable insights into the potential protective effects of bilingualism in neurodegenerative disorders.

Keywords: bilingualism, cognitive reserve, diagnosis, Parkinson's disease

Procedia PDF Downloads 99
2768 Classifier for Liver Ultrasound Images

Authors: Soumya Sajjan

Abstract:

Liver cancer is among the most common cancers worldwide in both men and women, and is one of the few cancers still on the rise. Liver disease is the fourth leading cause of death. According to new NHS (National Health Service) figures, deaths from liver diseases have reached record levels, rising by 25% in less than a decade; heavy drinking, obesity, and hepatitis are believed to be behind the rise. In this study, we focus on the development of a diagnostic classifier for ultrasound liver lesions. Ultrasound (US) sonography is an easy-to-use and widely popular imaging modality because of its ability to visualize many human soft tissues and organs without any harmful effect. This paper provides an overview of the underlying concepts, along with algorithms for processing liver ultrasound images. Naturally, ultrasound liver lesion images contain considerable speckle noise, so developing a classifier for them is a challenging task. We propose a fully automatic machine learning system for developing this classifier. First, we segment the liver image by calculating textural features from the co-occurrence matrix and the run-length method. For classification, a Support Vector Machine (SVM) is used, based on the risk bounds of statistical learning theory. The textural features from the different feature methods are given as input to the SVM individually. Performance analysis on the training and test datasets is carried out separately using the SVM model. Whenever an ultrasonic liver lesion image is given to the SVM classifier system, the features are calculated and the lesion is classified as a normal or diseased liver lesion. We hope the result will help physicians identify liver cancer non-invasively.
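
The feature-extraction step described above (co-occurrence-matrix texture features feeding an SVM) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 8-level gray quantization, the single pixel offset, and the random patch are assumptions, and in the full pipeline the resulting feature vector would be passed to an SVM classifier (e.g. sklearn.svm.SVC).

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Gray-level co-occurrence matrix for one offset, normalized to probabilities."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def texture_features(p):
    """Haralick-style contrast, energy, and homogeneity from a normalized GLCM."""
    i, j = np.indices(p.shape)
    return {
        "contrast": float((p * (i - j) ** 2).sum()),
        "energy": float((p ** 2).sum()),
        "homogeneity": float((p / (1.0 + np.abs(i - j))).sum()),
    }

# Hypothetical 8-level quantized ultrasound patch (stand-in for a segmented lesion)
rng = np.random.default_rng(0)
patch = rng.integers(0, 8, size=(32, 32))
feats = texture_features(glcm(patch))  # feature vector for one patch -> SVM input
```

A constant (texture-free) patch yields zero contrast and maximal energy, which is a quick sanity check on the feature definitions.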

Keywords: segmentation, support vector machine, ultrasound liver lesion, co-occurrence matrix

Procedia PDF Downloads 408
2767 Design and Testing of Electrical Capacitance Tomography Sensors for Oil Pipeline Monitoring

Authors: Sidi M. A. Ghaly, Mohammad O. Khan, Mohammed Shalaby, Khaled A. Al-Snaie

Abstract:

Electrical capacitance tomography (ECT) is a valuable, non-invasive technique used to monitor multiphase flow processes, especially within industrial pipelines. This study focuses on the design, testing, and performance comparison of ECT sensors configured with 8, 12, and 16 electrodes, aiming to evaluate their effectiveness in imaging accuracy, resolution, and sensitivity. Each sensor configuration was designed to capture the spatial permittivity distribution within a pipeline cross-section, enabling visualization of phase distribution and flow characteristics such as oil and water interactions. The sensor designs were implemented and tested in closed pipes to assess their response to varying flow regimes. Capacitance data collected from each electrode configuration were reconstructed into cross-sectional images, enabling a comparison of image resolution, noise levels, and computational demands. Results indicate that the 16-electrode configuration yields higher image resolution and greater sensitivity to phase boundaries than the 8- and 12-electrode setups, making it more suitable for complex flow visualization. However, the 8- and 12-electrode sensors demonstrated advantages in processing speed and lower computational requirements. This comparative analysis provides critical insights into optimizing ECT sensor design for specific industrial requirements, from high-resolution imaging to real-time monitoring needs.
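
The reconstruction step mentioned above can be illustrated with linear back-projection (LBP), the simplest and most widely used ECT reconstruction algorithm; the abstract does not state which algorithm was used, so this is a hedged stand-in. The sensitivity matrix `S`, the 16x16 pixel grid, and the simulated normalized capacitances are all assumptions: in practice `S` comes from an electromagnetic field solver and the capacitances from the electrode-pair measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
n_elec = 8
n_pairs = n_elec * (n_elec - 1) // 2   # 28 independent electrode-pair measurements
n_pix = 16 * 16                        # coarse cross-section pixel grid

# Hypothetical sensitivity matrix: how strongly each pixel's permittivity
# affects each electrode-pair capacitance (normally computed by a field solver)
S = np.abs(rng.normal(size=(n_pairs, n_pix)))

# Capacitances normalized between empty-pipe (0) and full-pipe (1) calibrations
lam = rng.uniform(0.0, 1.0, size=n_pairs)

# Linear back-projection: per-pixel weighted average of the normalized
# capacitances, weighted by each measurement's sensitivity at that pixel
g = (S.T @ lam) / (S.T @ np.ones(n_pairs))
image = g.reshape(16, 16)              # cross-sectional permittivity image
```

Because each pixel is a convex combination of the normalized capacitances, the reconstructed values stay in [0, 1]; a 16-electrode sensor would simply enlarge `n_pairs` to 120, trading computation for resolution, as the abstract discusses.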

Keywords: capacitance tomography, modeling, simulation, electrode, permittivity, fluid dynamics, imaging sensitivity measurement

Procedia PDF Downloads 6
2766 Evaluation of Critical Rate in Mature Oil Field with Dynamic Oil Rim Fluid Contacts in the Niger Delta

Authors: Stanley Ibuchukwu Onwukwe

Abstract:

Most reservoirs in mature oil fields are vulnerable to water and/or gas coning as the size of their oil column shrinks after long periods of oil production. This often results in low oil production and excessive water and/or gas production. After more than 50 years of oil production in the Niger Delta, it is apparent that most of the oil fields in the region have reached their mature stages and are therefore susceptible to coning. As a result, a good number of wells have been shut in and abandoned, with significant amounts of oil left unproduced. Analysis of the movement of fluid contacts in the reservoir is a significant aspect of reservoir studies and can assist in managing coning tendencies and the production performance of reservoirs in a mature field. This study therefore seeks to evaluate the occurrence of coning through the movement of the fluid contacts (the gas-oil contact, GOC, and the oil-water contact, OWC) and to determine the critical rate for controlling coning tendencies in a mature oil field. The study applies the principle of nodal analysis to calibrate the thin oil column of a reservoir in a mature field, evaluated graphically using Joshi's critical-rate equations for the gas-oil and oil-water systems, respectively. A representative proxy equation was developed, and a sensitivity analysis was carried out to determine the trend of the critical rate as the oil column is depleted. The results show the trend in the movement of the GOC and OWC and the critical rate beyond which excessive water and gas production occur, resulting in decreasing oil production from the reservoir. The results of this study can be used as a first-pass assessment in the development of mature oil field reservoirs anticipated to experience water and/or gas coning during production.

Keywords: coning, fluid contact movement, mature oil field, oil production

Procedia PDF Downloads 242
2765 Modified Side Plate Design to Suppress Lateral Torsional Buckling of H-Beam for Seismic Application

Authors: Erwin, Cheng-Cheng Chen, Charles J. Salim

Abstract:

One method of solving the lateral torsional buckling (LTB) problem is to use side plates to increase the buckling resistance of the beam. Some modifications in designing the side plates are made in this study to simplify construction in the field and reduce cost. In certain regions, side plates are not added: (1) at the beam ends, to preserve space for bolt installation, where the beam is instead strengthened by adding cover plates at both flanges, and (2) at the middle of the span, where the moment is smaller. Three small-scale, full-span beam specimens are tested under cyclic loading to investigate the LTB resistance and ductility of the proposed design method. Test results show that LTB deformation can be effectively suppressed and a very high ductility level can be achieved. Following the test, a finite element analysis (FEA) model is established and verified using the test results. An intensive parametric study is conducted using the established FEA model. The analysis reveals that the length of the side plates is the most important parameter determining the performance of the beam, and that the required side plate length is governed by several parameters: (1) the beam depth-to-flange-width ratio, (2) the beam slenderness ratio, (3) the strength and thickness of the side plates, (4) the compactness of the beam web and flange, and (5) the beam yield strength. At the end of the paper, a design formula for calculating the required side plate length is suggested.

Keywords: cover plate, earthquake resistant design, lateral torsional buckling, side plate, steel structure

Procedia PDF Downloads 174
2764 A Comparative Study on Deep Learning Models for Pneumonia Detection

Authors: Hichem Sassi

Abstract:

Pneumonia, being a respiratory infection, has garnered global attention due to its rapid transmission and relatively high mortality rates. Timely detection and treatment play a crucial role in significantly reducing mortality associated with pneumonia. Presently, X-ray diagnosis stands out as a reasonably effective method. However, the manual scrutiny of a patient's X-ray chest radiograph by a proficient practitioner usually requires 5 to 15 minutes. In situations where cases are concentrated, this places immense pressure on clinicians for timely diagnosis. Relying solely on the visual acumen of imaging doctors proves to be inefficient, particularly given the low speed of manual analysis. Therefore, the integration of artificial intelligence into the clinical image diagnosis of pneumonia becomes imperative. Additionally, AI recognition is notably rapid, with convolutional neural networks (CNNs) demonstrating superior performance compared to human counterparts in image identification tasks. To conduct our study, we utilized a dataset comprising chest X-ray images obtained from Kaggle, encompassing a total of 5216 training images and 624 test images, categorized into two classes: normal and pneumonia. Employing five mainstream network algorithms, we undertook a comprehensive analysis to classify these diseases within the dataset, subsequently comparing the results. The integration of artificial intelligence, particularly through improved network architectures, stands as a transformative step towards more efficient and accurate clinical diagnoses across various medical domains.
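
The CNN pipeline referred to above can be illustrated with its basic building blocks. The sketch below is a NumPy toy showing one convolution-ReLU-pooling stage, not any of the five networks compared in the study; the 28x28 random patch and the Sobel-like kernel are assumptions, and a real pneumonia classifier would stack many such learned stages and end in a two-class (normal/pneumonia) output layer.

```python
import numpy as np

def conv2d(x, k):
    """Valid 2D cross-correlation of a single-channel image with a kernel."""
    kh, kw = k.shape
    h, w = x.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (x[i:i + kh, j:j + kw] * k).sum()
    return out

def relu(x):
    """Non-linearity applied after each convolution."""
    return np.maximum(x, 0.0)

def maxpool2(x):
    """2x2 max pooling: halves spatial resolution, keeps strongest responses."""
    h, w = x.shape
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

# One feature-extraction stage on a hypothetical 28x28 chest-radiograph patch
img = np.random.default_rng(2).random((28, 28))
edge_kernel = np.array([[1., 0., -1.], [2., 0., -2.], [1., 0., -1.]])  # Sobel-like
fmap = maxpool2(relu(conv2d(img, edge_kernel)))   # 28x28 -> 26x26 -> 13x13
```

In a trained CNN the kernels are learned rather than hand-set, which is what lets such networks outperform manual film reading on image-identification tasks.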

Keywords: deep learning, computer vision, pneumonia, models, comparative study

Procedia PDF Downloads 64
2763 Simulation of Utility Accrual Scheduling and Recovery Algorithm in Multiprocessor Environment

Authors: A. Idawaty, O. Mohamed, A. Z. Zuriati

Abstract:

This paper presents the development of an event-based Discrete Event Simulation (DES) for a recovery algorithm known as Backward Recovery Global Preemptive Utility Accrual Scheduling (BR_GPUAS). This algorithm implements the Backward Recovery (BR) mechanism as a fault recovery solution under the existing Time/Utility Function / Utility Accrual (TUF/UA) scheduling domain for multiprocessor environments. The BR mechanism takes faulty tasks back to their initial safe state and then re-executes the affected section of the faulty tasks to enable recovery. Considering that faults may occur in the components of any system, a fault tolerance mechanism that can nullify the erroneous effect needs to be developed. Current TUF/UA scheduling algorithms use the abortion recovery mechanism and simply abort the erroneous task as their fault recovery solution. None of the existing algorithms in the TUF/UA multiprocessor scheduling domain has considered transient faults and implemented the BR mechanism as a fault recovery mechanism to nullify the erroneous effect and solve the recovery problem in this domain. The developed BR_GPUAS simulator derives its set of parameters, events, and performance metrics from a detailed analysis of the base model. Simulation results revealed that the BR_GPUAS algorithm can save almost 20-30% of the accumulated utilities, making it reliable and efficient for real-time applications in the multiprocessor scheduling environment.
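
The contrast between abortion recovery and backward recovery can be sketched with a toy single-processor simulation under step time/utility functions (TUFs); this is a simplified illustration of the idea, not the BR_GPUAS simulator itself. The task parameters, the single-transient-fault model, and re-executing the whole task (rather than only the affected section) are assumptions made for brevity.

```python
def simulate(tasks, policy, fault_at):
    """tasks: list of (exec_time, deadline, utility), run in order on one CPU.
    A single transient fault strikes whichever task is running at `fault_at`.
    policy='abort' discards that task (abortion recovery); policy='backward'
    rolls it back to its initial safe state and re-executes it (BR)."""
    clock, accrued, fault_pending = 0.0, 0.0, True
    queue = list(tasks)
    while queue:
        exe, deadline, utility = queue.pop(0)
        start, finish = clock, clock + exe
        if fault_pending and start <= fault_at < finish:
            fault_pending = False
            if policy == "abort":
                clock = fault_at           # task dropped, no utility accrued
                continue
            finish = fault_at + exe        # backward recovery: re-execute
        clock = finish
        if finish <= deadline:             # step TUF: full utility if on time
            accrued += utility
    return accrued

tasks = [(2.0, 5.0, 10.0), (3.0, 9.0, 20.0), (1.0, 10.0, 5.0)]
u_abort = simulate(tasks, "abort", fault_at=1.0)        # faulty task lost
u_backward = simulate(tasks, "backward", fault_at=1.0)  # faulty task recovered
```

In this hypothetical schedule the recovered task still meets its deadline, so backward recovery accrues strictly more utility than abortion, which is the effect the BR_GPUAS results quantify.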

Keywords: real-time system (RTS), time utility function/ utility accrual (TUF/UA) scheduling, backward recovery mechanism, multiprocessor, discrete event simulation (DES)

Procedia PDF Downloads 304
2762 Sustainable Design for Building Envelope in Hot Climates: A Case Study for the Role of the Dome as a Component of an Envelope in Heat Exchange

Authors: Akeel Noori Almulla Hwaish

Abstract:

Architectural design is influenced by the actual thermal behaviour of building components, which in turn depends not only on their steady and periodic thermal characteristics but also on exposure effects, orientation, surface colour, and climatic fluctuations at the given location. Design data and environmental parameters should be produced accurately for specified locations so that architects and engineers can confidently apply them in design calculations that enable precise evaluation of the influence of the various parameters relating to each component of the envelope, which indicates the overall thermal performance of the building. The present paper assesses the thermal behaviour and characteristics of the opaque and transparent parts of one of the most distinctive components of the building envelope, the dome: its thermal behaviour under the impact of solar temperatures and its role in heat exchange, related to the specific U-values of alternative construction materials. The research method considers a hot-dry climate, with a new mosque in Baghdad, Iraq as a case study. Data are presented in light of the criteria of indoor thermal comfort, in terms of design parameters and a thermal assessment of a “model dome”. Design alternatives and energy conservation considerations are discussed using comparative computer simulations. Findings are incorporated to outline conclusions clarifying the important role of the dome in the heat exchange of the whole building envelope in approaching indoor thermal comfort, and to guide further research in the future.

Keywords: building envelope, sustainable design, dome impact, hot-climates, heat exchange

Procedia PDF Downloads 473
2761 The Effect of Self and Peer Assessment Activities in Second Language Writing: A Washback Effect Study on the Writing Growth during the Revision Phase in the Writing Process: Learners’ Perspective

Authors: Musbah Abdussayed

Abstract:

The washback effect refers to the influence of assessment on teaching and learning, and this washback effect can either be positive or negative. This study implemented, sequentially, self-assessment (SA) and peer assessment (PA) and examined the washback effect of self and peer assessment (SPA) activities on the writing growth during the revision phase in the writing process. Twenty advanced Arabic as a second language learners from a private school in the USA participated in the study. The participants composed and then revised a short Arabic story as a part of a midterm grade. Qualitative data was collected, analyzed, and synthesized from ten interviews with the learners and from the twenty learners’ post-reflective journals. The findings indicate positive washback effects on the learners’ writing growth. The PA activity enhanced descriptions and meaning, promoted creativity, and improved textual coherence, whereas the SA activity led to detecting editing issues. Furthermore, both SPA activities had washback effects in common, including helping the learners meet the writing genre conventions and developing metacognitive awareness. However, the findings also demonstrate negative washback effects on the learners’ attitudes during the revision phase in the writing process, including bias toward self-evaluation during the SA activity and reluctance to rate peers’ writing performance during the PA activity. The findings suggest that self-and peer assessment activities are essential teaching and learning tools that can be utilized sequentially to help learners tackle multiple writing areas during the revision phase in the writing process.

Keywords: self assessment, peer assessment, washback effect, second language writing, writing process

Procedia PDF Downloads 65
2760 Association of Sociodemographic Factors and Loneliness of Adolescents in China

Authors: Zihan Geng, Yifan Hou

Abstract:

Background: Loneliness is the feeling of being isolated, which is becoming increasingly common among adolescents. A cross-sectional study was performed to determine the association between loneliness and different demographics. Methods: To identify the presence of loneliness, the UCLA Loneliness Scale (Version 3) was employed. The Chinese version of "Questionnaire Star", an online survey on the official website, was used to distribute the self-rating questionnaires to students in Beijing from Grade 7 to Grade 12. The questionnaire includes sociodemographic items and the UCLA Loneliness Scale. Results: Almost all of the participants exhibited “caseness” for loneliness, as defined by the UCLA scale. Out of 266 questionnaires, 2.6% (7 of 266) of students fulfilled the criteria for a low degree of loneliness, and 29.7% (79 of 266) of adolescents met the criteria for a moderate degree. Moreover, 62.8% (167 of 266) and 4.9% (13 of 266) of students fulfilled the criteria for a moderately high and a high degree of loneliness, respectively. In the Pearson χ2 test, there were significant associations between loneliness and several demographic factors, including grade (P<0.001), the number of adults in the family (P=0.001), evaluation of appearance (P=0.034), evaluation of self-satisfaction (P<0.001), love in the family (P<0.001), academic performance (P=0.001), and emotional support from friends (P<0.001). In the multivariate logistic analysis, the number of adults (2 vs. ≤1, OR=0.319, P=0.015), time spent on social media (≥4h vs. ≤1h, OR=4.862, P=0.029), and emotional support from friends (more satisfied vs. dissatisfied, OR=0.363, P=0.027) were associated with loneliness. Conclusions: Our results suggest a relationship between loneliness and several sociodemographic factors, which raises the possibility of reducing loneliness among adolescents. The companionship of family, encouragement from friends, and regulating the time spent on social media may decrease loneliness in adolescents.
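
The Pearson χ² tests reported above follow a standard recipe that can be sketched directly. The 2x2 table below is purely illustrative (the abstract reports only percentages and p-values, not full cross-tabulations), and collapsing loneliness degrees into two groups is an assumption made for the example.

```python
import numpy as np

def chi2_stat(table):
    """Pearson chi-square statistic for an observed contingency table:
    sum over cells of (observed - expected)^2 / expected, where the expected
    counts assume independence of the row and column variables."""
    obs = np.asarray(table, dtype=float)
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0)) / obs.sum()
    return float(((obs - expected) ** 2 / expected).sum())

# Hypothetical 2x2 table for 266 respondents: rows = adults in family
# (<=1 vs 2), columns = loneliness (low/moderate vs moderately high/high)
table = [[30, 60],
         [20, 156]]
stat = chi2_stat(table)
dof = (2 - 1) * (2 - 1)  # compare `stat` with the chi-square critical
                         # value for dof=1, 3.841 at alpha = 0.05
```

A statistic above the critical value rejects independence, i.e. indicates an association of the kind the study reports between household composition and loneliness.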

Keywords: loneliness, adolescents, demographic factors, UCLA loneliness scale

Procedia PDF Downloads 74
2759 Times2D: A Time-Frequency Method for Time Series Forecasting

Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan

Abstract:

Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduces significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as Autoregressive Integrated Moving Average (ARIMA) and Exponential Smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with the more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks, have been widely adopted for modeling sequential data. However, they often suffer from locality, making it difficult to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle to capture relationships between distant time points due to the locality of one-dimensional convolution kernels.
Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success. Despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that integrates 2D spectrogram and derivative heatmap techniques in parallel. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the use of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021), under the same modeling conditions. The initial results demonstrate that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. Furthermore, the generality of the Times2D framework allows it to be applied to various tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.
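
The two parallel 2D views described above can be sketched directly: a short-time Fourier magnitude spectrogram for the frequency-domain branch, and stacked finite-difference derivatives for the time-domain branch. This is a minimal reconstruction of the idea, not the authors' code; the window length, hop size, derivative order, and synthetic signal are assumptions.

```python
import numpy as np

def spectrogram(x, win=64, hop=32):
    """Magnitude STFT: the frequency-domain 2D view, capturing periodicity."""
    frames = [x[s:s + win] * np.hanning(win)
              for s in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(np.stack(frames), axis=1)).T  # (freq, time)

def derivative_heatmap(x, max_order=2):
    """Time-domain 2D view: stacked finite-difference derivatives highlight
    sharp fluctuations and turning points."""
    rows, d = [x], x
    for _ in range(max_order):
        d = np.gradient(d)
        rows.append(d)
    return np.stack(rows)  # (order + 1, time)

# Hypothetical signal: daily seasonality plus trend plus noise
t = np.arange(512)
x = (np.sin(2 * np.pi * t / 24) + 0.01 * t
     + 0.1 * np.random.default_rng(3).standard_normal(512))
spec = spectrogram(x)         # 2D frequency representation -> vision backbone
heat = derivative_heatmap(x)  # 2D derivative representation -> vision backbone
```

Once the series is lifted into these two images, standard computer-vision layers (2D convolutions, attention over patches) can be applied, which is the core of the framework's design.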

Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation

Procedia PDF Downloads 41
2758 “Divorced Women are Like Second-Hand Clothes” - Hate Language in Media Discourse (Using the Example of Electronic Media Platforms)

Authors: Sopio Totibadze

Abstract:

Although the legal framework of Georgia reflects the main principles of gender equality and is in line with the international situation (UNDP, 2018), Georgia remains a male-dominated society. This means that men prevail in many areas of social, economic, and political life, which frequently gives women a subordinate status in society and the family (UN women). According to the latest study, “violence against women and girls in Georgia is also recognized as a public problem, and it is necessary to focus on it” (UN women). Moreover, the Public Defender's report on the protection of human rights in Georgia (2019) reveals that “in the last five years, 151 women were killed in Georgia due to gender and family violence”. Sadly, these statistics have increased significantly since that time. The issue was acutely reflected in the document published by the Organization for Security and Cooperation in Europe, “Gender Hate Crime” (March 10, 2021). “Unfortunately, the rates of femicide ..... are still high in the country, and distrust of law enforcement agencies often makes such cases invisible, which requires special attention from the state.” More precisely, the cited document considers that there are frequent cases of crimes based on gender-based oppression in Georgia, which pose a threat not only to women but also to people of any gender whose desires and aspirations do not correspond to the gender norms and roles prevailing in society. According to the study, this type of crime has a “significant and lasting impact on the victim(s) and also undermines the safety and cohesion of society and gender equality”. It is well-known that language is often used as a tool for gender oppression (Rusieshvili-Cartledge and Dolidze, 2021; Totibadze, 2021). Therefore, feminist and gender studies in linguistics ultimately serve to represent the problem, reflect on it, and propose ways to solve it. 
Together with technical advancement in communication, a new form of discrimination has arisen: hate language against women in electronic media discourse. Due to the nature of social media and the internet, messages containing hate language can spread in seconds and reach millions of people, yet few people are aware of the detrimental effects they may have on the addressee and on society. This paper analyses hateful comments directed at women on various media platforms to determine (1) the linguistic strategies used while attacking women and (2) the reasons why women may fall victim to this type of hate language. The data have been collected over six months, and overall, 500 comments are examined for the paper. Qualitative and quantitative analysis was chosen as the methodology of the study. The comments posted on various media platforms, including social media posts, articles, and pictures, were selected manually for several reasons, the most important being the difficulty of identifying hate speech, as it can disguise itself in different ways (humour, memes, etc.). The articles, posts, pictures, and videos selected for sociolinguistic analysis depict a woman, a taboo topic, or a scandalous event centred on a woman that triggered a great deal of hatred and hate language towards the person to whom the post or article was dedicated. The study has revealed that women can become victims of hatred if they do something considered a deviation from a societal norm, namely getting a divorce, being sexually active, being vocal about feminist values, or talking about taboos. Interestingly, those who use hate language are not only men trying to “normalize” prejudiced patriarchal values but also women, who are equally active in bringing down a “strong” woman. The paper also aims to raise awareness about hate language directed at women, as being knowledgeable about the issue is the first step to tackling it.

Keywords: femicide, hate language, media discourse, sociolinguistics

Procedia PDF Downloads 82
2757 Modeling Pronunciations of Arab Broca’s Aphasics Using Mosstalk Words Technique

Authors: Sadeq Al Yaari, Fayza Alhammadi, Ayman Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Adham Al Yaari, Sajedah Al Yaari, Saleh Al Yami

Abstract:

Background: There has been a debate in the literature over the years as to whether or not the MossTalk Words program fits Arab Broca’s aphasics (BAs), owing to language differences and to the fact that the technique has not yet been used for aphasics with semantic dementia (SD aphasics). Aims: To address this debate, the present study investigates the usability of this program, together with pictures and community involvement, as therapeutic techniques for both Arab BAs and SD aphasics. Method: The subjects of this study are two Saudi aphasics (53 and 57 years old, respectively). The former suffers from Broca’s aphasia due to a stroke, while the latter suffers from semantic dementia. Both aphasics can speak English and have used the MossTalk Words program, in addition to intensive picture-naming therapeutic sessions, for two years. They were tested by one of the researchers four times (once every six months). The families of the two subjects, along with their relatives and friends, played a major part in all therapeutic sessions. Conclusion: Averaged across all therapeutic sessions, the MossTalk Words program was clearly more effective in modeling the pronunciation of the Broca’s aphasic than that of the SD aphasic. Furthermore, intensive picture-naming exercises, together with the positive role of community members, contributed substantially to the progress of both subjects’ performance.

Keywords: MossTalk Words, program, technique, Broca’s aphasia, semantic dementia, subjects, picture, community

Procedia PDF Downloads 43
2756 An Information-Based Approach for Preference Method in Multi-Attribute Decision Making

Authors: Serhat Tuzun, Tufan Demirel

Abstract:

Multi-Criteria Decision Making (MCDM) models real-life problems to support the decisions we face. It is a discipline that aids decision makers who must choose among conflicting alternatives in reaching an optimal decision. MCDM problems fall into two main categories, distinguished by purpose and data type: Multi-Attribute Decision Making (MADM) and Multi-Objective Decision Making (MODM). Although various MADM techniques have been developed for the problems encountered, their methodologies remain limited in how faithfully they model real life. Moreover, objective results are hard to obtain, and findings are generally derived from subjective data. New and modified techniques incorporating approaches such as fuzzy logic have been proposed, but these more comprehensive techniques, despite modelling real life better, have struggled to find a place in real-world applications because their complex structure makes them hard to apply. These constraints restrict the development of MADM. This study aims to conduct a comprehensive analysis of preference methods in MADM and to propose an information-based approach. For this purpose, a detailed literature review was conducted, and current approaches were analyzed along with their advantages and disadvantages. The proposed approach is then introduced: performance values of the criteria are calculated in two steps, first by determining the distribution of each attribute and standardizing it, then by calculating the information of each attribute as informational energy.
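The two-step calculation described above can be sketched in code. The binning scheme, the sample attribute values, and the use of an Onicescu-style informational energy (the sum of squared relative frequencies) are our illustrative assumptions, not the authors' exact formulation.

```python
# Sketch: attribute weighting via informational energy (assumed
# Onicescu-style: E = sum of squared relative frequencies over bins).
def informational_energy(values, bins=4):
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0           # guard against a constant column
    counts = [0] * bins
    for v in values:
        i = min(int((v - lo) / width), bins - 1)
        counts[i] += 1
    n = len(values)
    return sum((c / n) ** 2 for c in counts)

# Hypothetical standardized column of a decision matrix (one attribute,
# one value per alternative):
attribute = [0.2, 0.4, 0.4, 0.9, 0.8, 0.1]
E = informational_energy(attribute)
# E lies in [1/bins, 1]; a higher energy means a more concentrated
# distribution, i.e. the attribute carries less discriminating information.
```

A uniform spread over the bins gives the minimum energy 1/bins, while a column whose values all fall in one bin gives the maximum of 1.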

Keywords: literature review, multi-attribute decision making, operations research, preference method, informational energy

Procedia PDF Downloads 224
2755 Creative Element Analysis of Machinery Creativity Contest Works

Authors: Chin-Pin Chen, Shi-Chi Shiao, Ting-Hao Lin

Abstract:

Industry today faces the rapid development of new technology worldwide and fierce changes in the economic environment, so the development trend is gradually shifting away from labor and toward leading industry and academia with innovation and creativity. The machinery industry shows the same trend. Based on the aims of the Creativity White Paper, the Ministry of Education in Taiwan promotes various creativity contests in response to this industry trend. Domestic students and enterprises have performed well in domestic and international creativity contests in recent years, and award-winning works must contain important creative elements to stand out among so many entries. A literature review and in-depth interviews with five instructors of creativity-contest winners were first conducted to distill 15 machinery creative elements, which were then compared with the creative elements of award-winning machinery works from the past five years to understand the relationship between awarded works and creative elements. The statistical analysis shows that the IDEA (Industrial Design Excellence Award) contains the most creative elements among the four major international creativity contests; that is, its reviews emphasize creative elements comparatively strictly. Among the groups participating in creativity contests, enterprises consider more creative elements in their works than the other two groups. Across these contest works, the creative elements of “replacement or improvement”, “convenience”, and “modeling” show the highest significance. These findings are expected to provide domestic colleges and universities with a reference for participating in creativity-related contests in the future.

Keywords: machinery, creative elements, creativity contest, creativity works

Procedia PDF Downloads 440
2754 Sorption Properties of Biological Waste for Lead Ions from Aqueous Solutions

Authors: Lucia Rozumová, Ivo Šafařík, Jana Seidlerová, Pavel Kůs

Abstract:

Biosorption by biological waste materials from the agriculture industry can be a cost-effective technique for removing metal ions from wastewater. The performance of new biosorbent systems, consisting of waste matrices magnetically modified with iron oxide nanoparticles, was tested for the removal of lead ions from aqueous solution. The use of low-cost, eco-friendly adsorbents has been investigated as an alternative to current, more expensive methods. This article deals with the removal of metal ions from aqueous solutions by modified waste products: orange peels, sawdust, peanut husks, used tea leaves, and ground coffee sediment. The waste materials were suspended in methanol, and ferrofluid (magnetic iron oxide nanoparticles) was then added; this modification opens the way to smart materials with new properties. The prepared material was characterized using scanning electron microscopy and a specific surface area and pore size analyzer. Studies focused on sorption and desorption properties, and changes in the iron content of the magnetically modified materials after treatment were observed as well. The adsorption process was modelled with adsorption isotherms. The results show that the magnetically modified materials remain stable during dynamic sorption and desorption at a high adsorbed amount of lead ions. These findings indicate that biological waste materials, as sorbents with new properties, are highly effective for the treatment of wastewater.

Keywords: biological waste, sorption, metal ions, ferrofluid

Procedia PDF Downloads 141
2753 Study of Aging Behavior of Parallel-Series Connection Batteries

Authors: David Chao, John Lai, Alvin Wu, Carl Wang

Abstract:

For lithium-ion batteries with multiple cell configurations, some use scenarios can age each cell within the battery unevenly because of uneven current distribution. The focus of this study is therefore the aging effects on batteries with different construction designs. To systematically study the influence of various factors in key battery configurations, a detailed analysis of three construction factors is conducted: (1) terminal position; (2) cell alignment matrix; and (3) interconnect resistance between cells. A 2S2P circuit was chosen as the model multi-cell battery for setting up different samples, and aging behavior was studied through cycling tests analyzing current distribution and recoverable capacity. The aging tests yielded the following key findings: (I) the cell alignment matrix can affect the cycle life of the battery; (II) structural symmetry is a critical factor influencing cycle life, and unbalanced resistance can lead to inconsistent cell aging; (III) the terminal position contributes to uneven current distribution, which can accelerate battery aging; and (IV) an increase in internal connection resistance can actually increase cycle life, although this gain is accompanied by a decline in battery performance. In summary, these findings help identify the key aging factors of multi-cell batteries and can effectively improve the accuracy of battery capacity predictions.
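The link between interconnect resistance and uneven current distribution can be illustrated with a simple current-divider sketch; the branch resistances below are illustrative values, not the study's measurements.

```python
# Sketch: current split between the two parallel branches of a 2S2P pack.
# Each branch resistance lumps its two series cells plus interconnects;
# the numbers are illustrative only.
def branch_currents(i_total, r_branch1, r_branch2):
    """Currents divide inversely with branch resistance."""
    i1 = i_total * r_branch2 / (r_branch1 + r_branch2)
    return i1, i_total - i1

# Balanced pack: equal split.
print(branch_currents(10.0, 0.050, 0.050))   # -> (5.0, 5.0)
# 10 mOhm of extra interconnect resistance on branch 1 shifts current
# onto branch 2, which then cycles harder and ages faster:
print(branch_currents(10.0, 0.060, 0.050))   # roughly (4.55, 5.45) A
```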

Keywords: multiple cells battery, current distribution, battery aging, cell connection

Procedia PDF Downloads 77
2752 Chemical Fingerprinting of Complex Samples With the Aid of Parallel Outlet Flow Chromatography

Authors: Xavier A. Conlan

Abstract:

Speed of analysis is a significant limitation of current high-performance liquid chromatography/mass spectrometry (HPLC/MS) and ultra-high-pressure liquid chromatography (UHPLC)/MS systems, both of which are used in many forensic investigations. The flow rate limitations of MS detection force a compromise in the chromatographic flow rate, which in turn reduces throughput and, with modern columns, separation efficiency. Commonly, this restriction is combated by splitting the flow post-column, prior to entry into the mass spectrometer. However, this results in a loss of sensitivity and a loss of efficiency due to the additional post-column dead volume. A new chromatographic column format known as 'parallel segmented flow' involves splitting the eluent flow within the column outlet end fitting; in this study, we present its application, with mass spectrometric detection, to interrogating the provenance of methamphetamine samples. Using parallel segmented flow, column flow rates as high as 3 mL/min were employed in the analysis of amino acids without post-column splitting to the mass spectrometer. Furthermore, when parallel segmented flow chromatography columns were employed, the sensitivity was more than twice that of conventional systems with post-column splitting when the same volume of mobile phase was passed through the detector. These findings suggest that this type of column technology will particularly enhance the capabilities of modern LC/MS, enabling both high throughput and sensitive mass spectral detection.

Keywords: chromatography, mass spectrometry, methamphetamine, parallel segmented outlet flow column, forensic sciences

Procedia PDF Downloads 486
2751 Design of Replication System for Computer-Generated Hologram in Optical Component Application

Authors: Chih-Hung Chen, Yih-Shyang Cheng, Yu-Hsin Tu

Abstract:

Holographic optical elements (HOEs) have recently become some of the most suitable components in optoelectronic technology, owing to the demand for product systems of compact size. Computer-generated holography (CGH) is a well-known technology for HOE production. In some cases, a well-designed diffractive optical element with multifunctional components is also an important requirement for an advanced optoelectronic system. The spatial light modulator (SLM) is one of the key components with great capability to display CGH patterns and is widely used in various applications, such as image projection systems. Regarding multifunctional components, such as combined phase and amplitude modulation of light, a high-resolution hologram recorded with a multiple-exposure procedure is also a suitable candidate. However, when a hologram is recorded with multiple exposures, the diffraction efficiency of the final hologram is inevitably lower than with a single-exposure process. In this study, a two-step holographic recording method is designed, comprising master hologram fabrication and replicated hologram production. Since a multiple-exposure hologram suffers a diffraction-efficiency reduction factor of M² (for M exposures), single exposure is more efficient for hologram replication. In the second step, holographic replication, a stable optical system with one-shot copying is introduced. For commercial application, this concept of holographic copying can be used to obtain duplicates of HOEs with higher optical performance.
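The M² penalty can be made explicit. Assuming the index-modulation budget Δn is shared equally among the M exposures and the holograms operate in the low-efficiency regime (our simplifications), each constituent grating obeys:

```latex
% Each of the M exposures receives \Delta n / M of the index-modulation
% budget; diffraction efficiency grows quadratically with modulation in
% the low-efficiency regime, so each constituent hologram obeys
\eta_M \;\propto\; \left(\frac{\Delta n}{M}\right)^{2}
\;=\; \frac{(\Delta n)^{2}}{M^{2}}
\qquad\Longrightarrow\qquad
\eta_M \;=\; \frac{\eta_1}{M^{2}},
% which is why a single-exposure replication step preserves efficiency.
```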

Keywords: holographic replication, holography, one-shot copying, optical element

Procedia PDF Downloads 154
2750 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models

Authors: I. V. Pinto, M. R. Sooriyarachchi

Abstract:

Data arising in our environment frequently have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling such data. When multilevel modelling is combined with a binary response, the estimation methods become complex, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of orders 1 and 2 (MQL1, MQL2) and penalized quasi-likelihood of orders 1 and 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset; therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. Prior to usage, however, it is equally important to confirm that the GOF test itself performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary-response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes, and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted using the four methods to a real-life dataset, and the performance of the test was compared for each model.
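The data-generating process typically simulated in such studies, a two-level random-intercept logistic model, can be sketched as follows; the cluster count, cluster size, and variance component are illustrative choices, not the values used in the MLwiN simulations.

```python
# Sketch: generating binary data from a two-level random-intercept
# logistic model, the kind of set-up simulated in studies like this one.
# Cluster count, cluster size, and variance components are illustrative.
import math
import random

random.seed(1)
n_clusters, cluster_size = 50, 20
beta0, beta1 = -0.5, 1.0    # fixed effects
sigma_u = 0.8               # between-cluster SD

data = []
for j in range(n_clusters):
    u_j = random.gauss(0.0, sigma_u)    # cluster random intercept
    for _ in range(cluster_size):
        x = random.gauss(0.0, 1.0)      # level-1 covariate
        p = 1.0 / (1.0 + math.exp(-(beta0 + beta1 * x + u_j)))
        y = 1 if random.random() < p else 0
        data.append((j, x, y))

# Latent-scale intra-cluster correlation implied by sigma_u:
rho = sigma_u**2 / (sigma_u**2 + math.pi**2 / 3)   # about 0.16 here
```

Varying n_clusters, cluster_size, and sigma_u over a grid, then fitting by MQL1/MQL2/PQL1/PQL2 and applying the GOF test, reproduces the structure of the simulation study described above.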

Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error

Procedia PDF Downloads 142
2749 Virtual Simulation as a Teaching Method for Community Health Nursing: An Investigation of Student Performance

Authors: Omar Mayyas

Abstract:

Clinical decision-making (CDM) is essential to community health nursing (CHN) education. Nursing educators are responsible for developing these skills in nursing students, because the students are exposed to highly critical conditions after graduation. However, due to limited exposure to real-world situations, many nursing students struggle to develop clinical decision-making skills in this area. The impact of virtual simulation (VS) on community health nursing students' clinical decision-making therefore has to be investigated. This study aims to examine the difference in CDM ability between CHN students who receive traditional education and those who receive VS classes, to identify the factors that may underlie any such difference, and to provide recommendations for educational programs that can enhance the CDM ability of CHN students and improve the quality of care provided in community settings. A mixed-method study will be conducted. A randomized controlled trial will compare the CDM ability of CHN students who receive a one-hour traditional class with that of a group who receive a one-hour VS scenario on the nursing care of a diabetic patient. Sixty-four students per group will be randomly selected from undergraduate nursing students who have completed the CHN course at York University. Participants will complete the same Clinical Decision Making in Nursing Scale (CDMNS) questionnaire. The study intervention will follow the Medical Research Council (MRC) approach. SPSS and content analysis will be used for data analysis.
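For context, the 64-per-group figure is consistent with a conventional sample size calculation; the sketch below assumes a two-sample comparison with a medium effect size (Cohen's d = 0.5), alpha = 0.05, and 80% power, which are our assumptions rather than figures stated in the abstract.

```python
# Sketch: per-group n for a two-sample comparison (normal approximation).
# Effect size, alpha, and power are assumed, not taken from the study.
import math
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)    # two-sided critical value
    z_b = z.inv_cdf(power)
    return math.ceil(2 * (z_a + z_b) ** 2 / d ** 2)

print(n_per_group(0.5))   # -> 63; ~64 once the usual t-correction is added
```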

Keywords: clinical decision-making, virtual simulation, community health nursing students, community health nursing education

Procedia PDF Downloads 66
2748 Investigating Kinetics and Mathematical Modeling of Batch Clarification Process for Non-Centrifugal Sugar Production

Authors: Divya Vats, Sanjay Mahajani

Abstract:

The clarification of sugarcane juice plays a pivotal role in the production of non-centrifugal sugar (NCS), profoundly influencing the quality of the final product. In this study, we investigated the kinetics and mathematical modeling of the batch clarification process. The turbidity of the clarified cane juice (in NTU) determines the end product's color; moreover, this parameter underscores the importance of considering other variables as performance indicators for assessing the efficacy of the clarification process. Temperature-controlled experiments were meticulously conducted in a laboratory-scale batch mode. The primary objective was to discern the essential, optimized parameters for augmenting the clarity of cane juice. Additionally, we explored the impact of pH and flocculant loading on the kinetics. Particle image velocimetry (PIV) was employed to understand the particle-particle and fluid-particle interactions. This technique provided a comprehensive picture, paving the way for subsequent multiphase computational fluid dynamics (CFD) simulations using the Eulerian-Lagrangian approach in Ansys Fluent; impressively, these simulations accurately replicated comparable velocity profiles. Finally, the mechanism identified in this study informs a mathematical model and presents a valuable framework for transitioning from the traditional batch process to a continuous one, with the ultimate aim of attaining heightened productivity and consistent product quality.

Keywords: non-centrifugal sugar, particle image velocimetry, computational fluid dynamics, mathematical modeling, turbidity

Procedia PDF Downloads 70
2747 Localization of Geospatial Events and Hoax Prediction in the UFO Database

Authors: Harish Krishnamurthy, Anna Lafontant, Ren Yi

Abstract:

Unidentified Flying Objects (UFOs) are an interesting topic for many enthusiasts, and people all over the United States report sightings online at the National UFO Reporting Center (NUFORC). Some of these reports are hoaxes. Our task is not to establish that the seemingly legitimate reports really describe flying objects from outer space; rather, we aim to identify whether a report was a hoax, as labeled by the UFO database team under their existing curation criteria. The database nevertheless provides a wealth of information that can be exploited for various analyses and insights, such as social reporting and the identification of real-time spatial events. We perform analysis to localize these time-series geospatial events and correlate them with known real-time events. This paper does not confirm any legitimacy of alien activity; rather, it attempts to gather information from likely legitimate UFO reports by studying the online records. These events cluster both geospatially and temporally. We use cluster density and data visualization to search the space of candidate clusterings and select the most probable clusters, which indicate the proximity of such activity. A random forest classifier is also presented to separate true events from hoaxes, using the best features available, such as region, week, time period, and duration. Lastly, we show the performance of the scheme across several days and correlate the results with real-time events; one UFO report correlates strongly with a missile test conducted in the United States.
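As a rough sketch of the classification step, the following trains a random forest on synthetic stand-ins for the stated features (region, week, time period, duration); the data, the toy hoax rule, and the hyperparameters are our assumptions, not the paper's.

```python
# Sketch: a random-forest hoax classifier on synthetic stand-ins for the
# paper's features. Labels follow a toy rule, not NUFORC curation labels.
import random

from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

random.seed(0)
X, y = [], []
for _ in range(500):
    region = random.randrange(10)            # encoded region
    week = random.randrange(52)              # week of year
    period = random.randrange(4)             # e.g. night/morning/afternoon/evening
    duration = random.expovariate(1 / 120)   # sighting duration, seconds
    X.append([region, week, period, duration])
    y.append(1 if duration < 30 and period == 0 else 0)  # toy hoax rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
score = clf.score(X_te, y_te)                # held-out accuracy
```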

Keywords: time-series clustering, feature extraction, hoax prediction, geospatial events

Procedia PDF Downloads 375
2746 Design of Effective Decoupling Point in Build-To-Order Systems: Focusing on Trade-Off Relation between Order-To-Delivery Lead Time and Work in Progress

Authors: Zhiyong Li, Hiroshi Katayama

Abstract:

Since the 1990s, e-commerce and internet business have grown steadily around the world, and customers increasingly express their demand attributes as specification requirements on parts, components, product structure, and so on. This paper deals with designing effective decoupling points for build-to-order (BTO) systems in an e-commerce environment, which can be achieved through trade-off analysis between two major criteria: customer order lead time and the value of work in progress. These KPIs are critical for a successful BTO business: time-based service effectiveness in coping with customer requirements for the first, and cost-effectiveness with risk-averse operations for the second. The approach of this paper consists of investigating successful businesses that follow the BTO scheme, developing a manufacturing model of the scheme, quantitatively evaluating the proposed models by calculating the two KPI values under various decoupling-point distributions, and discussing the results produced by each distribution pattern, some of which yield Pareto-optimal performance. To extract the relevant trade-off relation between the considered KPIs in the two-dimensional performance space, logic developed in former research work, i.e. by Katayama and Fonseca, is applied. The obtained characteristics are evaluated as useful information for managing BTO manufacturing businesses.
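The Pareto-optimal screening described above can be sketched as a simple dominance filter over candidate decoupling-point positions; the (ODLT, WIP) values below are illustrative assumptions, not the paper's computed results.

```python
# Sketch: extracting Pareto-optimal decoupling points from candidate
# (ODLT, WIP) pairs, both to be minimized. Values are illustrative.
def pareto_front(candidates):
    """Keep points not dominated on both criteria by any other point."""
    front = []
    for name, odlt, wip in candidates:
        dominated = any(o <= odlt and w <= wip and (o, w) != (odlt, wip)
                        for _, o, w in candidates)
        if not dominated:
            front.append((name, odlt, wip))
    return front

# Hypothetical positions of the decoupling point along the supply chain:
candidates = [
    ("make-to-stock",     2, 90),   # short lead time, high WIP value
    ("assemble-to-order", 6, 40),
    ("make-to-order",    14, 15),   # long lead time, low WIP value
    ("poor-hybrid",      10, 60),   # dominated by assemble-to-order
]
print(pareto_front(candidates))
```

Only the non-dominated points remain; the manager then picks among them according to whether service speed or inventory cost matters more.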

Keywords: build-to-order (BTO), decoupling point, e-commerce, order-to-delivery lead time (ODLT), work in progress (WIP)

Procedia PDF Downloads 324
2745 Development of a Roadmap for Assessment the Sustainability of Buildings in Saudi Arabia Using Building Information Modeling

Authors: Ibrahim A. Al-Sulaihi, Khalid S. Al-Gahtani, Abdullah M. Al-Sugair, Aref A. Abadel

Abstract:

Achieving environmental sustainability is an important part of many countries' visions. Green or sustainable building is widely used terminology for describing environmentally friendly construction. Applying sustainable practices is significantly important in various fields, including construction, which consumes an enormous amount of resources and generates a considerable amount of waste. The need for sustainability is heightened in regions suffering from limited natural resources and extreme weather conditions, such as Saudi Arabia. As building designs grow more sophisticated, the need for tools that support decision-making on sustainability issues is increasing, especially in the design and preconstruction stages. In this context, Building Information Modeling (BIM) can aid in performing complex building performance analyses to ensure an optimized sustainable building design. Accordingly, this paper introduces a roadmap towards a systematic approach for assessing the sustainability of buildings using BIM. The approach comprises a set of main processes: identifying the sustainability parameters that can be used for sustainability assessment in Saudi Arabia, developing a sustainability assessment method that fits the Kingdom's special circumstances, identifying the sustainability requirements and the BIM functions that can satisfy those requirements, and integrating the requirements with the identified functions. As a result, a sustainability-BIM approach can be developed that helps designers assess sustainability and explore different design alternatives at an early stage of the construction project.

Keywords: green buildings, sustainability, BIM, rating systems, environment, Saudi Arabia

Procedia PDF Downloads 377
2744 Towards the Effectiveness/ Performance of Spatial Communication within the Composite Interior Spaces: Wayfinding System in the Saudi National Museum as a Case Study

Authors: Afnan T. Bagasi, Donia M. Bettaieb, Abeer Alsobahi

Abstract:

The wayfinding system relates directly and indirectly to visitors' journeys through the museum. The design aspects of this system play an important role in making it an effective communication system within the museum space. However, translating the concepts that pertain to its design, such as intelligibility, which rests on integration and connectivity in museum space design, requires customization in the form of specific design considerations drawn from the most relevant approaches. Those approaches link the organizational and practical aspects to the semiotic and semantic aspects of space syntax, targeting visitors' visual and perceived consistency. In this context, the study aims to identify how to apply the concepts of intelligibility and clarity, by employing integration and connectivity, to design a wayfinding system in museums as a kind of composite interior space. Using the available plans and images to extract the design considerations behind the wayfinding system of the Saudi National Museum as a case study, a descriptive-analytical method was used to understand the basic organizational and morphological principles of the museum space through four main aspects of space design: morphological, semantic, semiotic, and pragmatic. The study's findings will assist designers, professionals, and researchers in the field of museum design in understanding the significance of the wayfinding system by examining it through museum spaces, highlighting the essential aspects with a clear analytical method.

Keywords: wayfinding system, museum journey, intelligibility, integration, connectivity

Procedia PDF Downloads 169
2743 Micro-Rest: Extremely Short Breaks in Post-Learning Interference Support Memory Retention over the Long Term

Authors: R. Marhenke, M. Martini

Abstract:

The distraction of attentional resources after learning hinders long-term memory consolidation compared to several minutes of post-encoding inactivity in the form of wakeful resting. We tested whether an 8-minute period of wakeful resting, compared to performing an adapted version of the d2 test of attention after learning, supports memory retention. Participants encoded and immediately recalled a word list, followed either by an 8-minute period of wakeful resting (eyes closed, relaxed) or by performing an adapted version of the d2 test of attention (scanning and selecting specific characters while ignoring others). At the end of the experimental session (after 12-24 min) and again after 7 days, participants completed a surprise free recall test of both word lists. Our results showed no significant difference in memory retention between the experimental conditions. However, we found that participants who completed the first lines of the d2 test in less than the given time limit of 20 seconds, and thus had short unfilled intervals before switching to the next test line, remembered more words over both the 12-24 minute and the 7-day retention intervals than participants who did not complete the first lines. This interaction occurred only for the first test lines, those with the highest temporal proximity to the encoding task, and not for later test lines. Differences in retention scores between the groups (completed the first line vs. did not) appear largely independent of general performance on the d2 test. Implications and limitations of these exploratory findings are discussed.

Keywords: long-term memory, retroactive interference, attention, forgetting

Procedia PDF Downloads 131
2742 Agile Smartphone Porting and App Integration of Signal Processing Algorithms Obtained through Rapid Development

Authors: Marvin Chibuzo Offiah, Susanne Rosenthal, Markus Borschbach

Abstract:

Certain research projects in computer science involve studying existing signal processing algorithms and developing improvements on them. Research budgets are usually limited, so there is little time to implement the algorithms from scratch; it is therefore common practice to use implementations provided by other researchers as a template. These are most commonly provided in a rapid-development, i.e. fourth-generation, programming language, usually MATLAB. Rapid development is a common method in computer science research for quickly implementing and testing newly developed algorithms, and it is also a common task within agile project organization. The growing relevance of mobile devices in the computer market also creates the need to demonstrate that these algorithms execute successfully, and to measure their performance, on a mobile operating system and processor, particularly on a smartphone. Open mobile systems such as Android are most suitable for this task, which should be performed as efficiently as possible. Furthermore, where the project's goal statement requires it, the interaction between the algorithm and a graphical user interface (GUI) running exclusively on the mobile device must be implemented efficiently. This paper examines different proposed solutions for porting computer algorithms obtained through rapid development into a GUI-based Android smartphone app and evaluates their feasibility. The feasible methods are then tested, and a short success report is given for each tested method.

Keywords: SMARTNAVI, smartphone, app, programming languages, rapid development, MATLAB, Octave, C/C++, Java, Android, NDK, SDK, Linux, Ubuntu, emulation, GUI

Procedia PDF Downloads 477
2741 A Comprehensive Review of Artificial Intelligence Applications in Sustainable Building

Authors: Yazan Al-Kofahi, Jamal Alqawasmi

Abstract:

In this study, a systematic literature review (SLR) was conducted with the main goal of assessing the existing literature on how artificial intelligence (AI), machine learning (ML), and deep learning (DL) models are used in sustainable architecture applications and issues, including thermal comfort satisfaction, energy efficiency, cost prediction, and many others. The search strategy used several databases, including Scopus, Springer, and Google Scholar. The inclusion criteria were applied through two search strings related to DL, ML, and sustainable architecture. The timeframe for inclusion was open, although most of the included papers were published in the previous four years. As a filtration strategy, conference papers and books were excluded from the database search results. Applying these inclusion and exclusion criteria, a sample of 59 papers was selected for the final analysis. In the data extraction phase, the needed data were extracted from these papers and then analyzed and correlated. The results of this SLR show that ML and DL have many applications in sustainable buildings and that the topic is currently trending. Most of the papers focused on addressing environmental sustainability issues and factors using machine-learning predictive models, with a particular emphasis on decision tree algorithms. Moreover, the random forest regressor was found to demonstrate strong performance across all feature selection groups in building cost prediction as a machine-learning predictive model.

Keywords: machine learning, deep learning, artificial intelligence, sustainable building

Procedia PDF Downloads 65