Search results for: minimum data set
23949 Vegetables and Fruits Solar Tunnel Dryer for Small-Scale Farmers in Kassala
Authors: Sami Mohamed Sharif
Abstract:
The current study focuses on the design and construction of a solar tunnel dryer intended for small-scale farmers in Kassala, Sudan. To determine the appropriate dimensions of the dryer, the heat and mass balance equations are used, taking into account factors such as the target agricultural product, climate conditions, solar irradiance, and desired drying time. In Kassala, a dryer with a width of 88 cm, length of 600 cm, and height of 25 cm has been built, capable of drying up to 40 kg of vegetables or fruits. The dryer is divided into two chambers of different lengths. The air passing through is heated to the desired drying temperature in a separate heating chamber that is 200 cm long. From there, the heated air enters the drying chamber, which is 400 cm long. In this section, the agricultural product is placed on a slightly elevated net. The tunnel dryer was constructed using materials from the local market. The paper also examines the solar irradiance in Kassala, finding an average of 23.6 MJ/m2/day, with a maximum of 26.6 MJ/m2/day in April and a minimum of 20.2 MJ/m2/day in December. A DC fan powered by a 160Wp solar panel is utilized to circulate air within the tunnel. By connecting the fan and three 12V, 60W bulbs in series, four different speeds can be achieved using a speed controller. Temperature and relative humidity measurements were taken hourly over three days, from 10:00 a.m. to 3:00 p.m. The results demonstrate the promising technology and sizing techniques of solar tunnel dryers, which can significantly increase the temperature within the tunnel by more than 90%.Keywords: tunnel dryer, solar drying, moisture content, fruits drying modeling, open sun drying
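A minimal sketch of the kind of heat and mass balance sizing described above, assuming illustrative values for moisture contents, drying temperature, airflow humidity pick-up, and collector efficiency (none of these figures are taken from the paper):

```python
# Hedged sketch: sizing airflow and collector area for a solar tunnel dryer
# from a simple heat and mass balance. All numbers are illustrative
# assumptions, not values reported in the study.
CP_AIR = 1.006          # kJ/(kg*K), specific heat of air
H_FG = 2430.0           # kJ/kg, latent heat of vaporisation of water (~30 C)

def drying_air_requirements(product_mass_kg, mc_initial, mc_final,
                            drying_time_h, t_ambient, t_drying,
                            dw_kg_per_kg=0.008, irradiance_mj_m2_day=23.6,
                            collector_efficiency=0.4):
    """Return required air mass flow (kg/s), heat load (kW) and collector area (m^2)."""
    # mass balance: water to be removed (wet-basis moisture contents)
    dry_matter = product_mass_kg * (1.0 - mc_initial)
    water_out = product_mass_kg - dry_matter / (1.0 - mc_final)
    evaporation_rate = water_out / (drying_time_h * 3600.0)   # kg/s
    # assume the air picks up dw kg of water per kg of dry air
    air_flow = evaporation_rate / dw_kg_per_kg                # kg/s
    # energy balance: heat the air from ambient to drying temperature
    heat_load_kw = air_flow * CP_AIR * (t_drying - t_ambient)
    # collector sizing from average daily irradiance over the drying hours
    irradiance_kw_m2 = irradiance_mj_m2_day * 1000.0 / (drying_time_h * 3600.0)
    area = heat_load_kw / (collector_efficiency * irradiance_kw_m2)
    return air_flow, heat_load_kw, area

if __name__ == "__main__":
    flow, load, area = drying_air_requirements(
        product_mass_kg=40.0, mc_initial=0.85, mc_final=0.15,
        drying_time_h=8.0, t_ambient=35.0, t_drying=60.0)
    print(f"air flow ~ {flow:.3f} kg/s, heat load ~ {load:.2f} kW, "
          f"collector area ~ {area:.1f} m^2")
```

For these assumed inputs the balance returns an airflow of roughly 0.14 kg/s and a collector area on the order of 10 m²; the point is only to show how the sizing equations interact, not to reproduce the reported design.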
Procedia PDF Downloads 55
23948 Iterative Dynamic Programming for 4D Flight Trajectory Optimization
Authors: Kawser Ahmed, K. Bousson, Milca F. Coelho
Abstract:
4D flight trajectory optimization is one of the key ingredients to improve flight efficiency and to enhance air traffic capacity in current air traffic management (ATM). The present paper explores iterative dynamic programming (IDP) as a potential numerical optimization method for 4D flight trajectory optimization. IDP is an iterative version of the dynamic programming (DP) method. Due to its numerical framework, DP is very well suited to nonlinear discrete dynamic systems. The 4D waypoint representation of the flight trajectory is similar to discretization over a grid system; thus DP is a natural method for 4D flight trajectory optimization. However, the computational time and space complexity demanded by DP are enormous due to the immense number of grid points required to find the optimum, which prevents the use of DP in many practical high-dimensional problems. On the other hand, IDP has shown potential to deal successfully with high-dimensional optimal control problems even with a small number of grid points at each stage, which reduces the computational effort compared with the traditional DP approach. Although IDP has been applied successfully to chemical engineering problems, it is yet to be validated on 4D flight trajectory optimization problems. In this paper, IDP is successfully used to generate a minimum-length 4D optimal trajectory that avoids obstacles in its path, such as no-fly zones or residential areas when flying at low altitude to reduce noise pollution.
Keywords: 4D waypoint navigation, iterative dynamic programming, obstacle avoidance, trajectory optimization
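A simplified sketch of the iterative dynamic programming idea for a 2D waypoint path around a circular no-fly zone; the grid sizes, contraction factor, and obstacle are illustrative assumptions, and the real problem is 4D:

```python
import numpy as np

# Minimal IDP sketch: a minimum-length waypoint trajectory that avoids a
# circular no-fly zone. Each iteration runs a coarse DP over a small candidate
# grid centred on the previous best path, then contracts the grid span.

OBSTACLE_C, OBSTACLE_R = np.array([5.0, 0.0]), 1.5   # centre, radius (assumed)

def segment_hits_obstacle(p, q, samples=20):
    """Approximate collision test by sampling points along the segment."""
    ts = np.linspace(0.0, 1.0, samples)
    pts = p[None, :] + ts[:, None] * (q - p)[None, :]
    return np.any(np.linalg.norm(pts - OBSTACLE_C, axis=1) < OBSTACLE_R)

def idp_trajectory(start=(0.0, 0.0), goal=(10.0, 0.0), n_stages=11,
                   n_grid=7, span=4.0, contraction=0.7, iterations=8):
    xs = np.linspace(start[0], goal[0], n_stages)
    best_y = np.linspace(start[1], goal[1], n_stages)   # initial guess
    for _ in range(iterations):
        # candidate y values at each interior stage, centred on current best
        grids = [np.array([start[1]])] + \
                [best_y[k] + np.linspace(-span, span, n_grid)
                 for k in range(1, n_stages - 1)] + [np.array([goal[1]])]
        cost = [np.zeros(len(g)) for g in grids]
        choice = [np.zeros(len(g), dtype=int) for g in grids]
        # backward DP over the candidate grids (segment length as stage cost)
        for k in range(n_stages - 2, -1, -1):
            for i, y in enumerate(grids[k]):
                p = np.array([xs[k], y])
                best_c, best_j = np.inf, 0
                for j, y2 in enumerate(grids[k + 1]):
                    q = np.array([xs[k + 1], y2])
                    if segment_hits_obstacle(p, q):
                        continue
                    c = np.linalg.norm(q - p) + cost[k + 1][j]
                    if c < best_c:
                        best_c, best_j = c, j
                cost[k][i], choice[k][i] = best_c, best_j
        # forward pass: extract the best waypoints, then contract the grid
        idx = 0
        for k in range(n_stages):
            best_y[k] = grids[k][idx]
            if k < n_stages - 1:
                idx = choice[k][idx]
        span *= contraction
    return np.column_stack([xs, best_y]), cost[0][0]

waypoints, length = idp_trajectory()
print(f"trajectory length ~ {length:.3f}")
```

The contraction of the candidate grid around the incumbent trajectory is the mechanism that lets IDP get by with only a few grid points per stage, which is the property the paper exploits.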
Procedia PDF Downloads 162
23947 Application of KL Divergence for Estimation of Each Metabolic Pathway Genes
Authors: Shohei Maruyama, Yasuo Matsuyama, Sachiyo Aburatani
Abstract:
The development of methods to annotate unknown gene functions is an important task in bioinformatics. One such approach is the identification of the metabolic pathway that a gene is involved in. Gene expression data have been utilized for this identification, since gene expression data reflect various intracellular phenomena. However, it has been difficult to estimate gene function with high accuracy. The low accuracy of the estimation is considered to be caused by the difficulty of accurately measuring gene expression: even when measured under the same conditions, gene expression values usually vary. In this study, we propose a feature extraction method that focuses on the variability of gene expression in order to estimate genes' metabolic pathways accurately. First, we estimate the distribution of each gene's expression from replicate data. Next, we calculate the similarity between all gene pairs using the Kullback–Leibler (KL) divergence, a measure of the difference between distributions. Finally, we use the resulting similarity vectors as feature vectors and train a multiclass SVM to identify each gene's metabolic pathway. To evaluate the developed method, we applied it to budding yeast and trained the multiclass SVM to identify seven metabolic pathways. As a result, the accuracy obtained with the developed method was higher than that obtained from the raw gene expression data. Thus, our developed method combined with the KL divergence is useful for identifying genes' metabolic pathways.
Keywords: metabolic pathways, gene expression data, microarray, Kullback–Leibler divergence, KL divergence, support vector machines, SVM, machine learning
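A small sketch of the pipeline, assuming univariate Gaussians fitted to the replicate measurements and random placeholder data in place of the yeast microarray set:

```python
import numpy as np
from sklearn.svm import SVC

# Hedged sketch: fit a Gaussian to each gene's replicates, compute pairwise
# (symmetrised) KL divergences, and use each gene's divergence vector as its
# feature vector for a multiclass SVM. Data and labels below are placeholders.

def kl_gaussian(mu1, var1, mu2, var2):
    """KL(N(mu1, var1) || N(mu2, var2)) for univariate Gaussians."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

rng = np.random.default_rng(0)
n_genes, n_replicates = 60, 8
expr = rng.normal(loc=rng.uniform(2, 10, n_genes)[:, None],
                  scale=rng.uniform(0.3, 1.5, n_genes)[:, None],
                  size=(n_genes, n_replicates))
pathway = rng.integers(0, 7, n_genes)          # 7 metabolic pathway labels

mus, variances = expr.mean(axis=1), expr.var(axis=1, ddof=1)
features = np.zeros((n_genes, n_genes))
for i in range(n_genes):
    for j in range(n_genes):
        # symmetrised KL divergence as a similarity-type feature
        features[i, j] = (kl_gaussian(mus[i], variances[i], mus[j], variances[j])
                          + kl_gaussian(mus[j], variances[j], mus[i], variances[i]))

clf = SVC(kernel="rbf", decision_function_shape="ovr").fit(features, pathway)
print("training accuracy:", clf.score(features, pathway))
```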
Procedia PDF Downloads 403
23946 Impact of Instagram Food Bloggers on Consumer (Generation Z) Decision Making Process in Islamabad, Pakistan
Authors: Tabinda Sadiq, Tehmina Ashfaq Qazi, Hoor Shumail
Abstract:
Recently, the advent of emerging technology has created a new generation of restaurant marketing. This study explores the aspects that influence customers’ decision-making process in selecting a restaurant after reading food bloggers' reviews online. The motivation behind this research is to investigate the correlation between the credibility of the source and consumers' attitude toward restaurant visits. The data were collected by distributing a survey questionnaire through Google Forms, employing source credibility theory as the framework. A non-probability purposive sampling technique was used to collect the data. The questionnaire used a pre-developed and validated scale by Ohanian to measure the relationship. The researcher collected data from 250 respondents in order to investigate the influence of food bloggers on Generation Z's decision-making process. SPSS version 26 was used for statistical testing and data analysis. The findings of the survey revealed a moderate positive correlation between the variables. Hence, it can be concluded that food bloggers do have an impact on Generation Z's decision-making process.
Keywords: credibility, decision making, food bloggers, generation z, e-wom
Procedia PDF Downloads 73
23945 Performance Measurement of Logistics Systems for Thailand's Wholesales and Retails Industries by Data Envelopment Analysis
Authors: Pornpimol Chaiwuttisak
Abstract:
The study aims to compare the logistics performance of Thailand’s wholesale and retail trade industries (except motor vehicles, motorcycles, and stalls) by using data envelopment analysis (DEA). The Thailand Standard Industrial Classification 2009 (TSIC-2009) categorizes these industries into sub-group no. 45: wholesale and retail trade (except for the repair of motor vehicles and motorcycles), sub-group no. 46: wholesale trade (except motor vehicles and motorcycles), and sub-group no. 47: retail trade (except motor vehicles and motorcycles). The data used in the study were collected by the National Statistical Office, Thailand. The study considered four input factors: the number of companies, the number of logistics personnel, logistics training costs, and outsourced logistics management. The output factor is the percentage of enterprises having inventory management. The results showed that the average relative efficiency equals 27.87 percent for small-sized enterprises and 49.68 percent for medium-sized enterprises.
Keywords: DEA, wholesales and retails, logistics, Thailand
Procedia PDF Downloads 416
23944 Variation of Airfoil Pressure Profile Due to Confined Air Streams: Application in Gas-Oil Separators
Authors: Amir Hossein Haji, Nabeel Al-Rawahi, Gholamreza Vakili-Nezhaad
Abstract:
An innovative design has been examined for a gas-oil separator based on pressure reduction over an airfoil surface. The primary motivations are to shorten the release trajectory of the bubbles by minimizing the thickness of the oil layer as well as improving uniform pressure reduction zones. Restricted airflow over an airfoil is investigated for its effect on the pressure drop enhancement and the maximum attainable attack angle prior to the stall condition. Aerodynamic separation is delayed based on numerical simulation of Wortmann FX 63137 Airfoil in a confined domain using FLUENT 6.3.26. The proposed set up results in higher pressure drop compared with the free stream case. With the aim of optimum power consumption we have pursued further restriction to an air jet case over the airfoil. Then, a curved strip model is suggested for the air jet which can be applied as an analysis/design tool for the best performance conditions. Pressure reduction is shown to be inversely proportional to the curvature of the upper airfoil profile. This reduction occurs within the tracking zones where the air jet is effectively attached to the airfoil surface. The zero slope condition is suggested to estimate the onset of these zones after which the minimum curvature should be searched. The corresponding zero slope curvature is applied for estimation of the maximum pressure drop which shows satisfactory agreement with the simulation results.Keywords: airfoil, air jet, curved fluid flow, gas-oil separator
Procedia PDF Downloads 474
23943 Event Data Representation Based on Time Stamp for Pedestrian Detection
Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita
Abstract:
In association with the wave of electric vehicles (EV), low energy consumption systems have become more and more important. One of the key technologies to realize low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several attractive features, such as high temporal resolution, which can reach 1 Mframe/s, and a high dynamic range (120 dB). However, the property that contributes most to low energy consumption is its sparsity: the sensor only captures pixels whose intensity changes. In other words, there is no signal in areas without any intensity change, which makes this sensor more energy efficient than conventional sensors such as RGB cameras because redundant data can be removed. On the other hand, the data are difficult to handle because the format is completely different from an RGB image: the acquired signals are asynchronous and sparse, and each signal is composed of an x-y coordinate, a polarity (two values, +1 or -1), and a timestamp; it does not include intensity values such as RGB. Therefore, since existing algorithms cannot be used straightforwardly, a new processing algorithm has to be designed to cope with DVS data. In order to overcome the difficulties caused by the data format differences, most prior work builds frame data and feeds it to deep learning models such as convolutional neural networks (CNNs) for object detection and recognition. However, even when the data can be fed in this way, it is still difficult to achieve good performance due to the lack of intensity information. Although polarity is often used as intensity instead of an RGB pixel value, polarity information is clearly not rich enough. Considering this context, we propose to use the timestamp information as the data representation fed to deep learning. Concretely, we first build frame data divided by a certain time period, and then assign an intensity value according to the timestamp within each frame; for example, a high value is given to a recent signal. We expect this data representation to capture features of moving objects in particular, because the timestamps encode movement direction and speed. Using the proposed method, we built our own dataset with a DVS fixed on a parked car in order to develop an application for a surveillance system that can detect persons around the car. We consider the DVS one of the ideal sensors for surveillance purposes because it can run for a long time with low energy consumption in a largely static scene. For comparison, we reproduced a state-of-the-art method as a benchmark, which builds frames in the same way as ours but feeds polarity information to the CNN. We then measured the object detection performance of the benchmark and of our method on the same dataset. As a result, our method achieved an F1 score up to 7 points higher than the benchmark.
Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption
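A hedged sketch of the timestamp-based frame representation described above; the event array layout (x, y, polarity, timestamp) and the sensor resolution are assumptions:

```python
import numpy as np

# Events within a time window are accumulated into a frame whose pixel value
# encodes how recent the latest event at that pixel is (recent -> high value).

def events_to_timestamp_frame(events, height, width, t_start, t_end):
    """events: array of shape (N, 4) with columns x, y, polarity, t (seconds)."""
    frame = np.zeros((height, width), dtype=np.float32)
    mask = (events[:, 3] >= t_start) & (events[:, 3] < t_end)
    window = events[mask]
    x = window[:, 0].astype(int)
    y = window[:, 1].astype(int)
    # normalised recency in [0, 1]; later events overwrite earlier ones
    recency = (window[:, 3] - t_start) / (t_end - t_start)
    order = np.argsort(window[:, 3])            # ensure the latest event wins
    frame[y[order], x[order]] = recency[order]
    return frame

# toy usage with random events on an assumed 346x260 sensor
rng = np.random.default_rng(1)
n = 5000
ev = np.column_stack([rng.integers(0, 346, n), rng.integers(0, 260, n),
                      rng.choice([-1, 1], n), np.sort(rng.uniform(0, 0.033, n))])
frame = events_to_timestamp_frame(ev, height=260, width=346,
                                  t_start=0.0, t_end=0.033)
print(frame.shape, frame.max())
```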
Procedia PDF Downloads 97
23942 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea
Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi
Abstract:
Synoptic patterns from the surface up to the tropopause are very important for forecasting weather and atmospheric conditions, and there are many tools to prepare and analyze these maps. Reanalysis data, the outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data are used in forecasting centers around the world to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is a major challenge due to the complex topography, and there are different types of climate in these areas. In this research, we used two reanalysis datasets, the ECMWF Reanalysis 5th Generation (ERA5) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest ECMWF reanalysis; its temporal resolution is hourly, while that of NCEP/NCAR is six-hourly. Atmospheric parameters such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, and sea surface temperature were selected and analyzed, and different types of precipitation (rain and snow) were considered. The results showed that NCEP/NCAR better demonstrates the intensity of the atmospheric systems, whereas ERA5 is suitable for extracting parameter values at specific points. ERA5 is also appropriate for analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over it, and the sea surface temperature of the NCEP/NCAR product has low resolution near the coast. Both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS; however, due to the time lag, they are not suitable for forecast centers, and their application lies in research and the verification of meteorological models. Finally, ERA5 has better resolution with respect to the NCEP/NCAR reanalysis, but NCEP/NCAR data are available from 1948 and are appropriate for long-term research.
Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow
Procedia PDF Downloads 123
23941 Application of Observational Medical Outcomes Partnership-Common Data Model (OMOP-CDM) Database in Nursing Health Problems with Prostate Cancer-a Pilot Study
Authors: Hung Lin-Zin, Lai Mei-Yen
Abstract:
Prostate cancer is the most commonly diagnosed male cancer in the U.S., with a prevalence of around 1 in 8. The etiology of prostate cancer is still unknown, but some predisposing factors, such as age, black race, family history, and obesity, may increase the risk of the disease. In 2020, a total of 7,178 Taiwanese people were newly diagnosed with prostate cancer, accounting for 5.88% of all cancer cases, and the incidence rate ranked fifth among men. In that year, the total number of deaths from prostate cancer was 1,730, accounting for 3.45% of all cancer deaths; the death rate ranked sixth among men and accounted for 94.34% of deaths from cancers of the male reproductive organs. A search of the domestic and foreign literature on analyses based on the Observational Medical Outcomes Partnership (OMOP) database shows that, although nearly a hundred papers have been published, studies of nursing-related health problems and nursing measures built on the OMOP common data model databases of medical institutions are extremely rare. The OMOP common data model analysis platform is a system developed by the FDA in 2007 that uses a common data model (CDM) to analyze and monitor healthcare data. It is important to build up relevant nursing information from the OMOP-CDM database to assist our daily practice. Therefore, we chose prostate cancer patients, a common group in our care, and used the OMOP-CDM database to explore their common associated health problems. With the assistance of OMOP-CDM database analysis, we expect earlier diagnosis and prevention of comorbidities in prostate cancer patients in order to improve patient care.
Keywords: OMOP, nursing diagnosis, health problem, prostate cancer
Procedia PDF Downloads 69
23940 Investigation of Learning Challenges in Building Measurement Unit
Authors: Argaw T. Gurmu, Muhammad N. Mahmood
Abstract:
The objective of this research is to identify architecture and construction management students’ learning challenges in the building measurement unit. The research used survey data collected from students who completed the building measurement unit. NVivo qualitative data analysis software was used to identify relevant themes. The analysis of the qualitative data revealed major learning difficulties such as an inadequate number of practice questions for the examination, inability to work as a team, lack of detailed understanding of the prerequisite units, insufficient time allocated for tutorials, and incompatibility of lecture and tutorial schedules. The output of this research can be used as a basis for improving teaching and learning activities in construction measurement units.
Keywords: building measurement, construction management, learning challenges, evaluate survey
Procedia PDF Downloads 138
23939 Using Data-Driven Model on Online Customer Journey
Authors: Ing-Jen Hung, Tzu-Chien Wang
Abstract:
Nowadays, customers can easily interact with firms through miscellaneous online ads on different channels. In other words, customers now have innumerable options and virtually limitless time to accomplish their commercial activities with firms, individualizing their own online customer journeys. This kind of convenience emphasizes the importance of allocating online advertisements across different channels. Therefore, a profound understanding of customer behavior can yield considerable benefit by optimizing fund allocation across diverse ad channels. To achieve this objective, many firms utilize numerical methodologies to create data-driven advertisement policies. In our research, we aim to exploit online customer click data to discover the correlations between channels and their sequential relations. We use an LSTM to deal with the sequential property of our data and compare its accuracy with non-sequential methods, such as a CART decision tree and logistic regression. Besides, we also classify our customers into several groups by their behavioral characteristics to perceive the differences between groups as customer portraits. As a result, we discover a distinct customer journey under each customer portrait. Our article provides some insights into marketing research and can help firms formulate online advertising criteria.
Keywords: LSTM, customer journey, marketing, channel ads
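A minimal sketch, not the authors' model, of feeding a customer's channel-click sequence to an LSTM to predict conversion; the channel vocabulary, sequence length, and random data are placeholders:

```python
import torch
import torch.nn as nn

class JourneyLSTM(nn.Module):
    """Toy LSTM over a sequence of ad-channel ids, predicting a conversion logit."""
    def __init__(self, n_channels=8, emb_dim=16, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(n_channels + 1, emb_dim, padding_idx=0)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                      # x: (batch, seq_len) channel ids
        emb = self.embed(x)
        _, (h_n, _) = self.lstm(emb)           # final hidden state summarises the journey
        return self.head(h_n[-1]).squeeze(-1)  # conversion logit

# toy training loop on random journeys
model = JourneyLSTM()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()
x = torch.randint(1, 9, (64, 12))              # 64 journeys, 12 touchpoints each
y = torch.randint(0, 2, (64,)).float()         # converted or not
for _ in range(5):
    optimiser.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimiser.step()
print("final loss:", float(loss))
```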
Procedia PDF Downloads 121
23938 The Design and Modeling of Intelligent Learners Assistance System (ILASS)
Authors: Jelili Kunle Adedeji, Toeb Akorede Akinbola
Abstract:
The problem of vehicle mishaps resulting from miscalculation, recklessness, or the malfunction of some part of a vehicle is acknowledged to be a global issue. In most cases, it results in death or serious injury; all over the world, the issue has become a nightmare for stakeholders seeking to curb mishaps on our roads caused by these endemic factors. Hence, this research examined the design of a device, specifically for learner drivers, that can lead to a society of intelligent vehicles (traffic) without withdrawing driving authority from them, unlike pre-existing systems. Though ILASS shares many principles with existing advanced driver assistance systems, there are two fundamental differences between ILASS and existing systems. Firstly, ILASS is meant to accept continuous input from the throttle at all times, so that the device does not constrain the driving process unnecessarily and allows a change of speed at any point in time. Secondly, it makes use of a variable threshold distance between the host vehicle and the front vehicle, which can be set by the host driver under the constraint of the road maintenance agency, which communicates the minimum possible threshold for different lanes to the host vehicle. The results obtained from the simulation of the ILASS system show that ILASS is a good solution to road accidents, particularly those which occur as a result of driving at high speed.
Keywords: front-vehicle, host-speed, threshold-distance, ILASS
Procedia PDF Downloads 181
23937 A Secure Proxy Signature Scheme with Fault Tolerance Based on RSA System
Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi
Abstract:
Due to the rapid growth in modern communication systems, fault tolerance and data security are two important issues in a secure transaction. During the transmission of data between the sender and receiver, errors may occur frequently. Therefore, the sender must re-transmit the data to the receiver in order to correct these errors, which makes the system very feeble. To improve the scalability of the scheme, we present a secure proxy signature scheme with fault tolerance over an efficient and secure authenticated key agreement protocol based on RSA system. Authenticated key agreement protocols have an important role in building a secure communications network between the two parties.Keywords: proxy signature, fault tolerance, rsa, key agreement protocol
Procedia PDF Downloads 286
23936 Sensitivity Enhancement in Graphene Based Surface Plasmon Resonance (SPR) Biosensor
Authors: Angad S. Kushwaha, Rajeev Kumar, Monika Srivastava, S. K. Srivastava
Abstract:
A lot of research work is going on in the field of graphene-based SPR biosensors. In the conventional SPR-based biosensor, graphene is used as a biomolecular recognition element: graphene adsorbs biomolecules through its carbon-based ring structure via sp2 hybridization. The proposed SPR biosensor configuration will open a new avenue for efficient biosensing by taking advantage of graphene and its fascinating nanofabrication properties. In the present study, we have investigated an SPR biosensor based on graphene mediated by zinc oxide (ZnO) and gold. In the proposed structure, the prism (BK7) base is coated with zinc oxide, followed by gold and graphene. Using the waveguide approach with the transfer matrix method, the proposed structure has been studied theoretically. We have analyzed the reflectance versus incidence angle curve using a He-Ne laser of wavelength 632.8 nm. The angle at which the reflectance is minimized is termed the SPR angle, and the shift in the SPR angle is responsible for biosensing. From the analysis of the reflectivity curve, we found that there is a shift in the SPR angle as biomolecules attach to the graphene surface. The graphene layer also enhances the sensitivity of the SPR sensor compared with the conventional sensor, and the sensitivity increases further with the number of graphene layers. Thus, in our proposed biosensor we obtain the minimum possible reflectivity together with an optimum level of sensitivity.
Keywords: biosensor, sensitivity, surface plasmon resonance, transfer matrix method
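A hedged transfer-matrix sketch for the p-polarised reflectance of a prism/ZnO/gold/graphene stack at 632.8 nm; all refractive indices and thicknesses are rough illustrative values, not the parameters used in the study:

```python
import numpy as np

WAVELENGTH = 632.8e-9  # He-Ne laser wavelength (m)

def reflectance_p(n_layers, d_layers, theta_deg, n_prism=1.515, n_analyte=1.33):
    """R_p of prism | layers | analyte at incidence angle theta (degrees)."""
    beta = n_prism * np.sin(np.radians(theta_deg))   # conserved tangential index

    def q(n):                                        # TM admittance of a medium
        return np.sqrt(n ** 2 - beta ** 2 + 0j) / n ** 2

    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        # characteristic matrix of one homogeneous layer
        delta = 2 * np.pi / WAVELENGTH * d * np.sqrt(n ** 2 - beta ** 2 + 0j)
        M = M @ np.array([[np.cos(delta), -1j * np.sin(delta) / q(n)],
                          [-1j * q(n) * np.sin(delta), np.cos(delta)]])
    q0, qs = q(n_prism), q(n_analyte)
    num = (M[0, 0] + M[0, 1] * qs) * q0 - (M[1, 0] + M[1, 1] * qs)
    den = (M[0, 0] + M[0, 1] * qs) * q0 + (M[1, 0] + M[1, 1] * qs)
    return abs(num / den) ** 2

# assumed stack: ZnO, gold, monolayer graphene (indices/thicknesses are guesses)
n_layers = [2.00, 0.18 + 3.0j, 3.0 + 1.15j]
d_layers = [20e-9, 50e-9, 0.34e-9]
angles = np.linspace(60.0, 80.0, 800)
R = np.array([reflectance_p(n_layers, d_layers, a) for a in angles])
print(f"SPR dip near {angles[R.argmin()]:.2f} degrees, R_min = {R.min():.4f}")
```

Sweeping the angle and locating the reflectance minimum is exactly how the SPR angle shift would be read off when the analyte index or the number of graphene layers is changed.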
Procedia PDF Downloads 419
23935 Estimating the Receiver Operating Characteristic Curve from Clustered Data and Case-Control Studies
Authors: Yalda Zarnegarnia, Shari Messinger
Abstract:
Receiver operating characteristic (ROC) curves have been widely used in medical research to illustrate the performance of a biomarker in correctly distinguishing diseased and non-diseased groups. Correlated biomarker data arise in study designs that include subjects who share genetic or environmental factors. Information about this correlation might help identify family members at increased risk of disease development and may lead to initiating treatment to slow or stop progression to disease. Approaches appropriate to a case-control design matched by family identification must accommodate the correlation inherent in the design when estimating the biomarker’s ability to differentiate between cases and controls, as well as handle estimation from a matched case-control design. This talk reviews methods developed for ROC curve estimation in settings with correlated data from case-control designs and discusses the limitations of current methods for analyzing correlated familial paired data. An alternative approach using conditional ROC curves is demonstrated to provide appropriate ROC curves for correlated paired data. The proposed approach uses the information about the correlation among biomarker values, producing conditional ROC curves that evaluate the ability of a biomarker to discriminate between diseased and non-diseased subjects in a familial paired design.
Keywords: biomarker, correlation, familial paired design, ROC curve
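A minimal sketch of an ordinary (unconditional) empirical ROC curve from simulated biomarker values; extending it to the familial paired setting would mean conditioning on family-level information, which is the point of the conditional ROC approach above:

```python
import numpy as np

def roc_points(disease, marker):
    """Empirical ROC: sweep a threshold over the observed biomarker values."""
    thresholds = np.unique(marker)[::-1]
    tpr, fpr = [], []
    for t in thresholds:
        pred = marker >= t
        tpr.append(np.mean(pred[disease == 1]))       # sensitivity
        fpr.append(np.mean(pred[disease == 0]))       # 1 - specificity
    return np.array(fpr), np.array(tpr)

# simulated placeholder data, not study data
rng = np.random.default_rng(2)
disease = np.repeat([1, 0], 100)
marker = np.concatenate([rng.normal(1.0, 1.0, 100),   # diseased: shifted mean
                         rng.normal(0.0, 1.0, 100)])
fpr, tpr = roc_points(disease, marker)
auc = np.trapz(tpr, fpr)                               # area under the curve
print(f"empirical AUC ~ {auc:.3f}")
```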
Procedia PDF Downloads 240
23934 Code-Switching among Local UCSI Stem and N-Stem Undergraduates during Knowledge Sharing
Authors: Adeela Abu Bakar, Minder Kaur, Parthaman Singh
Abstract:
In the Malaysian education system, formal English language learning takes place in a content-based classroom (CBC). Until recently, there have been few studies in Malaysia researching the effects of code-switching (CS) behaviour on students’ knowledge sharing (KS) with their peers. The aim of this study is to investigate the frequency of, reasons for, and effect that CS, from English to Bahasa Melayu, has among local STEM and N-STEM undergraduates on KS in a content-based classroom. The study employs a mixed-method research design with a questionnaire and interviews as the instruments. The data were collected through the distribution of questionnaires and interviews with the undergraduates. The quantitative data were analysed using SPSS in simple frequencies and percentages, whereas the qualitative data were organized into themes and then analysed. The findings show that N-STEM undergraduates code-switch more than STEM undergraduates. In addition, both STEM and N-STEM undergraduates agree that CS acts as a catalyst for KS in a content-based classroom; however, they also acknowledge that excessive use of CS can be a hindrance to KS. The findings of the study can provide STEM and N-STEM undergraduates, education policymakers, language teachers, university educators, and students with significant insights into the role of CS in KS in a content-based classroom. Recommendations for future studies include increasing the number of participants and including observation in the data collection.
Keywords: switching, content-based classroom, content and language integrated learning, knowledge sharing, STEM and N-STEM undergraduates
Procedia PDF Downloads 135
23933 Fuzzy Multi-Component DEA with Shared and Undesirable Fuzzy Resources
Authors: Jolly Puri, Shiv Prasad Yadav
Abstract:
Multi-component data envelopment analysis (MC-DEA) is a popular technique for measuring the aggregate performance of decision making units (DMUs) along with their components. However, conventional MC-DEA is limited to crisp input and output data, which may not always be available in exact form. In real-life problems, data may be imprecise or fuzzy. Therefore, in this paper, we propose (i) a fuzzy MC-DEA (FMC-DEA) model in which shared and undesirable fuzzy resources are incorporated, (ii) a transformation of the proposed FMC-DEA model into a pair of crisp models using the α-cut approach, (iii) definitions of the fuzzy aggregate performance of a DMU and the fuzzy efficiencies of its components as fuzzy numbers, and (iv) a numerical example to validate the proposed approach.
Keywords: multi-component DEA, fuzzy multi-component DEA, fuzzy resources, decision making units (DMUs)
Procedia PDF Downloads 407
23932 A Computational Cost-Effective Clustering Algorithm in Multidimensional Space Using the Manhattan Metric: Application to the Global Terrorism Database
Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami
Abstract:
The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new cost-effective algorithms in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means-based algorithm is developed and tested. The new algorithm aims to reduce the computational load without significantly affecting the quality of the clusterings. The algorithm uses the City Block distance and a new stop criterion to guarantee convergence. Experiments conducted on a real data set show its high performance when compared with the original k-means version.
Keywords: pattern recognition, global terrorism database, Manhattan distance, k-means clustering, terrorism data analysis
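A sketch, not the authors' implementation, of a k-means-style loop that uses the City Block distance and a centroid-shift stop criterion; the component-wise median update is one natural choice under the L1 metric and is an assumption here:

```python
import numpy as np

def manhattan_kmeans(X, k, max_iter=100, tol=1e-4, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(max_iter):
        # assign each point to the centroid with the smallest L1 distance
        dists = np.abs(X[:, None, :] - centroids[None, :, :]).sum(axis=2)
        labels = dists.argmin(axis=1)
        # the component-wise median minimises the L1 criterion within a cluster
        new_centroids = np.array([np.median(X[labels == j], axis=0)
                                  if np.any(labels == j) else centroids[j]
                                  for j in range(k)])
        shift = np.abs(new_centroids - centroids).sum()
        centroids = new_centroids
        if shift < tol:               # stop criterion: centroids barely move
            break
    return labels, centroids

# toy usage on two synthetic clusters
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(5, 1, (100, 4))])
labels, centroids = manhattan_kmeans(X, k=2)
print(np.bincount(labels))
```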
Procedia PDF Downloads 386
23931 Acquisition of Murcian Lexicon and Morphology by L2 Spanish Immigrants: The Role of Social Networks
Authors: Andrea Hernandez Hurtado
Abstract:
Research on social networks (SNs) -- the interactions individuals share with others has shed important light in helping to explain differential use of variable linguistic forms, both in L1s and L2s. Nevertheless, the acquisition of nonstandard L2 Spanish in the Region of Murcia, Spain, and how learners interact with other speakers while sojourning there have received little attention. Murcian Spanish (MuSp) was widely influenced by Panocho, a divergent evolution of Hispanic Latin, and differs from the more standard Peninsular Spanish (StSp) in phonology, morphology, and lexicon. For instance, speakers from this area will most likely palatalize diminutive endings, producing animalico [̩a.ni.ma.ˈli.ko] instead of animalito [̩a.ni.ma.ˈli.to] ‘little animal’. Because L1 speakers of the area produce and prefer salient regional lexicon and morphology (particularly the palatalized diminutive -ico) in their speech, the current research focuses on how international residents in the Region of Murcia use Spanish: (1) whether or not they acquire (perceptively and/or productively) any of the salient regional features of MuSp, and (2) how their SNs explain such acquisition. This study triangulates across three tasks -recognition, production, and preference- addressing both lexicon and morphology, with each task specifically created for the investigation of MuSp features. Among other variables, the effects of L1, residence, and identity are considered. As an ongoing dissertation research, data are currently being gathered through an online questionnaire. So far, 7 participants from multiple nationalities have completed the survey, although a minimum of 25 are expected to be included in the coming months. Preliminary results revealed that MuSp lexicon and morphology were successfully recognized by participants (p<.001). In terms of regional lexicon production (10.0%) and preference (47.5%), although participants showed higher percentages of StSp, results showed that international residents become aware of stigmatized lexicon and may incorporate it into their language use. Similarly, palatalized diminutives (production 14.2%, preference 19.0%) were present in their responses. The Social Network Analysis provided information about participants’ relationships with their interactants, as well as among them. Results indicated that, generally, when residents were more immersed in the culture (i.e., had more Murcian alters) they produced and preferred more regional features. This project contributes to the knowledge of language variation acquisition in L2 speakers, focusing on a stigmatized Spanish dialect and exploring how stigmatized varieties may affect L2 development. Results will show how L2 Spanish speakers’ language is affected by their stay in Murcia. This, in turn, will shed light on the role of SNs in language acquisition, the acquisition of understudied and marginalized varieties, and the role of immersion on language acquisition. As the first systematic account on the acquisition of L2 Spanish lexicon and morphology in the Region of Murcia, it lays important groundwork for further research on the connection between SNs and the acquisition of regional variants, applicable to Murcia and beyond.Keywords: international residents, L2 Spanish, lexicon, morphology, nonstandard language acquisition, social networks
Procedia PDF Downloads 77
23930 A Low-Latency Quadratic Extended Domain Modular Multiplier for Bilinear Pairing Based on Non-Least Positive Multiplication
Authors: Yulong Jia, Xiang Zhang, Ziyuan Wu, Shiji Hu
Abstract:
The calculation of bilinear pairings is the core of the SM9 algorithm, which relies on the underlying prime domain and quadratic extension domain arithmetic. Among the field operations, modular multiplication is the most time-consuming part. Therefore, the underlying modular multiplication algorithm is optimized to maximize the speed of the bilinear pairing computation. This paper uses a modular multiplication method based on the non-least positive (NLP) form, combined with Karatsuba and schoolbook multiplication, to improve the Montgomery algorithm. At the same time, according to the characteristics of multiplication in the quadratic extension domain, a quadratic extension domain Fp2-NLP modular multiplication algorithm for bilinear pairings is proposed, which effectively reduces the operation time of modular multiplication in the quadratic extension domain. The multiplication unit in the quadratic extension domain is implemented using an SMIC 55 nm process, and two different implementation architectures are designed to cope with different application scenarios. Compared with the existing literature, the output latency of this design can reach a minimum of 15 cycles. The shortest time for calculating the (AB+CD)r⁻¹ mod form is 37.5 ns, and the overall area-time product (AT) is 11400. The final R-ate pairing hardware accelerator consumes 2670k equivalent logic gates and 1.8 ms of computing time in the 55 nm process.
Keywords: SM9, hardware, NLP, Montgomery
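A plain software reference for Montgomery modular multiplication, the operation the NLP-based design accelerates in hardware; the modulus below is a placeholder, and Python 3.8+ is assumed for the modular inverse:

```python
def montgomery_setup(n, word_bits):
    """Precompute R and n' with R = 2^word_bits > n and n*n' = -1 (mod R)."""
    r = 1 << word_bits
    assert n % 2 == 1 and n < r
    n_prime = -pow(n, -1, r) % r
    return r, n_prime

def montgomery_mul(a_bar, b_bar, n, r, n_prime):
    """Given a_bar = a*R mod n and b_bar = b*R mod n, return (a*b)*R mod n (REDC)."""
    t = a_bar * b_bar
    m = (t * n_prime) % r                  # reduction factor
    u = (t + m * n) // r                   # exact division: t + m*n is divisible by R
    return u - n if u >= n else u

# toy usage with a placeholder odd modulus
n = 0xFFFFFFFB00000000FFFFFFFFFFFFFFFF
r, n_prime = montgomery_setup(n, word_bits=n.bit_length())
a, b = 123456789, 987654321
a_bar, b_bar = (a * r) % n, (b * r) % n
prod_bar = montgomery_mul(a_bar, b_bar, n, r, n_prime)
result = montgomery_mul(prod_bar, 1, n, r, n_prime)   # leave Montgomery form
assert result == (a * b) % n
print(hex(result))
```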
Procedia PDF Downloads 7
23929 Being Your Own First Responder: A Training to Identify and Respond to Mental Health
Authors: Joe Voshall, Leigha Shoup
Abstract:
In 2022, the Ohio Peace Officer Training Council and the Attorney General required officers to complete a minimum of 24 hours of continued professional training for the year. Much of the training was based on Mental Health or similarly related topics. This includes Officer Wellness and Officer Mental Health. It is becoming clearer that the stigma of Officer / First Responder Mental Health is a topic that is becoming more prevalently faced. To assist officers and first responders in facing mental health issues, we are developing new training. This training will aid in recognizing mental health-related issues in officers/first responders and citizens, as well as further using the same information to better respond and interact with one another and the public. In general, society has many varying views of mental health, much of which is largely over-sensationalized by television, movies, and other forms of entertainment. There has also been a stigma in law enforcement / first responders related to mental health and being weak as a result of on-the-job-related trauma-induced struggles. It is our hope this new training will assist officers and first responders in not only positively facing and addressing their mental health but using their own experience and education to recognize signs and symptoms of mental health within individuals in the community. Further, we hope that through this recognition, officers and first responders can use their experiences and more in-depth understanding to better interact within the field and with the public. Through recognition and better understanding of mental health issues and more positive interaction with the public, additional achievements are likely to result. This includes in the removal of bias and stigma for everyone.Keywords: law enforcement, mental health, officer related mental health, trauma
Procedia PDF Downloads 164
23928 AniMoveMineR: Animal Behavior Exploratory Analysis Using Association Rules Mining
Authors: Suelane Garcia Fontes, Silvio Luiz Stanzani, Pedro L. Pizzigatti Corrêa, Ronaldo G. Morato
Abstract:
Environmental changes and major natural disasters have become more prevalent in the world due to the damage that humanity has caused to nature, and this damage directly affects the lives of animals. Thus, the study of animal behavior and of animals' interactions with the environment can provide knowledge that guides researchers and public agencies in preservation and conservation actions. Exploratory analysis of animal movement can determine patterns of animal behavior, and with technological advances the ability to track animals, and consequently the scope of behavioral studies, has expanded. There is a lot of research on animal movement and behavior, but we note the absence of a proposal that combines these resources, allows exploratory analysis of animal movement, and provides statistical measures of individual animal behavior and its interaction with the environment. The contribution of this paper is to present the framework AniMoveMineR, a unified solution that aggregates trajectory analysis and data mining techniques to explore animal movement data and provide a first step toward answering questions about individual animal behavior and animals' interactions with one another over time and space. We evaluated the framework using data from jaguars monitored in the city of Miranda, Pantanal, Brazil, in order to verify whether AniMoveMineR allows the interaction level between these jaguars to be identified. The results were positive and provided indications about the individual behavior of the jaguars and about which jaguars have the highest or lowest correlation.
Keywords: data mining, data science, trajectory, animal behavior
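AniMoveMineR itself is an R framework; the sketch below uses Python only to illustrate the underlying support/confidence computation over invented movement "transactions" (individual, habitat class, time of day):

```python
from itertools import combinations
from collections import Counter

# invented transactions for illustration only
transactions = [
    {"jaguar_A", "riverine_forest", "night"},
    {"jaguar_A", "riverine_forest", "day"},
    {"jaguar_B", "riverine_forest", "night"},
    {"jaguar_B", "grassland", "day"},
    {"jaguar_A", "riverine_forest", "night"},
]

def frequent_itemsets(transactions, min_support=0.4, max_len=2):
    """Return itemsets (size <= max_len) whose support meets the threshold."""
    n = len(transactions)
    counts = Counter(frozenset(c)
                     for t in transactions
                     for size in range(1, max_len + 1)
                     for c in combinations(sorted(t), size))
    return {items: cnt / n for items, cnt in counts.items() if cnt / n >= min_support}

def rules(freq):
    """Single-antecedent association rules with support and confidence."""
    out = []
    for items, supp in freq.items():
        if len(items) < 2:
            continue
        for a in items:
            antecedent = frozenset([a])
            if antecedent in freq:
                out.append((set(antecedent), set(items - antecedent),
                            supp, supp / freq[antecedent]))
    return out

for lhs, rhs, supp, conf in rules(frequent_itemsets(transactions)):
    print(f"{lhs} -> {rhs}  support={supp:.2f} confidence={conf:.2f}")
```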
Procedia PDF Downloads 144
23927 Effect of Hollow and Solid Recycled-Poly Fibers on the Mechanical and Morphological Properties of Short-Fiber-Reinforced Polypropylene Composites
Authors: S. Kerakra, S. Bouhelal, M. Poncot
Abstract:
The aim of this study is to give a comprehensive overview of the effect of short hollow and solid recycled polyethylene terephthalate (PET) fibers of different breaking tenacities on the mechanical and morphological properties of reinforced isotactic polypropylene (iPP) composites. Composites of iPP with 3, 7, and 10 wt% of solid and hollow recycled PET fibers were prepared by batch melt mixing in a Brabender. The incorporation of solid recycled PET fibers in isotactic polypropylene increases the Young's modulus of iPP to some extent, while with hollow fibers the modulus increases proportionally with fiber content. An improvement in the storage modulus and an upward shift in the glass transition temperatures of the hollow fiber/iPP composites were determined from the DMA results. The morphology of the composites was examined by scanning electron microscopy (SEM) and polarized optical microscopy (OM), showing a good dispersion of the hollow fibers; their flexible behaviour (folding, bending) was also observed. However, a weak interaction between the polymer and fiber phases was found. Polymers can be effectively reinforced with short hollow recycled PET fibers due to characteristics such as recyclability, light weight, and flexibility, which allow the absorption of the energy of a striker with minimum damage to the matrix. Aiming to improve the affinity between the matrix and the recycled hollow PET fibers, the addition of compatibilizers, such as maleic anhydride, is suggested.
Keywords: isotactic polypropylene, hollow recycled PET fibers, solid recycled-PET fibers, composites, short fiber, scanning electron microscope
Procedia PDF Downloads 277
23926 A Study on Using Network Coding for Packet Transmissions in Wireless Sensor Networks
Authors: Rei-Heng Cheng, Wen-Pinn Fang
Abstract:
A wireless sensor network (WSN) is composed of a large number of sensors and one or a few base stations, where each sensor is responsible for detecting specific event information, which is sent back to the base station(s). However, how to reduce energy consumption in order to extend the network lifetime is a problem that cannot be ignored in wireless sensor networks. Since the sensor network is used to monitor a region or specific events, how the information can be reliably sent back to the base station is surely important. Network coding techniques are often used to enhance the reliability of network transmission. When a node needs to send out M data packets, it encodes these data with redundant data and sends out M + R packets in total. If the receiver can get any M packets out of these M + R packets, it can decode them and recover the original M data packets. Transmitting redundant packets will certainly result in excess energy consumption. This paper explores the relationship between the quality of wireless transmission and the number of redundant packets. Ideally, each sensor can overhear nearby transmissions, learn the wireless transmission quality around it, and dynamically determine the number of redundant packets used in network coding.
Keywords: energy consumption, network coding, transmission reliability, wireless sensor networks
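A small sketch of the redundancy trade-off, assuming independent packet losses with delivery probability p (the real channel quality is exactly what each sensor would have to learn by overhearing):

```python
from math import comb

def decode_probability(m, r, p):
    """P(at least m of m+r packets arrive), assuming independent losses."""
    n = m + r
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(m, n + 1))

M = 8                      # original data packets
for link_quality in (0.7, 0.8, 0.9):
    probs = [decode_probability(M, r, link_quality) for r in range(0, 6)]
    print(f"p={link_quality}: " +
          ", ".join(f"R={r}: {q:.3f}" for r, q in enumerate(probs)))
```

The table printed by this sketch shows how quickly the decoding probability saturates with R, which is the quantity a node would trade off against the energy cost of each extra redundant packet.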
Procedia PDF Downloads 391
23925 Pattern the Location and Area of Earth-Dumping Stations from Vehicle GPS Data in Taiwan
Authors: Chun-Yuan Chen, Ming-Chang Li, Xiu-Hui Wen, Yi-Ching Tu
Abstract:
This study explores how GPS (Global Positioning System) data applied to tracing construction vehicles, such as trucks or cranes, can help pattern the earth-dumping stations of traffic construction in Taiwan. Traffic construction in this research is defined as the engineering of high-speed railways, expressways, and works that extend over distances of more than kilometers. Auditing the locations of earth-dumping stations and checking their compliance with regulations is one of the important tasks of the Taiwan EPA; the earth-dumping station is known as one source of particulate matter air pollution during the construction process. Because GPS data can be analyzed quickly and used conveniently, this study tried to identify dumping stations by modeling vehicle tracks from GPS data during the work cycle of construction. The GPS data were obtained from 13 vehicles related to an expressway construction project in central Taiwan. The GPS footprints were converted to Keyhole Markup Language (KML) files so that the tracks of the trucks could be patterned by computer applications; the data were collected over about eight months, from February to October 2017. The GPS footprints identified a dumping station and outlined the areas of earthwork, and the results were passed to the Taiwan EPA for on-site inspection. The Taiwan EPA issued advisory comments to the agency in charge of the construction to prevent air pollution. Compared with the common method of inspecting the environment by manual data collection, the GPS-with-KML patterning and modeling method used in this study consumes less time. On the other hand, monitoring the GPS data from construction vehicles could be useful for the administration in the development and implementation of environmental management strategies.
Keywords: automatic management, earth-dumping station, environmental management, Global Positioning System (GPS), particulate matter, traffic construction
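A hedged sketch of turning GPS fixes into a minimal KML track for inspection in a map viewer; the coordinates are invented and the KML contains only a single LineString:

```python
def gps_to_kml(points, name="truck_track"):
    """Build a minimal KML document from a list of (lon, lat) GPS fixes."""
    coords = " ".join(f"{lon},{lat},0" for lon, lat in points)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>{name}</name>
      <LineString><coordinates>{coords}</coordinates></LineString>
    </Placemark>
  </Document>
</kml>"""

# invented fixes roughly in central Taiwan, for illustration only
track = [(120.6736, 24.1478), (120.6791, 24.1502), (120.6843, 24.1531)]
with open("truck_track.kml", "w") as fh:
    fh.write(gps_to_kml(track))
```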
Procedia PDF Downloads 164
23924 Designing and Implementing a Tourist-Guide Web Service Based on Volunteer Geographic Information Using Open-Source Technologies
Authors: Javad Sadidi, Ehsan Babaei, Hani Rezayan
Abstract:
The advent of web 2.0 gives a possibility to scale down the costs of data collection and mapping, specifically if the process is done by volunteers. Every volunteer can be thought of as a free and ubiquitous sensor to collect spatial, descriptive as well as multimedia data for tourist services. The lack of large-scale information, such as real-time climate and weather conditions, population density, and other related data, can be considered one of the important challenges in developing countries for tourists to make the best decision in terms of time and place of travel. The current research aims to design and implement a spatiotemporal web map service using volunteer-submitted data. The service acts as a tourist-guide service in which tourists can search interested places based on their requested time for travel. To design the service, three tiers of architecture, including data, logical processing, and presentation tiers, have been utilized. For implementing the service, open-source software programs, client and server-side programming languages (such as OpenLayers2, AJAX, and PHP), Geoserver as a map server, and Web Feature Service (WFS) standards have been used. The result is two distinct browser-based services, one for sending spatial, descriptive, and multimedia volunteer data and another one for tourists and local officials. Local official confirms the veracity of the volunteer-submitted information. In the tourist interface, a spatiotemporal search engine has been designed to enable tourists to find a tourist place based on province, city, and location at a specific time of interest. Implementing the tourist-guide service by this methodology causes the following: the current tourists participate in a free data collection and sharing process for future tourists, a real-time data sharing and accessing for all, avoiding a blind selection of travel destination and significantly, decreases the cost of providing such services.Keywords: VGI, tourism, spatiotemporal, browser-based, web mapping
Procedia PDF Downloads 98
23923 Effect of Bank Specific and Macro Economic Factors on Credit Risk of Islamic Banks in Pakistan
Authors: Mati Ullah, Shams Ur Rahman
Abstract:
The purpose of this research study is to investigate the effect of macroeconomic and bank-specific factors on credit risk in Islamic banking in Pakistan. The future of financial institutions largely depends on how well they manage risks, and credit risk is an important type of risk affecting the banking sector. The current study uses quarterly data for a period of six years, from 1 July 2014 to 30 June 2020. The data set consists of secondary data extracted from the websites of the State Bank and the World Bank and from the financial statements of the concerned banks. In this study, the ordinary least squares (OLS) model was used for the analysis of the data. The results supported the hypothesis that macroeconomic and bank-specific factors have a significant effect on credit risk. Among the macroeconomic variables, inflation and the exchange rate have significant positive effects on credit risk, whereas gross domestic product has a significant negative relationship with credit risk; the corporate rate has no significant relation with credit risk. Among the internal variables, size, management efficiency, net profit share income, and capital adequacy are shown to influence credit risk positively and significantly, whereas the loan-to-deposit ratio has a negative but insignificant relationship with credit risk. The contribution of this article is that similar conclusions have been reached regarding the influence of bank-specific factors on credit risk.
Keywords: credit risk, Islamic banks, macroeconomic variables, bank-specific variables
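A sketch of the kind of OLS specification described above, with simulated quarterly series standing in for the actual bank-specific and macroeconomic data; all variable names are assumptions based on the abstract, not the authors' dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 24                                           # six years of quarterly data
df = pd.DataFrame({
    "inflation": rng.normal(6, 2, n),
    "exchange_rate": rng.normal(140, 10, n),
    "gdp_growth": rng.normal(4, 1, n),
    "bank_size": rng.normal(12, 0.5, n),         # e.g. log of total assets
    "capital_adequacy": rng.normal(0.15, 0.02, n),
    "loan_to_deposit": rng.normal(0.6, 0.05, n),
})
# placeholder response: a non-performing-financing ratio as a credit-risk proxy
df["credit_risk"] = (0.02 + 0.001 * df["inflation"] - 0.002 * df["gdp_growth"]
                     + rng.normal(0, 0.005, n))

X = sm.add_constant(df.drop(columns="credit_risk"))
model = sm.OLS(df["credit_risk"], X).fit()
print(model.summary())
```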
Procedia PDF Downloads 17
23922 Seismic Behaviour of Bi-Symmetric Buildings
Authors: Yogendra Singh, Mayur Pisode
Abstract:
Many times it is observed that in multi-storeyed buildings the dynamic properties in the two directions are similar due to which there may be a coupling between the two orthogonal modes of the building. This is particularly observed in bi-symmetric buildings (buildings with structural properties and periods approximately equal in the two directions). There is a swapping of vibrational energy between the modes in the two orthogonal directions. To avoid this coupling the draft revision of IS:1893 proposes a minimum separation of more than 15% between the frequencies of the fundamental modes in the two directions. This study explores the seismic behaviour of bi-symmetrical buildings under uniaxial and bi-axial ground motions. For this purpose, three different types of 8 storey buildings symmetric in plan are modelled. The first building has square columns, resulting in identical periods in the two directions. The second building, with rectangular columns, has a difference of 20% in periods in orthogonal directions, and the third building has half of the rectangular columns aligned in one direction and other half aligned in the other direction. The numerical analysis of the seismic response of these three buildings is performed by using a set of 22 ground motions from PEER NGA database and scaled as per FEMA P695 guidelines to represent the same level of intensity corresponding to the Design Basis Earthquake. The results are analyzed in terms of the displacement-time response of the buildings at roof level and corresponding maximum inter-storey drift ratios.Keywords: bi-symmetric buildings, design code, dynamic coupling, multi-storey buildings, seismic response
Procedia PDF Downloads 241
23921 Long-Term Structural Behavior of Resilient Materials for Reduction of Floor Impact Sound
Authors: Jung-Yoon Lee, Jongmun Kim, Hyo-Jun Chang, Jung-Min Kim
Abstract:
People’s tendency to live in apartment houses is increasing in densely populated countries. However, some residents of apartment houses are bothered by noise coming from the units above. In order to reduce noise pollution, communities are increasingly imposing bylaws that include limits on floor impact sound, minimum floor thicknesses, and floor soundproofing solutions. This research effort focused on the long-term deflection of resilient materials in the floor sound insulation systems of apartment houses. The experimental program consisted of testing nine floor sound insulation specimens subjected to sustained load for 45 days. Two main parameters were considered in the experimental investigation: three types of resilient materials and the magnitude of the load. The test results indicated that the structural behavior of the floor sound insulation systems under long-term load was quite different from that of the systems under short-term load. The loading period increased the deflection of the floor sound insulation systems, and the rate of increase of the long-term deflection of the systems with ethylene vinyl acetate was smaller than that of the systems with low-density ethylene polystyrene.
Keywords: resilient materials, floor sound insulation systems, long-time deflection, sustained load, noise pollution
Procedia PDF Downloads 268
23920 Non-Parametric Regression over Its Parametric Counterparts with Large Sample Size
Authors: Jude Opara, Esemokumo Perewarebo Akpos
Abstract:
This paper compares non-parametric linear regression with its parametric counterpart for a large sample size. A data set of anthropometric measurements of primary school pupils was used for the analysis; the study used 50 randomly selected pupils. The data set was subjected to a normality test using the Anderson-Darling technique, and it was discovered that the residuals of the commonly used least squares regression method for fitting an equation to a set of (x, y) data points are not normally distributed (i.e., they do not follow a Gaussian distribution). The algorithms for nonparametric Theil’s regression are stated in this paper, as well as those of its parametric OLS counterpart. The R statistical software was used for the analysis. The results showed that there exists a significant relationship between the response and the explanatory variable for both the parametric and the non-parametric regression. To assess the efficiency of one method over the other, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) were used, and it was discovered that the nonparametric regression performs better than its parametric counterpart due to its lower AIC and BIC values. The study recommends that future researchers examine the presence of outliers in the data set, expunge them if detected, and re-analyze to compare results.
Keywords: Theil’s regression, Bayesian information criterion, Akaike information criterion, OLS
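A minimal sketch comparing Theil's slope estimator with OLS on simulated data containing a few outliers; the paper compared the methods with AIC and BIC, while this sketch simply reports the fitted slopes and a robust residual summary on placeholder data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
x = rng.uniform(100, 150, 50)                 # e.g. height of pupils (cm), invented
y = 0.45 * x - 20 + rng.normal(0, 2, 50)      # e.g. weight (kg) with noise
y[:3] += 25                                   # a few gross outliers

slope, intercept, lo, hi = stats.theilslopes(y, x)   # Theil-Sen fit
ols = stats.linregress(x, y)                         # parametric OLS fit

res_theil = y - (intercept + slope * x)
res_ols = y - (ols.intercept + ols.slope * x)
print(f"true slope 0.45 | Theil slope {slope:.3f} | OLS slope {ols.slope:.3f}")
print(f"median |residual|: Theil {np.median(np.abs(res_theil)):.2f}, "
      f"OLS {np.median(np.abs(res_ols)):.2f}")
```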
Procedia PDF Downloads 305