Search results for: missing data estimation
23093 Characterization of Internet Exchange Points by Using Quantitative Data
Authors: Yamba Dabone, Tounwendyam Frédéric Ouedraogo, Pengwendé Justin Kouraogo, Oumarou Sie
Abstract:
Reliable data transport over the Internet is one of the goals of researchers in the field of computer science. Data such as videos and audio files are becoming increasingly large, and transporting them over the Internet is becoming difficult. It has therefore become important to establish a method of locally interconnecting autonomous systems (AS) with each other to facilitate traffic exchange. It is in this context that Internet Exchange Points (IXPs) are set up to carry local and even regional traffic; they are now the lifeblood of the Internet. It is therefore important to identify the factors that can characterize IXPs, and in particular the quantifiable characteristics that can help determine the quality of an IXP. Such characteristics may give ISPs a clearer view of the exchange node and may also convince other networks to connect to an IXP. To that end, we define a set of new IXP characteristics: the attraction rate (τ_attr), the peering rate (τ_peer), the target rate of an IXP (Obj_att), the number of IXP links (N_link), the resistance rate (τ_eff), and the attraction failure rate (τ_f).
Keywords: characteristic, autonomous system, internet service provider, internet exchange point, rate
Procedia PDF Downloads 99
23092 Statistic Regression and Open Data Approach for Identifying Economic Indicators That Influence e-Commerce
Authors: Apollinaire Barme, Simon Tamayo, Arthur Gaudron
Abstract:
This paper presents a statistical approach to identifying explanatory variables linearly related to e-commerce sales. The proposed methodology specifies a regression model in order to quantify the relationship between openly available data (economic and demographic) and national e-commerce sales. It consists of collecting data, preselecting input variables, performing regressions to choose variables and models, and testing and validating. The usefulness of the proposed approach is twofold: on the one hand, it identifies the variables that influence e-commerce sales with an accessible approach; on the other hand, it can be used to model future sales from the input variables. Results show that e-commerce is linearly dependent on 11 economic and demographic indicators.
Keywords: e-commerce, statistical modeling, regression, empirical research
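The regression workflow this abstract describes (collect data, preselect variables, fit, validate) can be sketched in a few lines of ordinary least squares; the indicator data below are synthetic stand-ins, not the open datasets the authors used.

```python
import numpy as np

def fit_linear_model(X, y):
    """Ordinary least squares fit; returns coefficients (intercept first) and R^2."""
    A = np.column_stack([np.ones(len(X)), X])   # design matrix with intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return coef, r2

# Synthetic "indicator" data standing in for open economic/demographic series
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 3.0 + 1.5 * X[:, 0] - 2.0 * X[:, 1]   # a known linear relation to recover
coef, r2 = fit_linear_model(X, y)
print(np.round(coef, 2), round(r2, 4))
```

On real data the same fit would be repeated while dropping low-significance indicators, which is the preselection step the abstract refers to.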
Procedia PDF Downloads 229
23091 A Pre-Assessment Questionnaire to Identify Healthcare Professionals’ Perception on Information Technology Implementation
Authors: Y. Atilgan Şengül
Abstract:
Health information technologies promise higher quality and safer care, and much more, for both patients and professionals. Despite their promise, they are costly to develop and difficult to implement. Moreover, user acceptance and usage determine the success of implemented information technology in healthcare. This study provides a model for understanding health professionals’ perception and expectation of health information technology. An extensive literature review was conducted to determine the main factors to be measured. A questionnaire was designed as a measurement model and submitted to the personnel of an in vitro fertilization clinic. The respondents’ degree of agreement on a five-point Likert scale was 72% for convenient access to data and 69.4% for the importance of data security. There was a significant difference in acceptance of electronic data storage for female respondents, and other significant differences between professions were also obtained.
Keywords: healthcare, health informatics, medical record system, questionnaire
Procedia PDF Downloads 175
23090 Estimation of the Seismic Response Modification Coefficient in the Superframe Structural System
Authors: Ali Reza Ghanbarnezhad Ghazvini, Seyyed Hamid Reza Mosayyebi
Abstract:
In recent years, an earthquake has occurred approximately every five years in certain regions of Iran. To mitigate the impact of these seismic events, it is crucial to identify and thoroughly assess the vulnerability of buildings and infrastructure, ensuring their safety through principled reinforcement. By adopting new methods of risk assessment, we can effectively reduce the potential risks associated with future earthquakes. In our research, we have observed that the response modification coefficient is 1.65 for the initial structure and 1.72 for the Superframe structure. This indicates that the Superframe structure can enhance the strength of the main structural members by approximately 10% through the utilization of super beams. Furthermore, based on the comparative analysis between the two structures conducted in this study, we have successfully designed a stronger structure with minimal change in the coefficient of behavior. This design also allows for greater energy dissipation during seismic events, further enhancing the structure's resilience to earthquakes. By comprehensively examining and reinforcing the vulnerability of buildings and infrastructure, along with implementing advanced risk assessment techniques, we can significantly reduce casualties and damage caused by earthquakes in Iran. The findings of this study offer valuable insights for civil engineering professionals in the field of structural engineering, aiding them in designing safer and more resilient structures.
Keywords: modal pushover analysis, response modification factor, high-strength concrete, concrete shear walls, high-rise building
Procedia PDF Downloads 148
23089 Validation of Electrical Field Effect on Electrostatic Desalter Modeling with Experimental Laboratory Data
Authors: Fatemeh Yazdanmehr, Iulian Nistor
Abstract:
The scope of the current study is the evaluation of the electric field effect on electrostatic desalting mathematical modeling against laboratory data. The study focused on developing a model for an existing desalting unit of an Iranian heavy oil field with a 75 MBPD production capacity. The high temperature of the inlet oil to the dehydration unit reduces oil recovery, so mathematical modeling of the desalter operating parameters is very significant. Operating data from the existing production unit were used to check the accuracy of the mathematical desalting plant model. The inlet oil temperature to the desalter was decreased from 110 to 80°C, and the desalter electrical field was increased from 0.75 to 2.5 kV/cm. The model results show that these changes meet the water-oil specification while increasing oil production and, consequently, annual income. In addition, changing the desalter operating conditions reduces the environmental footprint because of flare gas reduction. To verify the accuracy of the selected electrostatic desalter electrical field, laboratory data were used: a lab test was performed on a crude oil sample, yielding the dehydration efficiency in the presence of a demulsifier under an electrical field of 0.75 kV/cm at various temperatures. Comparing the lab experiments with the electrostatic desalter mathematical model shows an acceptable error of 1-3 percent, which confirms the validity of the changes to the desalter specification and operating conditions.
Keywords: desalter, electrical field, demulsification, mathematical modeling, water-oil separation
Procedia PDF Downloads 145
23088 Determination of Nutritional Value and Steroidal Saponin of Fenugreek Genotypes
Authors: Anita Singh, Richa Naula, Manoj Raghav
Abstract:
Nutrient-rich and high-yielding varieties of fenugreek can be developed by using genotypes that are naturally high in nutrients. Gene banks harbour only a scanty germplasm collection of Trigonella spp., with very little background information about its genetic diversity. The extent of genetic diversity in a specific breeding population depends upon the genotypes included in it. The present investigation aims at the estimation of macronutrients (phosphorus by spectrophotometer and potassium by flame photometer), micronutrients (iron, zinc, manganese, and copper, from seeds using an atomic absorption spectrophotometer), protein (by Rapid N Cube Analyser), and steroidal saponins in fenugreek genotypes. Twenty-eight genotypes of fenugreek, along with two standard checks, Pant Ragini and Pusa Early Bunching, were collected from different parts of India, and the nutrient content of each genotype was determined at the G. B. P. U. A. & T. Laboratory, Pantnagar. The highest potassium content was observed in PFG-35 (1207 mg/100g). PFG-37 and PFG-20 were the richest in phosphorus, iron, and manganese content among all the genotypes. The lowest zinc content was found in PFG-26 (1.19 mg/100g), while the maximum was found in PFG-28 (4.43 mg/100g). The highest copper content was found in PFG-26 (1.97 mg/100g), and PFG-39 had the highest protein content (29.60%). Significant differences in steroidal saponin were observed among the genotypes, with contents ranging from 0.38 g/100g to 1.31 g/100g; the maximum was found in PFG-36 (1.31 g/100g), followed by PFG-17 (1.28 g/100g). Therefore, genotypes that are rich in nutrient and oil content can be used for plant biofortification, dietary supplements, and herbal products.
Keywords: genotypes, macronutrients, micronutrient, protein, seeds
Procedia PDF Downloads 258
23087 Isolation Preserving Medical Conclusion Hold Structure via C5 Algorithm
Authors: Swati Kishor Zode, Rahul Ambekar
Abstract:
Data mining is the extraction of interesting patterns or knowledge from large amounts of data, with decisions made according to the relevant information extracted. Recently, with the explosive growth of the Internet and of data storage and processing techniques, privacy preservation has become one of the major concerns in data mining, and various techniques and methods have been produced for privacy-preserving data mining. In a Clinical Decision Support System, the diagnostic decision is made on the basis of patient data retrieved from remote servers via the Internet. In this paper, the fundamental idea is to improve the accuracy of a Decision Support System for multiple diseases while protecting patient information during communication between the clinician (client) side and the server side. A privacy-preserving protocol for a clinical decision support network is proposed so that patient information always remains encrypted during the diagnosis process while accuracy is maintained. To enhance the accuracy of the Decision Support System for various diseases, C5.0 classifiers are used, and to preserve privacy, a homomorphic encryption scheme, the Paillier cryptosystem, is employed.
Keywords: classification, homomorphic encryption, clinical decision support, privacy
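The Paillier cryptosystem named in the abstract is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, which is what lets a server combine encrypted patient values without ever decrypting them. A toy sketch with deliberately tiny, insecure primes (for illustration only, not the paper's protocol):

```python
import math
import random

# Toy Paillier keypair with small primes; real deployments use 2048-bit moduli
p, q = 47, 59
n = p * q
n2 = n * n
g = n + 1                       # standard simple choice of generator
lam = math.lcm(p - 1, q - 1)

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # modular inverse, Python 3.8+

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Additive homomorphism: the product of ciphertexts decrypts to the sum
c1, c2 = encrypt(12), encrypt(30)
print(decrypt((c1 * c2) % n2))  # 42
```

The same property is what a clinical server would exploit to aggregate encrypted feature values before a classifier step.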
Procedia PDF Downloads 332
23086 Framework to Quantify Customer Experience
Authors: Anant Sharma, Ashwin Rajan
Abstract:
Customer experience is measured today by defining a set of metrics and KPIs, setting up thresholds, and defining triggers across those thresholds. While this is an effective way of measuring against a Key Performance Indicator (referred to as KPI in the rest of the paper), this approach cannot capture the various nuances that make up the overall customer experience. Customers consume a product or service at various levels, which is not reflected in metrics like Customer Satisfaction or Net Promoter Score alone, but appears across other measurements like recurring revenue, frequency of service usage, e-learning, and depth of usage. Here we explore an alternative method of measuring customer experience by flipping the traditional view: rather than rolling customers up to a metric, we roll up metrics to hierarchies and then measure customer experience. This method allows any team to quantify customer experience across multiple touchpoints in a customer’s journey. We make use of various data sources which contain information for metrics like CSAT, NPS, renewals, and depth of service usage collected across a customer lifecycle. This data can be mined systematically to find linkages between different data points like geographies, business groups, products, and time. Additional views can be generated by blending synthetic contexts into the data to show trends and top/bottom reports. We have created a framework that allows us to measure customer experience using the above logic.
Keywords: analytics, customer experience, BI, business operations, KPIs, metrics
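The "roll metrics up to hierarchies" idea can be illustrated with a plain aggregation; the hierarchy levels and figures below are hypothetical stand-ins for the CSAT/NPS/renewal sources the abstract mentions, not the authors' framework.

```python
import pandas as pd

# Hypothetical per-customer metric records across touchpoints
records = pd.DataFrame({
    "geo":     ["EMEA", "EMEA", "APAC", "APAC"],
    "product": ["A", "B", "A", "A"],
    "nps":     [40, 55, 60, 70],
    "csat":    [0.72, 0.81, 0.77, 0.90],
})

# Roll metrics up to a geo/product hierarchy instead of rolling
# customers up to a single headline metric
rollup = records.groupby(["geo", "product"])[["nps", "csat"]].mean()
print(rollup)
```

Additional hierarchy levels (business group, time period) would simply extend the `groupby` key list.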
Procedia PDF Downloads 78
23085 Analysis of Noodle Production Process at Yan Hu Food Manufacturing: Basis for Production Improvement
Authors: Rhadinia Tayag-Relanes, Felina C. Young
Abstract:
This study was conducted to analyze the noodle production process at Yan Hu Food Manufacturing as a basis for production improvement. The study utilized the PDCA approach and record review in gathering data for calendar year 2019, from August to October, on the noodle products miki, canton, and misua. Causal-comparative research was used, which attempts to establish cause-effect relationships among the variables; descriptive statistics and correlation were both used to analyze the data gathered. The study found that miki, canton, and misua production have different cycle times, different production outputs, and different amounts of wastage in each set of the production process. The company has not yet established its allowable rejection/wastage rate; instead, this paper used a 1% wastage limit. The researcher recommended the following: machines used for each process of the noodle products must be consistently maintained and monitored; all production operators should be assessed by checking their performance statistically based on output and machine performance; a root cause analysis must be conducted to find solutions; and the recording system for the input and output of the noodle production process should be improved to eliminate poor recording of data.
Keywords: continuous improvement, process, operations, PDCA
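The 1% wastage limit the paper adopts amounts to a simple rate check per production run; the run figures below are invented for illustration, not the plant's actual 2019 data.

```python
def wastage_rate(output_units, wasted_units):
    """Share of wasted units relative to total units produced in a run."""
    total = output_units + wasted_units
    return wasted_units / total

WASTAGE_LIMIT = 0.01   # the 1% limit assumed in the paper

# Hypothetical (output, wastage) figures per product run
runs = {"miki": (9900, 80), "canton": (12000, 130), "misua": (7000, 60)}
for product, (out, waste) in runs.items():
    rate = wastage_rate(out, waste)
    verdict = "over" if rate > WASTAGE_LIMIT else "within"
    print(f"{product}: {rate:.3%} ({verdict} limit)")
```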
Procedia PDF Downloads 77
23084 Influencers of E-Learning Readiness among Palestinian Secondary School Teachers: An Explorative Study
Authors: Fuad A. A. Trayek, Tunku Badariah Tunku Ahmad, Mohamad Sahari Nordin, Mohammed AM Dwikat
Abstract:
This paper reports the results of an exploratory factor analysis procedure applied to e-learning readiness data obtained from a survey of four hundred and seventy-nine (N = 479) teachers from secondary schools in Nablus, Palestine. The data were drawn from a 23-item Likert questionnaire measuring e-learning readiness based on Chapnick's conception of the construct. Principal axis factoring (PAF) with Promax rotation extracted four distinct factors supporting four of Chapnick's e-learning readiness dimensions, namely technological readiness, psychological readiness, infrastructure readiness, and equipment readiness. Together these four dimensions explained 56% of the variance. These findings provide further support for the construct validity of the items and for the existence of these four factors measuring e-learning readiness.
Keywords: e-learning, e-learning readiness, technological readiness, psychological readiness, principal axis factoring
Procedia PDF Downloads 402
23083 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets
Authors: Kothuri Sriraman, Mattupalli Komal Teja
Abstract:
In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images, which may be irregular shapes, digits, or characters. Objects and internal objects are quite difficult to identify and extract when the structure of the image contains a bulk of clusters. Estimation results are easily obtained by identifying the sub-regional objects with the proposed SASK algorithm, which focuses on recognizing the number of internal objects in a given image in a shadow-free and error-free manner. Hard clustering and density clustering of the image rough set are used to recognize any differentiated internal objects. Finding the internal hull regions involves three steps: pre-processing, boundary extraction, and finally the hull detection system. Detecting sub-regional hulls can increase machine learning capability in character detection, and the approach can also be extended to hull recognition in irregularly shaped objects, such as black holes and their intensities in space exploration. Layered hulls are those having structured layers inside, which is useful in military services and traffic management, for example to identify the number of vehicles or persons. The proposed SASK algorithm is thus helpful for identifying such regions and can be useful in the subsequent decision process (to clear traffic, or to identify the number of opposing persons in war).
Keywords: chain code, hull regions, Hough transform, hull recognition, layered outline extraction, SASK algorithm
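The SASK algorithm itself is not specified in the abstract, but its core sub-task (counting the internal objects of an image) reduces to a connected-component count. A generic flood-fill sketch on a binary grid, not the authors' method:

```python
from collections import deque

def count_regions(grid):
    """Count 4-connected foreground regions (value 1) in a binary grid."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                regions += 1                    # new region found
                queue = deque([(r, c)])         # flood-fill the whole region
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return regions

image = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 0, 0],
]
print(count_regions(image))  # 3
```

A binarised handwritten character image would be fed in the same way after the pre-processing and boundary-extraction steps the abstract lists.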
Procedia PDF Downloads 352
23082 Formulation and Anticancer Evaluation of Beta-Sitosterol in Henna Methanolic Extract Embedded in Controlled Release Nanocomposite
Authors: Sanjukta Badhai, Durga Barik, Bairagi C. Mallick
Abstract:
In the present study, beta-sitosterol in Lawsonia methanolic leaf extract embedded in a controlled-release nanocomposite was prepared and evaluated for in vivo anticancer efficacy in dimethylhydrazine (DMH)-induced colon cancer. Colon cancer was induced by s.c. injection of DMH (20 mg/kg b.wt.) for 15 weeks. The animals were divided into five groups: control, DMH alone, DMH with beta-sitosterol nanocomposite (50 mg/kg), DMH with beta-sitosterol nanocomposite (100 mg/kg), and DMH with standard silymarin (100 mg/kg); treatment was carried out for 15 weeks. At the end of the study period, blood was withdrawn, and serum was separated for haematological and biochemical analysis and tumor markers. The colonic tissue was then removed for the estimation of antioxidants and histopathological analysis. The results show that DMH intoxication elicits altered haematological parameters (RBC, WBC, and Hb), elevated lipid peroxidation, decreased antioxidant levels (SOD, CAT, GPx, GST, and GSH), elevated lipid profiles (cholesterol and triglycerides) and tumor markers (CEA and AFP), and altered colonic tissue histology. Treatment with the beta-sitosterol nanocomposites significantly restored the altered biochemical parameters in DMH-induced colon cancer, mediated by its anticancer efficacy, with the 100 mg/kg dose showing marked efficacy.
Keywords: nanocomposites, herbal formulation, henna, beta sitosterol, colon cancer, dimethyl hydrazine, antioxidant, lipid peroxidation
Procedia PDF Downloads 166
23081 Design of SAE J2716 Single Edge Nibble Transmission Digital Sensor Interface for Automotive Applications
Authors: Jongbae Lee, Seongsoo Lee
Abstract:
Modern sensors often embed a small digital controller for sensor control, value calibration, and signal processing. These sensors require digital data communication with host microprocessors, but conventional digital communication protocols are too heavy for price reduction. The SAE J2716 SENT (single edge nibble transmission) protocol transmits direct digital waveforms instead of complicated analog modulated signals. In this paper, a SENT interface is designed in Verilog HDL (hardware description language) and implemented on an FPGA (field-programmable gate array) evaluation board. The designed SENT interface consists of a frame encoder/decoder, a configuration register, a tick period generator, a CRC (cyclic redundancy code) generator/checker, and TX/RX (transmission/reception) buffers. The frame encoder/decoder is implemented as a finite state machine and controls the whole SENT interface. The configuration register contains various parameters such as operation mode, tick length, CRC option, pause pulse option, and number of data nibbles. The tick period generator generates tick signals from the input clock, the CRC generator/checker generates or checks the CRC in the SENT data frame, and the TX/RX buffers store transmitted/received data. The designed SENT interface can send or receive digital data at 25-65 kbps with a 3 us tick. Synthesized in a 0.18 um fabrication technology, it occupies about 2,500 gates.
Keywords: digital sensor interface, SAE J2716, SENT, Verilog HDL
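The CRC block of a SENT interface computes a 4-bit checksum over the data nibbles. The sketch below is a software model, not the paper's Verilog; it assumes the polynomial x^4 + x^3 + x^2 + 1 and seed 0b0101 commonly cited for SAE J2716, with the trailing zero nibble of the recommended method.

```python
def sent_crc4(nibbles):
    """Bitwise 4-bit CRC over a sequence of data nibbles (MSB first).

    Assumes SENT's commonly cited polynomial x^4 + x^3 + x^2 + 1 and
    seed 0b0101; a zero nibble is appended per the recommended method.
    """
    POLY = 0b11101          # x^4 + x^3 + x^2 + 1, including the top bit
    crc = 0b0101            # seed value
    for nibble in list(nibbles) + [0]:
        for bit in (3, 2, 1, 0):            # shift in each data bit
            crc = (crc << 1) | ((nibble >> bit) & 1)
            if crc & 0b10000:               # top bit set: reduce by POLY
                crc ^= POLY
    return crc & 0xF

frame = [3, 10, 5, 0, 12, 15]   # hypothetical six data nibbles
print(sent_crc4(frame))
```

In hardware, the equivalent logic is usually a small per-nibble lookup table rather than a bit loop.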
Procedia PDF Downloads 305
23080 Teaching Translation during Covid-19 Outbreak: Challenges and Discoveries
Authors: Rafat Alwazna
Abstract:
Translation teaching is a particular activity that includes the training of translators and interpreters either inside or outside institutionalised settings, such as universities. It can also serve as a means of teaching other fields, such as foreign languages. Translation teaching began in the twentieth century. Teachers of translation hold the responsibility of educating students, developing their translation competence, and training them to be professional translators. The activity of translation teaching involves various tasks, including curriculum design, course delivery, and material writing, as well as application and implementation. The present paper addresses translation teaching during the COVID-19 outbreak, seeking to find out the challenges encountered by translation teachers in online translation teaching and the discoveries/solutions arrived at to resolve them. The paper makes use of a comprehensive questionnaire containing closed-ended and open-ended questions to elicit both quantitative and qualitative data from about sixty translation teachers who taught translation at BA and MA levels during the COVID-19 outbreak. The data show that about 40% of the participants evaluate their online translation teaching experience during the COVID-19 outbreak as enjoyable and exhilarating; by contrast, no participant evaluated the experience as not good or terrible. The data also show that about 23.33% of the participants evaluate their online translation teaching experience as very good, the same percentage evaluate it as good to some extent, and around 13.33% evaluate it as good. Finally, the data demonstrate that the majority of the participants encountered obstacles in online translation teaching and concurrently proposed solutions to resolve them.
Keywords: online translation teaching, electronic learning platform, COVID-19 outbreak, challenges, solutions
Procedia PDF Downloads 226
23079 Load Forecasting Using Neural Network Integrated with Economic Dispatch Problem
Authors: Mariyam Arif, Ye Liu, Israr Ul Haq, Ahsan Ashfaq
Abstract:
The high cost of fossil fuels and the intensifying installation of alternative energy generation sources are major challenges in power systems, making accurate load forecasting an important and difficult task for optimal energy planning and management on both the distribution and generation sides. There are many techniques to forecast load, but each comes with its own limitations and requires data to accurately predict the forecast load. The Artificial Neural Network (ANN) is one such technique to efficiently forecast the load. A comparison between two different ranges of input datasets has been applied to a dynamic ANN technique using the MATLAB Neural Network Toolbox. It has been observed that the selection of input data for training a network has significant effects on the forecasted results: day-wise input data forecasted the load accurately compared to year-wise input data. The forecasted load is then distributed among six generators by using linear programming to get the optimal point of generation. The algorithm is then verified by comparing the results of each generator with its respective generation limits.
Keywords: artificial neural networks, demand-side management, economic dispatch, linear programming, power generation dispatch
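The final dispatch step (splitting a forecasted load among generators by linear programming) can be sketched with SciPy; the three-unit costs and limits below are hypothetical, not the six-generator system of the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical linear fuel costs ($/MWh) and output limits (MW) for three units
cost = [20.0, 35.0, 50.0]
bounds = [(50, 200), (30, 150), (20, 100)]
demand = 300.0   # the forecasted load to be dispatched

# Minimise total generation cost subject to generation meeting demand exactly
res = linprog(c=cost, A_eq=[[1, 1, 1]], b_eq=[demand],
              bounds=bounds, method="highs")
print(res.x)  # cheapest unit loaded first, most expensive held at its minimum
```

With linear costs the optimum is the expected merit-order result: the cheapest unit runs at its maximum, the most expensive at its minimum, and the middle unit covers the remainder.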
Procedia PDF Downloads 191
23078 GNSS-Aided Photogrammetry for Digital Mapping
Authors: Muhammad Usman Akram
Abstract:
This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site to be used in future planning and development (P&D), or for further examination, exploration, research, and inspection. Surveying and mapping in hard-to-access and hazardous areas are very difficult using traditional techniques and methodologies, which are also time consuming and labor intensive and offer less precision and limited data. In comparison, the advanced technique saves manpower and provides more precise output with a wide variety of data sets. In this experiment, the aerial photogrammetry technique is used: a UAV flies over an area, captures geocoded images, and produces a three-dimensional (3D) model. The UAV operates on a user-specified path or area with various parameters, including flight altitude, ground sampling distance (GSD), image overlap, and camera angle. For ground control, a network of points on the ground is observed as ground control points (GCPs) using a Differential Global Positioning System (DGPS) in PPK or RTK mode. The raw data collected by the UAV and DGPS are then processed in digital image processing programs and computer-aided design software, from which we obtain, as output, a dense point cloud, a digital elevation model (DEM), and an orthophoto. The imagery is converted into geospatial data by digitizing over the orthophoto, and the DEM is further converted into a digital terrain model (DTM) for contour generation or a digital surface. As a result, we obtain a digital map of the area to be surveyed. In conclusion, we compared the processed data with exact measurements taken on site; the error is accepted if it does not breach the survey accuracy limits set by the concerned institutions.
Keywords: photogrammetry, post processing kinematics, real time kinematics, manual data inquiry
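The ground sampling distance (GSD) parameter mentioned above follows a standard pinhole-camera relation, GSD = sensor width × altitude / (focal length × image width). The camera figures below are hypothetical, not those of the survey described.

```python
def ground_sampling_distance(sensor_width_mm, focal_length_mm,
                             altitude_m, image_width_px):
    """GSD in cm/pixel: the ground footprint of one pixel for a nadir camera."""
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

# Hypothetical UAV camera: 13.2 mm sensor, 8.8 mm lens, 4000 px wide, 100 m AGL
gsd = ground_sampling_distance(13.2, 8.8, 100.0, 4000)
print(round(gsd, 2))  # 3.75 cm/pixel
```

In mission planning the relation is typically inverted: the flight altitude is chosen so the resulting GSD meets the map's accuracy requirement.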
Procedia PDF Downloads 35
23077 Digital Reconstruction of the Cultural Landscape: Chengde Summer Resort as a Case Study
Authors: Jingsen Lian, Steffen Nijhuis, Gregory Bracken, Kai Lan
Abstract:
This study explores the digital reconstruction of the Chengde Mountain Resort (CMR), a UNESCO World Heritage Site recognized for its cultural landscape significance. Using mixed methods, the research combines spatial, textual, and graphical data to reconstruct the historical evolution of CMR's landscape across four phases from 1704 to the present. Data acquisition includes 3D point clouds, historical maps, traditional paintings, poetry, land-use records, academic papers, engineering drawings, and old photographs. Interdisciplinary techniques such as georectification, 3D modeling, and textual analysis were employed to integrate these diverse datasets into a cohesive Web-GIS platform. The reconstructed data illustrates dynamic landscape changes, reflecting shifting cultural and ecological priorities. The Web-GIS platform facilitates data visualization, querying, and customization, serving multiple stakeholders, including researchers, government planners, and local communities. This study underscores the value of digital tools in cultural heritage preservation, offering a model for adaptive and participatory management of historical sites while promoting open access and stakeholder engagement.
Keywords: landscape mapping, cultural landscape, heritage, case study, mixed methods
Procedia PDF Downloads 6
23076 Problems and Challenges in Social Economic Research after COVID-19: The Case Study of Province Sindh
Authors: Waleed Baloch
Abstract:
This paper investigates the problems and challenges in socioeconomic research through a case study of the province of Sindh after the COVID-19 pandemic; the pandemic has significantly impacted various aspects of society and the economy, necessitating a thorough examination of the resulting implications. The study also investigates potential strategies and solutions to mitigate these challenges, ensuring the continuation of robust social and economic research in the region. Through an in-depth analysis of data and interviews with key stakeholders, the study reveals several significant findings. Firstly, researchers encountered difficulties in accessing primary data due to disruptions caused by the pandemic, leading to limitations in the scope and accuracy of their studies. Secondly, the study highlights the challenges faced in conducting fieldwork, such as restrictions on travel and face-to-face interactions, which impacted the ability to gather reliable data. Lastly, the research identifies the need for innovative research methodologies and digital tools to adapt to the new research landscape brought about by the pandemic. The study concludes by proposing recommendations to address these challenges, including utilizing remote data collection methods, leveraging digital technologies for data analysis, and establishing collaborations among researchers to overcome resource constraints. By addressing these issues, researchers in the socioeconomic field can effectively navigate the post-COVID-19 research landscape, facilitating a deeper understanding of the socioeconomic impacts and enabling evidence-based policy interventions.
Keywords: social economic, sociology, developing economies, COVID-19
Procedia PDF Downloads 65
23075 Smart Meter Incorporating UWB Technology
Authors: T. A. Khan, A. B. Khan, M. Babar, T. A. Taj, Imran Ijaz Imran
Abstract:
The smart meter is a key element in the evolving concept of the Smart Grid, playing an important role in the interaction between the consumer and the supplier. In general, the smart meter is an intelligent digital energy meter that measures the consumption of electrical energy and provides additional services compared to conventional energy meters. One of the important elements that makes a meter smart and different is its communication module. Smart meters usually have two-way, real-time communication between the consumer and the supplier, through which they transfer data and information. In this paper, Ultra Wide Band (UWB) is recommended as the communication platform because of its high data rate, and its physical layer, which could easily be incorporated in existing smart meters, is presented. The physical layer is simulated in MATLAB Simulink, and the results are provided.
Keywords: Ultra Wide Band (UWB), smart meter, MATLAB, data transfer
Procedia PDF Downloads 520
23074 Qualitative Approaches to Mindfulness Meditation Practices in Higher Education
Authors: Patrizia Barroero, Saliha Yagoubi
Abstract:
Mindfulness meditation practices in the context of higher education are becoming more and more common. Some of the reported benefits of meditation interventions and workshops include improved focus, general well-being, diminished stress, and even increased resilience and grit. A series of workshops free to students, faculty, and staff was offered twice a week over two semesters at Hudson County Community College, New Jersey. The results of an exploratory study based on participants’ subjective reactions to these workshops will be presented. A qualitative approach was used to collect and analyze the data, and a hermeneutic phenomenological perspective served as a framework for the research design, data collection, and analysis. The data collected include three recorded videos of semi-structured interviews and several written surveys submitted by volunteer participants.
Keywords: mindfulness meditation practices, stress reduction, resilience, grit, higher education success, qualitative research
Procedia PDF Downloads 78
23073 Integrated Nested Laplace Approximations for Quantile Regression
Authors: Kajingulu Malandala, Ranganai Edmore
Abstract:
The asymmetric Laplace distribution (ALD) is commonly used as the likelihood function in Bayesian quantile regression, and it offers different families of likelihood methods for quantile regression. Notwithstanding its popularity and practicality, the ALD is not smooth, which makes its likelihood difficult to maximize. Furthermore, Bayesian inference is time consuming, and the choice of likelihood may mislead the inference, as Bayes' theorem does not automatically establish the posterior inference. The ALD also fails to account for greater skewness and kurtosis. This paper develops a new quantile regression approach for count data based on the inverse of the cumulative distribution function of the Poisson, binomial, and Delaporte distributions, using integrated nested Laplace approximations (INLA). Our results validate the benefit of using integrated nested Laplace approximations and support the approach for count data.
Keywords: quantile regression, Delaporte distribution, count data, integrated nested Laplace approximation
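The inverse-CDF building block mentioned above can be illustrated for the Poisson case. This is a minimal, library-free sketch of a discrete quantile function, not the paper's INLA-based estimator:

```python
import math

def poisson_quantile(q, lam):
    """Smallest integer k such that the Poisson(lam) CDF at k reaches q."""
    k = 0
    p = math.exp(-lam)   # P(X = 0)
    cdf = p
    while cdf < q:
        k += 1
        p *= lam / k     # recurrence: P(X = k) = P(X = k-1) * lam / k
        cdf += p
    return k
```

For example, the median of a Poisson(4) variable comes out as 4, since the CDF first reaches 0.5 at k = 4.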
Procedia PDF Downloads 168
23072 Assessing Moisture Adequacy over Semi-arid and Arid Indian Agricultural Farms using High-Resolution Thermography
Authors: Devansh Desai, Rahul Nigam
Abstract:
Crop water stress (W) at a given growth stage starts to set in as moisture availability (M) to roots falls below 75% of the maximum. It has been found that the ratio of crop evapotranspiration (ET) to reference evapotranspiration (ET0) is an indicator of moisture adequacy and is strongly correlated with ‘M’ and ‘W’. The spatial variability of ET0 over an agricultural farm of 1-5 ha is generally less than that of ET, because the latter depends on both surface and atmospheric conditions, while the former depends only on atmospheric conditions. Solutions from surface energy balance (SEB) and thermal infrared (TIR) remote sensing are now known to estimate the latent heat flux of ET. In the present study, ET and the moisture adequacy index (MAI) (= ET/ET0) have been estimated over two contrasting western Indian agricultural farms: a rice-wheat system in a semi-arid climate and an arid grassland system limited by moisture availability. High-resolution multi-band TIR observations at 65 m from the ECOSTRESS (ECOsystem Spaceborne Thermal Radiometer Experiment on Space Station) instrument on board the International Space Station (ISS) were used in an analytical SEB model, STIC (Surface Temperature Initiated Closure), to estimate ET and MAI. The ancillary variables used in the ET modeling and MAI estimation were land surface albedo and NDVI from close-by LANDSAT data at 30 m spatial resolution, the ET0 product at 4 km spatial resolution from INSAT 3D, and meteorological forcing variables (air temperature and relative humidity) from a short-range NWP weather forecast. Farm-scale ET estimates at 65 m spatial resolution showed a low RMSE of 16.6% to 17.5% with R2 > 0.8 across 18 datasets, compared to reported errors (25-30%) of coarser-scale ET at 1 to 8 km spatial resolution against in situ measurements from eddy covariance systems. The MAI showed lower (<0.25) and higher (>0.5) magnitudes in the two contrasting agricultural farms.
The study showed the potential need for high-resolution, high-repeat spaceborne multi-band TIR payloads, along with an optical payload, in estimating farm-scale ET and MAI for consumptive water use and water stress. A set of future high-resolution multi-band TIR sensors is planned on board the Indo-French TRISHNA, ESA's LSTM, and NASA's SBG space-borne missions to address sustainable irrigation water management at farm scale and improve crop water productivity. These will provide precise and fundamental surface energy balance variables such as LST (Land Surface Temperature), surface emissivity, albedo, and NDVI. Synchronization among these missions is needed in terms of observations, algorithms, product definitions, calibration-validation experiments, and downstream applications to maximize the potential benefits.
Keywords: thermal remote sensing, land surface temperature, crop water stress, evapotranspiration
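The MAI used above is a simple per-pixel ratio. A minimal sketch (with made-up ET values, not the study's ECOSTRESS retrievals):

```python
import numpy as np

def moisture_adequacy_index(et, et0):
    """MAI = ET / ET0; values near 1 indicate adequate moisture to the crop."""
    return np.asarray(et, float) / np.asarray(et0, float)

# Illustrative per-pixel values in mm/day (assumed, not from the study)
et  = np.array([1.0, 3.2, 4.5])
et0 = np.array([5.0, 5.0, 5.0])
mai = moisture_adequacy_index(et, et0)
low_mai = mai < 0.25    # the low-MAI regime reported for the arid grassland site
```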
Procedia PDF Downloads 73
23071 Measuring Flood Risk concerning with the Flood Protection Embankment in Big Flooding Events of Dhaka Metropolitan Zone
Authors: Marju Ben Sayed, Shigeko Haruyama
Abstract:
Among all kinds of natural disasters, floods are a common feature of the rapidly urbanizing Dhaka city. In this research, the flood risk of the Dhaka metropolitan area has been assessed using an integrated approach of GIS, remote sensing, and socio-economic data. The purpose of the study is to measure the flood risk associated with the flood protection embankment in big flooding events (1988, 1998, and 2004) and with the urbanization of the Dhaka metropolitan zone. In this research, we divided Dhaka city into two parts: East Dhaka (outside the flood protection embankment) and West Dhaka (inside the flood protection embankment). Using statistical data, we explored the socio-economic status of the study area's population by comparing population density, land price, and income level. We drew cross-section profiles of the flood protection embankment at three different points to assess the flood risk in the study area, especially in the big flooding years (1988, 1998, and 2004). According to the physical condition of the study area, the land use/land cover map has been classified into five classes. The flood risk has been evaluated by comparing each land cover unit with historical weather station data and the socio-economic data. Moreover, we compared DEM data with each land cover unit to find out its relationship with flooding. It is expected that this study could contribute to effective flood forecasting, relief, and emergency management for future flood events in Dhaka city.
Keywords: land use, land cover change, socio-economic, Dhaka city, GIS, flood
Procedia PDF Downloads 299
23070 Iterative Method for Lung Tumor Localization in 4D CT
Authors: Sarah K. Hagi, Majdi Alnowaimi
Abstract:
In the last decade, there have been immense advancements in medical imaging modalities. These modalities can scan the whole lung volume in high-resolution images within a short time. With this performance, physicians can clearly identify the complicated anatomical and pathological structures of the lung. These advancements therefore create large opportunities to advance all available types of lung cancer treatment and will increase the survival rate. However, lung cancer is still one of the major causes of death, accounting for around 19% of all cancer patients. Several factors may affect the survival rate. One serious effect is the breathing process, which can affect the accuracy of diagnosis and the lung tumor treatment plan. We have therefore developed a semi-automated algorithm to localize the 3D lung tumor positions across all respiratory phases during respiratory motion. The algorithm can be divided into two stages. First, the lung tumor is segmented in the first phase of the 4D computed tomography (CT) scan, using an active contours method. Then, the tumor's 3D position is localized across all subsequent phases using a 12-degree-of-freedom affine transformation. Two data sets were used in this study: a computer-simulated 4D CT using the extended cardiac-torso (XCAT) phantom, and clinical 4D CT data sets. The error is reported as the root mean square error (RMSE); the average error across the data sets is 0.94 mm ± 0.36. Finally, an evaluation and quantitative comparison of the results with a state-of-the-art registration algorithm was introduced. The results obtained from the proposed localization algorithm show promise for localizing a lung tumor in 4D CT data.
Keywords: automated algorithm, computed tomography, lung tumor, tumor localization
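The two quantitative ingredients of the abstract, a 12-degree-of-freedom affine map (the 9 entries of the matrix plus a 3-vector translation) and the RMSE of the localisation error, can be sketched as follows; this illustrates the definitions only, not the authors' implementation:

```python
import numpy as np

def apply_affine(points, A, t):
    """12-DOF affine transform: x' = A x + t (9 matrix entries + 3 translations)."""
    return np.asarray(points, float) @ np.asarray(A, float).T + np.asarray(t, float)

def rmse_3d(pred, truth):
    """Root mean square of per-phase Euclidean localisation errors."""
    d = np.linalg.norm(np.asarray(pred, float) - np.asarray(truth, float), axis=1)
    return float(np.sqrt(np.mean(d**2)))
```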
Procedia PDF Downloads 607
23069 Incorporating Anomaly Detection in a Digital Twin Scenario Using Symbolic Regression
Authors: Manuel Alves, Angelica Reis, Armindo Lobo, Valdemar Leiras
Abstract:
In Industry 4.0, it is common to have a lot of sensor data. In this deluge of data, hints of possible problems are difficult to spot. The digital twin concept aims to help answer this problem, but it is mainly used as a monitoring tool to handle the visualisation of data. Failure detection is of paramount importance in any industry, and it consumes a lot of resources; any improvement in this regard is of tangible value to the organisation. The aim of this paper is to add the ability to forecast test failures, curtailing detection times. To achieve this, several anomaly detection algorithms were compared with a symbolic regression approach: Isolation Forest, One-Class SVM, and an auto-encoder were explored. For symbolic regression, the PySR library was used. The first results show that this approach is valid and can be added to the tools available in this context as a low-resource anomaly detection method, since, after training, the only requirement is the evaluation of a polynomial, a useful feature in the digital twin context.
Keywords: anomaly detection, digital twin, industry 4.0, symbolic regression
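The "only a polynomial after training" property can be illustrated with a simplified stand-in: instead of a PySR-discovered symbolic expression, a least-squares polynomial is fitted to a sensor trace and large residuals are flagged. The data, the injected fault, and the threshold rule are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)
signal = 0.5 * t**2 - t + rng.normal(0.0, 0.1, t.size)
signal[120] += 5.0                            # injected fault

coeffs = np.polyfit(t, signal, deg=2)         # "training": fit the model once
residual = np.abs(signal - np.polyval(coeffs, t))
threshold = residual.mean() + 4.0 * residual.std()
anomalies = np.flatnonzero(residual > threshold)
```

At inference time only one polynomial evaluation (`np.polyval`) per sample is needed, which is what makes this family of detectors cheap enough to embed in a digital twin.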
Procedia PDF Downloads 125
23068 Re-Constructing the Research Design: Dealing with Problems and Re-Establishing the Method in User-Centered Research
Authors: Kerem Rızvanoğlu, Serhat Güney, Emre Kızılkaya, Betül Aydoğan, Ayşegül Boyalı, Onurcan Güden
Abstract:
This study addresses the re-construction and implementation of the methodological framework developed to evaluate how locative media applications accompany the urban experiences of international students coming to Istanbul on exchange programs in 2022. The research design was built on a three-stage model. In the first stage, the research team conducted a qualitative questionnaire to gain exploratory data. These data were then used to form three persona groups representing the sample by applying cluster analysis. In the second phase, a semi-structured digital diary study was carried out on a gamified task list with a sample selected from the persona groups. This stage proved the most difficult for obtaining valid data from the participant group. The research team re-evaluated the design of this second phase to reach participants who would perform the tasks given by the research team while sharing their momentary city experiences, to ensure a daily data flow for two weeks, and to increase the quality of the data obtained. The final stage, which elaborates on the findings, is the “Walk & Talk,” completed with face-to-face, in-depth interviews. It has been seen that the multiple methods used in the research process contribute to the depth and data diversity of research conducted on urban experience and locative technologies. In addition, by adapting the research design to the experiences of the users included in the sample, the differences and similarities between the initial research design and the research as applied are shown.
Keywords: digital diary study, gamification, multi-model research, persona analysis, research design for urban experience, user-centered research, “Walk & Talk”
Procedia PDF Downloads 173
23067 Heterogeneous-Resolution and Multi-Source Terrain Builder for CesiumJS WebGL Virtual Globe
Authors: Umberto Di Staso, Marco Soave, Alessio Giori, Federico Prandi, Raffaele De Amicis
Abstract:
The increasing availability of information about earth surface elevation (Digital Elevation Models, DEM) generated from different sources (remote sensing, aerial images, Lidar) poses the question of how to integrate this huge amount of data and make it available to the widest possible audience. In order to exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to the high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the earth. Hence the need to merge the large-scale height maps that are typically available for free at the worldwide level with very specific, highly resolved datasets. On the other hand, the third dimension improves the user experience and the quality of data representation, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environment monitoring, etc. Open-source 3D virtual globes, which are a trending topic in Geovisual Analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. Typically, 3D virtual globes do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called “Terrain Builder”. This tool is able to merge heterogeneous-resolution datasets and to provide a multi-resolution worldwide terrain service fully compatible with CesiumJS and therefore accessible via the web using a traditional browser without any additional plug-in.
Keywords: Terrain Builder, WebGL, Virtual Globe, CesiumJS, Tiled Map Service, TMS, Height-Map, Regular Grid, Geovisual Analytics, DTM
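A Tiled Map Service (TMS) addresses tiles by zoom level, column, and row. As a sketch of the tiling arithmetic such a terrain builder relies on, shown here for the common Web-Mercator scheme (CesiumJS terrain can also use a geographic tiling scheme; this is an illustration, not the paper's code):

```python
import math

def lonlat_to_tms(lon, lat, zoom):
    """Map a WGS84 lon/lat to a TMS tile index in the Web-Mercator scheme."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_r = math.radians(lat)
    y_xyz = int((1.0 - math.log(math.tan(lat_r) + 1.0 / math.cos(lat_r)) / math.pi) / 2.0 * n)
    return x, n - 1 - y_xyz   # TMS counts rows from the south; XYZ from the north
```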
Procedia PDF Downloads 429
23066 Pantograph-Catenary Contact Force: Features Evaluation for Catenary Diagnostics
Authors: Mehdi Brahimi, Kamal Medjaher, Noureddine Zerhouni, Mohammed Leouatni
Abstract:
Prognostics and Health Management (PHM) is a systems engineering discipline that provides solutions and models for the implementation of predictive maintenance. The approach is based on extracting useful information from monitoring data to assess the “health” state of an industrial equipment or asset. In this paper, we examine multiple features extracted from the pantograph-catenary contact force in order to select the most relevant ones for a diagnostics function. The feature extraction methodology is based on simulation data generated by a pantograph-catenary simulation software called INPAC, as well as on measurement data. The feature extraction method is based on both statistical and signal processing analyses, and the feature selection method is based on statistical criteria.
Keywords: catenary/pantograph interaction, diagnostics, Prognostics and Health Management (PHM), quality of current collection
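The abstract does not list the actual feature set, so as a sketch of the kind of statistical features commonly extracted from a force signal in such PHM pipelines (the feature choices here are illustrative assumptions):

```python
import numpy as np

def contact_force_features(force):
    """A few generic statistical features of a force signal (illustrative set)."""
    f = np.asarray(force, float)
    rms = float(np.sqrt(np.mean(f**2)))
    return {
        "mean": float(f.mean()),
        "std": float(f.std()),
        "rms": rms,
        "peak_to_peak": float(f.max() - f.min()),
        "crest_factor": float(np.abs(f).max() / rms),
    }
```

Each feature becomes one candidate input to the statistical selection step described above.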
Procedia PDF Downloads 292
23065 Ameliorative Effect of Martynia annua Linn. on Collagen-Induced Arthritis via Modulating Cytokines and Oxidative Stress in Mice
Authors: Alok Pal Jain, Santram Lodhi
Abstract:
Martynia annua Linn. (Martyniaceae) is traditionally used for inflammation and applied locally to the tuberculous glands of camels' necks. The leaves are used topically on bites of venomous insects and wounds of domestic animals. Chemical examination of Martynia annua leaves revealed the presence of glycosides, tannins, proteins, phenols, and flavonoids. The present study aimed to evaluate the anti-arthritic activity of a methanolic extract of Martynia annua leaves. The methanolic extract was tested using an in vivo collagen-induced arthritis mouse model to investigate its anti-rheumatoid arthritis activity. In addition, the antioxidant effect of the methanolic extract was determined by estimating antioxidant levels in joint tissues. The severity of arthritis was assessed by arthritis score and edema. Levels of the cytokines TNF-α and IL-6 in the joint tissue homogenate were measured using ELISA. A high dose (250 mg/kg) of the methanolic extract significantly reduced the degree of inflammation in mice as compared with the reference drug. Antioxidant and malondialdehyde (MDA) levels in the joint tissue homogenate were found to be significantly (p < 0.05) higher. The methanolic extract at a dose of 250 mg/kg modulated cytokine production and suppressed oxidative stress in the mice with collagen-induced arthritis. This study suggests that Martynia annua might be an alternative herbal medicine for the management of rheumatoid arthritis.
Keywords: Martynia annua, collagen, rheumatoid arthritis, antioxidants
Procedia PDF Downloads 298
23064 GRABTAXI: A Taxi Revolution in Thailand
Authors: Danuvasin Charoen
Abstract:
The study investigates the business process and business model of GRABTAXI. The paper also discusses how the company implemented strategies to gain competitive advantages. The data are derived from the analysis of secondary data and in-depth interviews with staff, taxi drivers, and key customers. The findings indicate that the company's competitive advantages come from being the first mover, emphasising ease of use and the tangible benefits of the application, and using a network-effect strategy.
Keywords: taxi, mobile application, innovative business model, Thailand
Procedia PDF Downloads 301