Search results for: continuous data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26680

26170 Effect of High-Intensity Interval Training and Moderate-Intensity Continuous Training on Cardiovascular Endurance in Young, Healthy Females

Authors: Sidra Majeed, Irum Ali, Aroosa Ishfaq, Munazzah Parveen

Abstract:

Objectives: To compare the effects of high-intensity interval training (HIIT) versus moderate-intensity continuous training (MICT) on cardiovascular endurance in young, healthy females. Method: 30 young, healthy females were recruited and randomly assigned to two training groups, HIIT and MICT, each with a sample size of n=15. Three parameters were tested: VO2max, resting heart rate (RHR), and rate of perceived exertion (RPE). Each group was tested at three different times: at baseline, after two weeks, and after four weeks. For the first two weeks, the HIIT group performed at 70% HRR, and for the third and fourth weeks at 75% HRR, for two minutes, followed by an active rest interval at 30% HRR for two minutes (1:1), with warm-up and cool-down periods (2 minutes each) on the treadmill. For the first two weeks, the MICT group performed at 40% HRR, and for the third and fourth weeks at 50% HRR, for fifteen minutes continuously on the treadmill, including warm-up and cool-down periods (2 minutes each). Result: The final assessment of the HIIT and MICT groups showed p-values of p=.000 for VO2max, p=.323 for RHR, and p=.085 for RPE, indicating a significant between-group difference in VO2max, while the differences in RHR and RPE did not reach statistical significance. Conclusion: This study showed improvements in both groups, but the improvement in VO2max was greater in the HIIT group, suggesting that HIIT is more effective than MICT in improving cardiovascular endurance.
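
The training intensities above are prescribed as percentages of heart-rate reserve (HRR). A minimal sketch of how such targets are commonly derived with the Karvonen formula; the abstract does not state the exact calculation used, and the age-based HRmax estimate is an assumption:

```python
def karvonen_target_hr(age: int, resting_hr: float, intensity: float) -> float:
    """Target heart rate at a given fraction of heart-rate reserve (HRR).

    Karvonen formula: THR = HRrest + intensity * (HRmax - HRrest),
    with HRmax approximated here as 220 - age (a common assumption).
    """
    hr_max = 220 - age          # age-based HRmax estimate
    hrr = hr_max - resting_hr   # heart-rate reserve
    return resting_hr + intensity * hrr

# Example: a 22-year-old with a resting HR of 70 bpm
for label, intensity in [("HIIT work 70%HRR", 0.70), ("HIIT work 75%HRR", 0.75),
                         ("HIIT rest 30%HRR", 0.30), ("MICT 40%HRR", 0.40),
                         ("MICT 50%HRR", 0.50)]:
    print(f"{label}: {karvonen_target_hr(22, 70, intensity):.0f} bpm")
```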

Keywords: HIIT, MICT, RPE, RHR

Procedia PDF Downloads 45
26169 Religious Tourism as the Core Strategy of Shaping Lifestyle: Evidence from Iran

Authors: Mostafa Jafari

Abstract:

Religious tourism is the core strategy shaping Iranians' lifestyle. Why and how? This paper answers these questions. Theoretical base: From a strategic-marketing point of view, lifestyle is a pattern of beliefs, values, interests, and acts, and strategy can be defined as a set of continuous important decisions. Here, strategy means making decisions about the target place and vehicle of touristic travel in order to reform and redefine self-identity and shape lifestyle. Methodology: The target society of this research is selected residents of three provinces in northwest Iran. The data collection instruments are interviews and questionnaires, and the collected data are analysed by structural equation modeling (SEM) using LISREL software. Results: The primary results show that the variety of touristic travels plays an important role in shaping the new lifestyle of Iranian people. The target places of touristic travel (Europe, the USA, Japan, etc.) are the second priority. The number of foreign friends is the third, and the number of travels the fourth. Among all kinds of touristic travel, religious tourism plays the main role from a competitive point of view. Findings: The geometry of the Iranian lifestyle is being shaped and reshaped through several domestic and international tourism strategies, particularly the religious strategy. During the dynamic trend of identity redefinition, many Iranians put the quantity and quality of their touristic travel as the first priority.

Keywords: religious tourism, core strategy, shaping life style

Procedia PDF Downloads 412
26168 Settlement Prediction in Cape Flats Sands Using Shear Wave Velocity – Penetration Resistance Correlations

Authors: Nanine Fouche

Abstract:

The Cape Flats is a low-lying, sand-covered expanse of approximately 460 square kilometres, situated to the southeast of the central business district of Cape Town in the Western Cape of South Africa. The aeolian sands masking this area are often loose and compressible in the upper 1 m to 1.5 m of the surface, and there is a general exceedance of the maximum allowable settlement in these sands. The settlement of shallow foundations on Cape Flats sands is commonly predicted using the results of in-situ tests such as the SPT or DPSH, due to the difficulty of retrieving undisturbed samples for laboratory testing. Varying degrees of accuracy and reliability are associated with these methods. More recently, shear wave velocity (Vs) profiles obtained from seismic testing, such as continuous surface wave (CSW) tests, are being used for settlement prediction. Such predictions have the advantage of considering the non-linear stress-strain behaviour of soil and the degradation of stiffness with increasing strain. CSW tests are rarely executed in the Cape Flats, whereas SPTs are commonly performed. For this reason, and to facilitate better settlement predictions in Cape Flats sand, equations representing shear wave velocity (Vs) as a function of SPT blow count (N60) and vertical effective stress (σv') were generated by statistical regression of site investigation data. To reveal the most appropriate method of overburden correction, analyses were performed with a separate overburden term (Pa/σv') as well as with stress-corrected shear wave velocity and SPT blow counts (correcting Vs and N60 to Vs1 and (N1)60, respectively). Shear wave velocity profiles and SPT blow count data from three sites masked by Cape Flats sands were utilised to generate 80 Vs-SPT N data pairs for analysis. Investigated terrains included sites in the suburbs of Athlone, Muizenburg, and Atlantis, all underlain by windblown deposits comprising fine and medium sand with varying fines contents. Elastic settlement analysis was also undertaken for the Cape Flats sands using a non-linear stepwise method based on small-strain stiffness estimates obtained from the best Vs-N60 model, and compared to settlement estimates using the general elastic solution with stiffness profiles determined using Stroud's (1989) and Webb's (1969) SPT N60-E transformation models. Stroud's method considers strain level indirectly, whereas Webb's method does not take account of the variation in elastic modulus with strain. The expression of Vs in terms of N60 and Pa/σv' derived from the Atlantis data set revealed the best fit, with R2 = 0.83 and a standard error of 83.5 m/s. The less accurate Vs-SPT N relations associated with the combined data set are presumably the result of the inversion routines used in the analysis of the CSW results, which show significant variation in relative density and stiffness with depth. The regression analyses revealed that the inclusion of a separate overburden term in the regression of Vs and N60 produces improved fits, as opposed to the stress-corrected equations, in which the R2 of the regression is notably lower. It is the correction of Vs and N60 to Vs1 and (N1)60 with the empirical constants 'n' and 'm' prior to regression that introduces bias with respect to overburden pressure.
When comparing settlement prediction methods, both Stroud's method (considering strain level indirectly) and the small-strain stiffness method predict higher stiffnesses for medium dense and dense profiles than Webb's method, which takes no account of strain level in the determination of soil stiffness. Webb's method appears to be suitable for loose sands only. The Versak software appears to underestimate differences in settlement between square and strip footings of similar width. In conclusion, settlement analysis using small-strain stiffness data from the proposed Vs-N60 model for Cape Flats sands provides a way to take account of the non-linear stress-strain behaviour of the sands when calculating settlement.
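
The abstract does not give the functional form of the regression; a minimal sketch assuming a common power-law form Vs = a·N60^b·(σv'/Pa)^c, fitted in log space on hypothetical data pairs:

```python
import numpy as np

# Hypothetical Vs (m/s), SPT blow count N60, vertical effective stress (kPa)
vs  = np.array([180.0, 210.0, 250.0, 300.0, 340.0, 390.0])
n60 = np.array([5.0,   8.0,  12.0,  18.0,  25.0,  34.0])
sv  = np.array([40.0,  60.0,  80.0, 110.0, 140.0, 180.0])
pa  = 100.0  # atmospheric pressure, kPa

# Power-law model Vs = a * N60**b * (sv/pa)**c, fitted linearly in log space:
# log Vs = log a + b*log N60 + c*log(sv/pa)
X = np.column_stack([np.ones_like(n60), np.log(n60), np.log(sv / pa)])
coef, *_ = np.linalg.lstsq(X, np.log(vs), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]

pred = a * n60**b * (sv / pa)**c
ss_res = np.sum((vs - pred) ** 2)
ss_tot = np.sum((vs - vs.mean()) ** 2)
print(f"Vs = {a:.1f} * N60^{b:.2f} * (sv'/Pa)^{c:.2f},  R^2 = {1 - ss_res/ss_tot:.3f}")
```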

Keywords: sands, settlement prediction, continuous surface wave test, small-strain stiffness, shear wave velocity, penetration resistance

Procedia PDF Downloads 175
26167 Study on Pedestrian Street Reconstruction under a Comfortable Continuous View: Taking the Walking Streets of Zhengzhou City as an Example

Authors: Liu Mingxin

Abstract:

Streets act as the organizers of the image elements along an urban spatial route, and the spatial continuity of urban streets is the basis for people to perceive the overall image of the city. This paper takes the walking space of Zhengzhou city as the research object, conducts investigation and analysis through questionnaire interviews, and selects typical walking spaces for in-depth study. Through the analysis of the questionnaire data, the investigation of the current situation of the walking space, and the analysis of pedestrians' psychological behavior and activities, the paper summarizes suggestions for constructing urban walking-space continuity from three aspects: the composition of the walking street, the bottom and side interfaces, and the service facilities of the walking space. Walking space is not only traffic space but also a comfortable experience and the continuity of space.

Keywords: walking space, spatial continuity, walking psychology, space reconstruction

Procedia PDF Downloads 46
26166 Analysis in Mexico of Workers Performing Highly Repetitive Movements, Using Sensory Thermography on the Surface of the Wrists and Elbows

Authors: Sandra K. Enriquez, Claudia Camargo, Jesús E. Olguín, Juan A. López, German Galindo

Abstract:

Currently, companies have seen an increase in the number of cumulative trauma disorders (CTDs). These are increasing significantly due to the highly repetitive movements (HRM) performed at workstations, which cause economic losses to businesses through the temporary and permanent disabilities of workers. This analysis focuses on the prevention of disorders caused by repetitiveness, duration, and effort, and on reducing cumulative trauma disorders as occupational diseases, using sensory thermography as a non-invasive method to evaluate the injuries workers could sustain when performing repetitive motions. Objectives: The aim is to define rest periods or job rotation before a CTD is generated, using sensory thermography to analyze changes in temperature patterns on the wrists and elbows while the worker performs HRM over a period of 2 hours and 30 minutes. Information on non-work variables such as wrist and elbow injuries, weight, gender, and age, and on work variables such as workspace temperature, repetitiveness, and duration, was also collected. Methodology: The analysis was conducted on 4 industrial designers in normal health, 2 men and 2 women, at a company over a period of 12 days, using the following time ranges: on the first day, for every 90 minutes of continuous work they were asked to rest 5 minutes; on the second day, for every 90 minutes of continuous work they were asked to rest 10 minutes; and likewise for 60 and 30 minutes of continuous work. Each worker was tested with 6 different ranges at least twice. The analysis was performed in a room with a controlled temperature between 20 and 25 °C, allowing 20 minutes at the beginning and end of the analysis for the temperature of the wrists and elbows to stabilize. Results: The range of 90 minutes of continuous work with a 5-minute rest is where the maximum temperature (Tmax) was registered on the wrists and elbows. The Tmax was 35.79 °C, with a difference of 2.79 °C between the initial and final temperature of the left elbow, presented by individual 4 at minute 86 of the 90-minutes-work/5-minutes-rest range. Conclusions: With sensory thermography as an alternative technology, it is possible to predict rotation or rest ranges for the prevention of CTDs in work activities involving HRM, thereby reducing occupational disease and the fees charged by health agencies, and increasing the quality of life of workers, making this technology an acceptable cost-benefit option for the future.
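
A minimal sketch of the kind of post-processing described, locating Tmax and the initial-to-final temperature difference in a hypothetical, synthetic surface-temperature series:

```python
import numpy as np

# Hypothetical wrist/elbow surface temperatures (deg C) sampled once per minute
minutes = np.arange(0, 150)                        # 2 h 30 min session
temps = 33.0 + 2.79 * (1 - np.exp(-minutes / 60))  # illustrative warming curve

t_max = temps.max()
t_max_at = minutes[temps.argmax()]
delta_t = temps[-1] - temps[0]                     # initial-to-final difference
print(f"Tmax = {t_max:.2f} C at minute {t_max_at}, delta T = {delta_t:.2f} C")
```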

Keywords: sensory thermography, temperature, cumulative trauma disorder (CTD), highly repetitive movement (HRM)

Procedia PDF Downloads 429
26165 Selective Oxidation of 6Mn-2Si Advanced High Strength Steels during Intercritical Annealing Treatment

Authors: Maedeh Pourmajidian, Joseph R. McDermid

Abstract:

Advanced high strength steels (AHSS) are revolutionizing both the steel and automotive industries due to their high specific strength and ability to absorb energy during crash events. This allows manufacturers to design vehicles with significantly increased fuel efficiency without compromising passenger safety. To maintain the structural integrity of the fabricated parts, they must be protected from corrosion damage through the continuous hot-dip galvanizing process, which is challenging due to the selective oxidation of Mn and Si on the surface of these AHSSs. The effects of process atmosphere oxygen partial pressure and of small additions of Sn on the selective oxidation of a medium-Mn C-6Mn-2Si advanced high strength steel were investigated. Intercritical annealing heat treatments were carried out at 690˚C in an N2-5%H2 process atmosphere under dew points ranging from –50˚C to +5˚C. Surface oxide chemistries, morphologies, and thicknesses were determined at a variety of length scales by several techniques, including SEM, TEM+EELS, and XPS. TEM observations of the sample cross-sections revealed the transition to internal oxidation at the +5˚C dew point. EELS results suggested that the internal oxide network was composed of a multi-layer oxide structure with chemistry varying from the oxide core towards the outer part. The combined effect of employing a known surface-active element and varying the process atmosphere on surface structure development, and the possible impact on reactive wetting of the steel substrate by the continuous galvanizing zinc bath, will be discussed.
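
In such annealing atmospheres, the oxygen partial pressure is set indirectly through the dew point of the N2-5%H2 gas. A rough sketch of the conversion, assuming the Magnus approximation for water vapour pressure and an illustrative linear fit for the Gibbs energy of water formation (constants vary slightly by source; none of this is taken from the paper):

```python
import math

def p_h2o_from_dew_point(dp_c: float) -> float:
    """Water vapour partial pressure (atm) from dew point via the Magnus
    approximation over water; adequate for an illustrative calculation."""
    e_hpa = 6.112 * math.exp(17.62 * dp_c / (243.12 + dp_c))  # hPa
    return e_hpa / 1013.25

def p_o2_equilibrium(dp_c: float, p_h2: float = 0.05,
                     t_k: float = 690 + 273.15) -> float:
    """Equilibrium oxygen partial pressure (atm) for H2 + 1/2 O2 = H2O(g).

    Uses an illustrative linear fit for the standard Gibbs energy of water
    formation, dG0 ~ -246000 + 54.8*T J/mol (values vary by source).
    """
    r = 8.314
    dg0 = -246000.0 + 54.8 * t_k
    k = math.exp(-dg0 / (r * t_k))     # K = pH2O / (pH2 * pO2**0.5)
    p_h2o = p_h2o_from_dew_point(dp_c)
    return (p_h2o / (k * p_h2)) ** 2

for dp in (-50, -30, 0, 5):
    print(f"dew point {dp:+d} C -> pO2 ~ {p_o2_equilibrium(dp):.2e} atm")
```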

Keywords: 3G AHSS, hot-dip galvanizing, oxygen partial pressure, selective oxidation

Procedia PDF Downloads 398
26164 A Dynamic Curriculum as a Platform for Continuous Competence Development

Authors: Niina Jallinoja, Anu Moisio

Abstract:

Focus on adult learning is vital to overcome economic challenges as well as to respond to the demand for new competencies and sustained productivity in the digitalized world economy. Employees of all ages must be able to carry on continuous professional development to remain competitive in the labor market. According to EU policies, countries should offer more flexible opportunities for adult learners who study online and in so-called 'second chance' qualification programmes. Traditionally, adult education in Finland has comprised not only liberal adult education but also government funding to study for Bachelor's, Master's, and Ph.D. degrees in Finnish universities and universities of applied sciences (UAS). From the beginning of 2021, public funding is allocated not only to degrees but also to courses through which adult learners in Finland can achieve new competencies. Consequently, there will be degree students (often younger in age) and adult learners studying in the same evening, online, and blended courses. The question is thus: how are combined studies meeting the different needs of degree students and adult learners? Haaga-Helia University of Applied Sciences (UAS), located in the metropolitan area of Finland, is taking up the challenge of continuous learning for adult learners. Haaga-Helia has been reforming its bachelor-level education and respective shorter courses since 2019 in the biggest project in its history. By the end of 2023, Haaga-Helia will have a flexible, modular curriculum for the bachelor's degrees in hospitality management, business administration, business information technology, journalism, and sports management. Building on shared key competencies, degree students will be able to build individual study paths more flexibly, thanks to the new modular structure of the curriculum. They will be able to choose courses across all degrees and thus build their own unique competence combinations. All modules can also be offered as separate courses or learning paths to non-degree students, both publicly funded and as commercial services for employers. Consequently, there will be shared course implementations for degree students and adult learners with various competence requirements. The newly designed courses are being piloted in parallel with the design of the curriculum at Haaga-Helia during 2020 and 2021. Semi-structured online surveys are administered to the participants of the key competence courses. The focus of the research is to understand how students in the bachelor programme and adult learners from the Open UAS perceive the learning experience in such a diverse learning group. A comparison is also made between the learning methods of on-site teaching, online implementation, blended learning, and virtual self-learning courses to understand how the pedagogy is meeting the learning objectives of these two different groups. The new flexible curricula and study modules are designed to fill the most important competence gaps in the Finnish labor market. The new curriculum will be dynamic and will constantly evolve over time according to future competence needs in the labor market. This type of approach requires constant dialogue between Haaga-Helia and workplaces during and after the design of the shared curriculum.

Keywords: competence development, continuous learning, curriculum, higher education

Procedia PDF Downloads 127
26163 The Economic Limitations of Defining Data Ownership Rights

Authors: Kacper Tomasz Kröber-Mulawa

Abstract:

This paper will address the topic of data ownership from an economic perspective, providing examples of the economic limitations of data property rights identified using the methods and approaches of the economic analysis of law. To build a proper background for the economic focus, a short overview of data and data ownership in the EU's legal system will first be provided, including a short introduction to its political and social importance and the relevant viewpoints. This will stress the importance of a single market for data but also the far-reaching regulations of data governance and privacy (including the distinctions between personal and non-personal data, and between data held by public bodies and by private businesses). The main discussion of this paper will build upon this legal basis as well as the methods and approaches of the economic analysis of law.

Keywords: antitrust, data, data ownership, digital economy, property rights

Procedia PDF Downloads 81
26162 Protecting Cloud Computing Data through Data Backups

Authors: Abdullah Alsaeed

Abstract:

Virtualized computing and cloud computing infrastructures are no longer buzzwords or marketing terms. They are a core reality in today's corporate Information Technology (IT) organizations. Hence, developing effective and efficient methodologies for data backup and data recovery is required more than ever. The purpose of data backup and recovery techniques is to assist organizations in strategizing their business continuity and disaster recovery approaches. In order to accomplish this strategic objective, a variety of mechanisms have been proposed in recent years. This research paper will explore and examine the latest techniques and solutions that provide data backup and restoration for cloud computing platforms.

Keywords: data backup, data recovery, cloud computing, business continuity, disaster recovery, cost-effective, data encryption

Procedia PDF Downloads 87
26161 Missing Link Data Estimation with Recurrent Neural Network: An Application Using Speed Data of Daegu Metropolitan Area

Authors: JaeHwan Yang, Da-Woon Jeong, Seung-Young Kho, Dong-Kyu Kim

Abstract:

In terms of ITS, information on link characteristics is an essential factor for planning and operations. But in practical cases, not every link has sensors installed on it. A link that does not have data is called a 'missing link'. The purpose of this study is to impute the data of these missing links. To obtain these data, this study applies machine learning: with a machine learning process, especially a deep learning process, missing link data can be estimated from the data of the links that are present. For the deep learning process, this study uses a recurrent neural network to handle the time-series data of the road. As input data, Dedicated Short-Range Communications (DSRC) data from Dalgubul-daero in the Daegu Metropolitan Area were fed into the learning process. The neural network structure takes the 17 links with present data as input and uses 2 hidden layers to estimate the data of 1 missing link. As a result, the forecasted data of the target link show about 94% accuracy compared with the actual data.
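
A minimal sketch of such a network in Keras, assuming random placeholder arrays in place of the DSRC speed data and an illustrative window length (the paper's exact architecture and hyperparameters are not given):

```python
import numpy as np
import tensorflow as tf

# Hypothetical: speeds of 17 instrumented links over time predict 1 missing link.
# Shapes: (samples, timesteps, 17 links) -> (samples, 1 target speed).
timesteps, n_links = 12, 17
x_train = np.random.rand(1000, timesteps, n_links).astype("float32")
y_train = np.random.rand(1000, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(timesteps, n_links)),
    tf.keras.layers.SimpleRNN(32, return_sequences=True),  # hidden layer 1
    tf.keras.layers.SimpleRNN(16),                         # hidden layer 2
    tf.keras.layers.Dense(1),                              # missing-link speed
])
model.compile(optimizer="adam", loss="mse")
model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)
print(model.predict(x_train[:1]))
```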

Keywords: data estimation, link data, machine learning, road network

Procedia PDF Downloads 510
26160 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies

Authors: Monica Lia

Abstract:

This article presents a customer data analysis model that uses business intelligence tools for data modelling, transformation, visualization, and dynamic report building. The analysis of an economic organization's customers is based on the information in the organization's transactional systems. The paper presents how to develop the data model starting from the data that companies hold inside their own operational systems. The owned data can be transformed into useful information about customers using business intelligence tools. In a mature market, understanding the information inside the data and making forecasts for strategic decisions become more important. Business intelligence tools are used in business organizations as support for decision-making.
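
A minimal sketch of the transform step, building a small star-schema-style fact table from hypothetical transactional records with pandas and joining it to a customer dimension for a report:

```python
import pandas as pd

# Hypothetical transactional extract (the operational-system source)
calls = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3],
    "date": pd.to_datetime(["2024-01-05", "2024-01-20", "2024-01-07",
                            "2024-01-02", "2024-01-15", "2024-01-28"]),
    "minutes": [12, 30, 5, 45, 22, 18],
    "revenue": [1.2, 3.0, 0.5, 4.5, 2.2, 1.8],
})
customers = pd.DataFrame({                      # dimension table
    "customer_id": [1, 2, 3],
    "segment": ["consumer", "consumer", "business"],
})

# Transform: build a small fact table keyed by customer and month (star schema)
calls["month"] = calls["date"].dt.to_period("M")
fact = (calls.groupby(["customer_id", "month"], as_index=False)
             .agg(total_minutes=("minutes", "sum"),
                  total_revenue=("revenue", "sum")))

# Join with the customer dimension for a segment-level report
report = (fact.merge(customers, on="customer_id")
              .groupby("segment")["total_revenue"].sum())
print(report)
```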

Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, dynamic dashboards, use case diagram, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes

Procedia PDF Downloads 430
26159 Governance Structure of an Islamic Philanthropic Institution: Analysis of Corporate Waqf in Malaysia

Authors: Nathasa Mazna Ramli, Nurul Husna Mohd Salleh, Nurul Aini Muhamed

Abstract:

This study focuses on the governance of an Islamic philanthropic institution in Malaysia. Specifically, the internal governance structure of a corporate Islamic endowment, or waqf, is analysed. The purpose of waqf is to provide continuous charity that can generate a perpetual income flow for the needy. This study is based on the principles of the MCCG 2012, the Shariah Governance Framework, and charity governance. It utilises publicly available data to examine the internal governance structure of a corporate waqf. This study finds that the Islamic philanthropic institution has, to some extent, a sound governance structure through which to discharge its transparency and accountability. Furthermore, the findings also show that although a governance structure is in place, most of its elements are not disclosed in the annual reports of the company. The findings could extend knowledge in these areas and stimulate further research on the governance of Islamic philanthropic institutions, particularly corporate waqf.

Keywords: accountability, governance, Islamic philanthropic, corporate waqf

Procedia PDF Downloads 566
26158 A Comparative Study of Various Control Methods for Rendezvous of a Satellite Couple

Authors: Hasan Basaran, Emre Unal

Abstract:

Formation flying of satellites is a mission type that involves the relative position keeping of different satellites in a constellation. In this study, different control algorithms are compared with one another in terms of ΔV (velocity increment) and tracking error. Various control methods, covering continuous and impulsive approaches, are implemented and tested for satellites flying in low Earth orbit. Feedback linearization, sliding mode control, and model predictive control are designed and compared with an impulsive feedback law based on mean orbital elements. The feedback linearization and sliding mode control approaches use identical mathematical models that include second-order Earth oblateness effects. The model predictive control, on the other hand, does not include any perturbations and assumes a circular chief orbit. The comparison is carried out for 4 different initial errors and assessed in terms of velocity increment, root mean square error, maximum steady-state error, and settling time. It was observed that the impulsive law consumed the least ΔV while producing the highest maximum steady-state error. The continuous control laws, however, consumed larger velocity increments and produced smaller tracking errors. Finally, an inversely proportional relationship between tracking error and velocity increment was established.
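
A minimal sketch of one of the compared approaches, feedback linearization on the Clohessy-Wiltshire relative dynamics (circular chief orbit, no perturbations); the mean motion and gains are illustrative assumptions, and ΔV would follow by integrating the control magnitude over time:

```python
import numpy as np
from scipy.integrate import solve_ivp

N = 0.00113  # mean motion (rad/s) for a ~400 km LEO chief orbit (assumption)

def cw_feedback_linearized(t, s, kp=4e-6, kd=4e-3):
    """Clohessy-Wiltshire relative dynamics with a feedback-linearizing PD law.

    The control cancels the known CW terms and imposes u = -kp*r - kd*v,
    driving the deputy to the chief (rendezvous at the origin).
    """
    x, y, z, vx, vy, vz = s
    # Natural CW accelerations (circular chief orbit, no perturbations)
    ax_n = 3 * N**2 * x + 2 * N * vy
    ay_n = -2 * N * vx
    az_n = -(N**2) * z
    # Feedback linearization: cancel dynamics, add PD tracking of the origin
    ux = -ax_n - kp * x - kd * vx
    uy = -ay_n - kp * y - kd * vy
    uz = -az_n - kp * z - kd * vz
    return [vx, vy, vz, ax_n + ux, ay_n + uy, az_n + uz]

s0 = [1000.0, -500.0, 200.0, 0.0, 0.0, 0.0]   # initial relative state (m, m/s)
sol = solve_ivp(cw_feedback_linearized, (0, 6000), s0, max_step=10.0)
err = np.linalg.norm(sol.y[:3], axis=0)
print(f"final position error: {err[-1]:.2f} m, "
      f"RMS error: {np.sqrt((err**2).mean()):.1f} m")
```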

Keywords: chief-deputy satellites, feedback linearization, follower-leader satellites, formation flight, fuel consumption, model predictive control, rendezvous, sliding mode

Procedia PDF Downloads 104
26157 Combining a Continuum of Hidden Regimes and a Heteroskedastic Three-Factor Model in Option Pricing

Authors: Rachid Belhachemi, Pierre Rostan, Alexandra Rostan

Abstract:

This paper develops a discrete-time option pricing model for index options. The model consists of two key ingredients. First, daily stock return innovations are driven by a continuous hidden threshold mixed skew-normal (HTSN) distribution, which generates the conditional non-normality needed to fit daily index returns. The most important feature of the HTSN is the inclusion of a latent state variable with a continuum of states, unlike traditional mixture distributions, where the state variable is discrete with a small number of states. The HTSN distribution belongs to the class of univariate probability distributions in which the parameters of the distribution capture the dependence between the variable of interest and the continuous latent state variable (the regime). The distribution has an interpretation in terms of a mixture distribution with time-varying mixing probabilities. It has been shown empirically that this distribution outperforms its main competitor, the mixed normal (MN) distribution, in terms of capturing the stylized facts known for stock returns, namely volatility clustering, leverage effect, skewness, kurtosis, and regime dependence. Second, heteroscedasticity in the model is captured by a three-exogenous-factor GARCH model (GARCHX), where the factors are extracted from a matrix of world indices by applying principal component analysis (PCA). The empirically determined factors are uncorrelated and represent truly different common components driving the returns. Both the factors and the eight parameters inherent to the HTSN distribution aim at capturing the impact of the state of the economy on price levels, since the distribution parameters have economic interpretations in terms of conditional volatilities and correlations of the returns with the hidden continuous state. The PCA identifies statistically independent factors affecting the random evolution of a given pool of assets (in our paper, a pool of international stock indices) and sorts them by order of relative importance: it computes a historical cross-asset covariance matrix and identifies the principal components representing independent factors. In our paper, these factors are used to calibrate the HTSN-GARCHX model and are ultimately responsible for the nature of the distribution of the random variables being generated. We benchmark our model against the MN-GARCHX model following the same PCA methodology, and against the standard Black-Scholes model. We show that our model outperforms the MN-GARCHX benchmark in terms of RMSE in dollar losses for put and call options, and that both outperform the analytical Black-Scholes model, by capturing the stylized facts known for index returns, namely volatility clustering, leverage effect, skewness, kurtosis, and regime dependence.
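
A minimal sketch of the factor-extraction step and a GARCHX-style variance recursion, using synthetic index returns and illustrative parameters (the paper's estimation procedure is not reproduced here):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical daily log-returns for 10 world indices over 500 days
returns = rng.normal(0, 0.01, size=(500, 10))

# PCA of the cross-asset covariance: orthogonal factors sorted by importance
pca = PCA(n_components=3)
factors = pca.fit_transform(returns)            # (500, 3) factor time series
print("explained variance ratios:", pca.explained_variance_ratio_)

# GARCHX-style variance recursion with the three exogenous factors:
# h_t = w + a*e_{t-1}^2 + b*h_{t-1} + sum_i g_i * F_{i,t-1}^2
w, a, b = 1e-6, 0.05, 0.90                      # illustrative parameters
g = np.array([0.02, 0.01, 0.01])
e = returns[:, 0] - returns[:, 0].mean()        # innovations of one index
h = np.empty_like(e)
h[0] = e.var()
for t in range(1, len(e)):
    h[t] = w + a * e[t - 1]**2 + b * h[t - 1] + g @ (factors[t - 1]**2)
print("last conditional variance:", h[-1])
```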

Keywords: continuous hidden threshold, factor models, GARCHX models, option pricing, risk-premium

Procedia PDF Downloads 297
26156 Security of Databases Using Chaotic Systems

Authors: Eman W. Boghdady, A. R. Shehata, M. A. Azem

Abstract:

Database (DB) security demands permitting the actions of authorized users and prohibiting those of non-authorized users and intruders on the DB and the objects inside it. Organizations that run successfully demand the confidentiality of their DBs: they do not allow unauthorized access to their data/information, and they also demand assurance that their data is protected against any malicious or accidental modification. DB protection and confidentiality are the security concerns. There are four types of controls to obtain DB protection; these include access control, information flow control, inference control, and cryptographic control. Cryptographic control is considered the backbone of DB security, since it secures the DB by encryption during storage and communication. Current cryptographic techniques are classified into two types: traditional classical cryptography using standard algorithms (DES, AES, IDEA, etc.) and chaos cryptography using continuous (Chua, Rössler, Lorenz, etc.) or discrete (logistic, Hénon, etc.) algorithms. The most important characteristic of chaos is its extreme sensitivity to the initial conditions of the system. In this paper, DB-security systems based on chaotic algorithms are described. The pseudo-random number generators (PRNGs) derived from the different chaotic algorithms are implemented using Matlab, and their statistical properties are evaluated using NIST and other statistical test suites. Then, these algorithms are used to secure a conventional DB (plaintext), where the statistical properties of the ciphertext are also tested. To increase the complexity of the PRNGs and to pass all the NIST statistical tests, we propose two hybrid PRNGs: one based on two chaotic logistic maps and another based on two chaotic Hénon maps, where the two chaotic algorithms run side by side, starting from random independent initial conditions and parameters (the encryption keys). The resulting hybrid PRNGs passed the NIST statistical test suite.
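
A minimal sketch of the hybrid idea with two logistic maps running side by side from independent keys; for illustration only, since such a construction needs the statistical vetting described above before any real use:

```python
def hybrid_logistic_prng(seed1, seed2, r1=3.99, r2=3.97, n=16):
    """Hybrid PRNG from two logistic maps x -> r*x*(1-x) running side by
    side, each from independent initial conditions (the encryption key).
    Output bytes mix both trajectories; a sketch, not a vetted cipher."""
    x, y = seed1, seed2
    out = bytearray()
    for _ in range(n):
        x = r1 * x * (1 - x)
        y = r2 * y * (1 - y)
        # Combine both chaotic states and keep 8 bits
        out.append((int(x * 2**32) ^ int(y * 2**32)) & 0xFF)
    return bytes(out)

keystream = hybrid_logistic_prng(0.123456, 0.654321)
plaintext = b"db row: salary=1"
ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
print(ciphertext.hex())
```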

Keywords: algorithms and data structure, DB security, encryption, chaotic algorithms, Matlab, NIST

Procedia PDF Downloads 265
26155 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions

Authors: K. Hardy, A. Maurushat

Abstract:

Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically relied on the rigorous methodologies of empirical studies by research institutes, as well as on less reliable immediate surveys/polls, often with limited sample sizes. As we move into the era of big data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is no longer why people did something over the last 10 years, but why they are doing it now, whether it is undesirable, and how we can have an impact to promote change immediately. Big data analytics rely heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.

Keywords: big data, open data, productivity, data governance

Procedia PDF Downloads 370
26154 Quality Teaching Evaluation Instrument: A Student Learning-centred Approach

Authors: Thuy T. T. Tran, Hamish Coates, Sophie Arkoudis

Abstract:

Evaluation instruments for teaching are abundant; however, these do not prompt any enhancement in the quality of teaching, not least because they are framed only by teacher-centered conceptions of teaching. There is a need for more sophisticated teaching evaluation measures that focus on student learning and multi-stakeholder involvement. This study aims to develop such an evaluation instrument for Vietnamese higher education. The study uses several methods: the instrument was initially drafted through an in-depth review of research, paying close attention to Vietnamese higher education, and the draft evaluation instruments were then produced and reviewed by 34 experts. The outcomes of these qualitative and quantitative data reveal an instrument that highlights the value of a multisource, student-centered approach and the rich integration of contextual and cultural traits in which Confucian values are emphasized. The validation affirms that evaluating teaching in such a way will facilitate the continuous learning growth of all stakeholders involved.

Keywords: multi stakeholders, quality teaching, student learning, teaching evaluation

Procedia PDF Downloads 310
26153 A Review on Existing Challenges of Data Mining and Future Research Perspectives

Authors: Hema Bhardwaj, D. Srinivasa Rao

Abstract:

Technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets can be termed 'big data'. The techniques of big data mining and big data analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This research paper offers an overview of the literature on big data mining, its process, and its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. The study presents the ideas of data mining, data analysis, and the knowledge discovery techniques that have recently been developed, together with practical application systems. The article's conclusion also includes a list of issues and difficulties for further research in the area, and the main big data and data mining challenges for management are discussed.

Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges

Procedia PDF Downloads 110
26152 A Systematic Review on Challenges in Big Data Environment

Authors: Rimmy Yadav, Anmol Preet Kaur

Abstract:

Big data has demonstrated vast potential in streamlining operations, supporting decisions, and spotting business trends in different fields, for example, manufacturing, finance, and information technology. This paper gives a multi-disciplinary overview of the research issues in big data and of its procedures, tools, and systems related to privacy, data storage management, network and energy utilization, fault tolerance, and data representation. In addition, the challenges and opportunities available in the big data platform are outlined.

Keywords: big data, privacy, data management, network and energy consumption

Procedia PDF Downloads 311
26151 Survey on Big Data Stream Classification by Decision Tree

Authors: Mansoureh Ghiasabadi Farahani, Samira Kalantary, Sara Taghi-Pour, Mahboubeh Shamsi

Abstract:

Nowadays, the development of computer technology and its recent applications provide access to new types of data that have not been considered by traditional data analysts. Two particularly interesting characteristics of such data sets are their huge size and streaming nature. Incremental learning techniques have been used extensively to address the data stream classification problem. This paper presents a concise survey of the obstacles and requirements of classifying data streams using decision trees. The most important issue is to maintain a balance between accuracy and efficiency: the algorithm should provide good classification performance with a reasonable time response.
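
The balance between accuracy and efficiency in stream decision trees is typically managed with the Hoeffding bound, which tells the learner how many samples justify a split. A minimal sketch with illustrative gains and confidence level:

```python
import math

def hoeffding_bound(value_range: float, delta: float, n: int) -> float:
    """Hoeffding bound used by streaming decision trees (e.g., VFDT):
    with probability 1 - delta, the observed mean of a random variable
    with range R over n samples is within epsilon of its true mean."""
    return math.sqrt(value_range**2 * math.log(1.0 / delta) / (2.0 * n))

# Split decision on a stream: split when the observed gain gap between the
# best and second-best attribute exceeds epsilon.
g_best, g_second = 0.42, 0.37  # illustrative information-gain estimates
for n in (200, 1000, 5000):
    eps = hoeffding_bound(value_range=1.0, delta=1e-6, n=n)
    print(f"n={n}: epsilon={eps:.4f} -> split={g_best - g_second > eps}")
```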

Keywords: big data, data streams, classification, decision tree

Procedia PDF Downloads 521
26150 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication

Authors: Aishwarya Shekhar, Himanshu Sharma

Abstract:

Data deduplication is one of the important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data is expunged, leaving only one copy, a single instance of the data, to be stored; however, an index of each piece of data is still maintained. Data deduplication is an approach to minimizing the storage space an organization requires to retain its data. In most companies, the storage systems carry identical copies of numerous pieces of data. Deduplication eliminates these additional copies by saving just one copy of the data and replacing the other copies with pointers that lead back to the primary copy. To avoid this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud. A hybrid cloud is a fusion of at least one public and one private cloud. As a proof of concept, we implement Java code that provides security as well as removes all types of duplicated data from the cloud.
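
A minimal sketch of hash-based deduplication, keeping a single instance per unique block and pointers for the rest; illustrative only, as the paper's Java implementation and its security layer are not reproduced here:

```python
import hashlib

class DedupStore:
    """Content-addressed store: one physical copy per unique block, with
    logical names kept as pointers (an index) back to the primary copy."""
    def __init__(self):
        self.blocks = {}   # digest -> data (single instance)
        self.index = {}    # logical name -> digest (the pointer)

    def put(self, name: str, data: bytes) -> bool:
        digest = hashlib.sha256(data).hexdigest()
        is_new = digest not in self.blocks
        if is_new:
            self.blocks[digest] = data   # store the first (primary) copy only
        self.index[name] = digest        # every duplicate becomes a pointer
        return is_new

    def get(self, name: str) -> bytes:
        return self.blocks[self.index[name]]

store = DedupStore()
store.put("fileA", b"quarterly report")
store.put("fileB", b"quarterly report")   # duplicate: only a pointer is added
print(len(store.blocks), "stored block(s) for", len(store.index), "logical files")
```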

Keywords: confidentiality, deduplication, data compression, hybridity of cloud

Procedia PDF Downloads 381
26149 Lean Implementation in a Nurse Practitioner Led Pediatric Primary Care Clinic: A Case Study

Authors: Lily Farris, Chantel E. Canessa, Rena Heathcote, Susan Shumay, Suzanna V. McRae, Alissa Collingridge, Minna K. Miller

Abstract:

Objective: To describe how the Lean approach can be applied to improve access, quality, and safety of care in an ambulatory pediatric primary care setting. Background: Lean was originally developed by Toyota manufacturing in Japan and subsequently adapted for use in the healthcare sector. Lean is a systematic approach focused on identifying and reducing waste within organizational processes, improving patient-centered care and efficiency. Limited literature is available on the implementation of Lean methodologies in a pediatric ambulatory care setting. Methods: A strategic continuous improvement event, or Rapid Process Improvement Workshop (RPIW), was launched with the aim of evaluating and structurally supporting clinic workflow, capacity building, and sustainability, and ultimately improving access to care and enhancing the patient experience. The Lean process consists of five specific activities: current state/process assessment (value stream map); development of a future state map (value stream map after waste reduction); identification, quantification, and prioritization of the process improvement opportunities; implementation and evaluation of process changes; and audits to sustain the gains. Staff engagement is a critical component of the Lean process. Results: Through the implementation of the RPIW and the shifting of workload among the administrative team, four hours of wasted time moving between desks and doing work was eliminated from the Administrative Clerks' role. To streamline clinic flow, the Nursing Assistants completed patient measurements and vitals for the Nurse Practitioners, reducing patient wait times and adding value to the patients' visits with the Nurse Practitioners. Additionally, through the Nurse Practitioners' engagement in the Lean processes, a need was recognized to articulate the clinic's vision and mission and to align the NP role and scope of practice with the agency's and Ministry of Health's strategic plans. Conclusions: Continuous improvement work in the Pediatric Primary Care NP Clinic has provided a unique opportunity to improve the quality of care delivered and has facilitated further alignment of the daily continuous improvement work with the strategic priorities of the Ministry of Health.

Keywords: ambulatory care, lean, pediatric primary care, system efficiency

Procedia PDF Downloads 300
26148 A Review of Machine Learning for Big Data

Authors: Devatha Kalyan Kumar, Aravindraj D., Sadathulla A.

Abstract:

Big data are now rapidly expanding in engineering, science, and many other domains. The potential of such large or massive data is undoubtedly significant, and it makes sense to require new ways of thinking and new learning techniques to address the various big data challenges. Machine learning is continuously unleashing its power in a wide range of applications. In this paper, the latest advances in research on machine learning for big data processing are reviewed. First, we cover the machine learning methods used in recent studies, such as deep learning, representation learning, transfer learning, active learning, and distributed and parallel learning. We then focus on the challenges and possible solutions of machine learning for big data.

Keywords: active learning, big data, deep learning, machine learning

Procedia PDF Downloads 445
26147 Managing Organizational Change for a Transformation Project: The Billing and Customer Relationship Management Journey

Authors: Sharifah I. N. A. Syed Azmi, Nazarina Mohd Nasir

Abstract:

The Billing & Customer Relationship Management (BCRM) project is an important enabler towards realizing customer experience transformation. It involves technological shifts for future scalability, the revision of multiple business processes, and the adoption of change by the users and impacted employees. This massive transition, if not managed properly, may result in a decline in business performance due to a drop in productivity. Organizational change management is an essential element of BCRM project implementation to ensure the system is well understood and embraced by all stakeholders. In order to move impacted employees from an unaware state or denial mode to a full-acceptance mindset, committed to using the new system, their involvement in the whole change process, starting from the initial stage, is imperative. Through the BCRM Change Management Plan, a holistic approach was taken whereby the strategy and program for five key components, namely executive sponsorship, continuous communication, process change readiness, organizational readiness, and individual readiness, were all carefully established. The roles of the project sponsor, change agents, change ambassadors, and the community of practice (CoP) were clearly defined to gain high commitment and support across the entire organization. Continuous communication and engagement initiatives have been carried out throughout project implementation to reach all stakeholders. Business readiness was constantly monitored and assessed, including the effectiveness of end-user training, a thorough review of process documentation, and the completion of a roles realignment exercise.

Keywords: BCRM, change management, organizational change, transformation project

Procedia PDF Downloads 141
26146 Patient Satisfaction Measurement Using Face-Q for Non-Incisional Double-Eyelid Blepharoplasty with Modified Single-Knot Continuous Buried Suture Technique

Authors: Kwei Huan Liw, Sashi B. Darshan

Abstract:

Background: Double eyelid surgery has become one of the most sought-after aesthetic procedures among Asians. Many surgeons perform surgical blepharoplasty as well as various methods of non-incisional blepharoplasty. Face-Q is a validated method of measuring patient satisfaction for facial aesthetic procedures. Here we have analyzed the overall eye satisfaction score, the appraisal of upper eyelids score, and the adverse effects on eyes score. Methods: 274 patients (548 eyes), aged between 18 and 40 years, were recruited from 2015 to 2018. Each patient underwent non-incisional double-eyelid blepharoplasty using a single-knotted continuous buried suture. 3 to 5 stab incisions were made, depending on the upper eyelid size. A needle loaded with 7-0 nylon is passed from the lateral-most wound through the dermis and the conjunctiva in an alternating fashion into the remaining stab wounds. The suture is then tunneled back laterally in the deeper dermis and knotted securely with the suture end. The knot is buried within the orbicularis oculi muscle. Each patient was required to fill in the Face-Q questionnaire before the procedure and 2 weeks post-procedure. The results are described as percentages of the maximum achievable score. Patients were reviewed after 12 to 18 months to assess the long-term outcome. Results: The overall eye satisfaction score demonstrated a high level of post-operative satisfaction (97.85%, compared to 27.32% pre-operatively). The appraisal of upper eyelids scores showed a drastic improvement in perception post-operatively (95.31%, compared to 21.44% pre-operatively). The adverse effects on eyes score showed a very low post-operative complication rate (0.4%). The long-term follow-up identified 6 cases that had developed asymmetrical folds. Only 1 patient agreed to revision surgery; the other 5 patients were still satisfied with the outcome and were not keen on revision surgery. None of the cases had loosening of knots. Conclusion: The modified single-knot continuous buried suture technique is a simple and non-invasive method of creating aesthetically pleasing non-surgical double eyelids, with long-lasting effects. Proper patient selection is crucial, and good surgical technique is required to achieve a desirable outcome.

Keywords: blepharoplasty, double-eyelid, face-Q, non-incisional

Procedia PDF Downloads 120
26145 Strengthening Legal Protection of Personal Data through Technical Protection Regulation in Line with Human Rights

Authors: Tomy Prihananto, Damar Apri Sudarmadi

Abstract:

Indonesia recognizes the right to privacy as a human right, and it provides legal protection over data management activities because the protection of personal data is a part of human rights. This paper aims to describe the arrangement of data management and data protection in Indonesia. The paper is descriptive research with a qualitative approach, collecting data through a literature study. The result is a comprehensive arrangement of data protection in which encryption methods have been set up as a technical requirement. The arrangements on encryption and on the protection of personal data are mutually reinforcing in the protection of personal data. Indonesia has two important, immediately enacted laws that provide protection for the privacy of information as a part of human rights.
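
A minimal sketch of encryption as a technical protection measure, using the Python cryptography library's Fernet (authenticated symmetric encryption); the record and key handling are illustrative, not the scheme mandated by the Indonesian laws discussed:

```python
from cryptography.fernet import Fernet

# Key generation and symmetric encryption of a personal-data record.
key = Fernet.generate_key()        # must be stored and managed securely
f = Fernet(key)

record = b'{"name": "A. Citizen", "national_id": "1234567890"}'  # hypothetical
token = f.encrypt(record)          # ciphertext safe to store or transmit
print(f.decrypt(token) == record)  # True: data is recoverable with the key
```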

Keywords: Indonesia, protection, personal data, privacy, human rights, encryption

Procedia PDF Downloads 182
26144 Cadmium Concentrations in Breast Milk and Factors of Exposition: Systematic Review

Authors: Abha Cherkani Hassani, Imane Ghanname, Nezha Mouane

Abstract:

Background: This is the first systematic review summarizing 43 years of research from 36 countries on the assessment of cadmium in breast milk, a suitable matrix for human biomonitoring. Objectives: To report from the published literature the levels of cadmium in breast milk and the factors causing increased cadmium concentrations, and to gather quantitative data that might be useful for evaluating the international degrees of maternal and infant exposure. Methods: We reviewed the literature for studies reporting quantitative data on cadmium levels in human breast milk worldwide, published between 1971 and 2014 and available on PubMed, ScienceDirect, and Google Scholar. The aim of the study, country, period of sample collection, sample size, sampling method, time of lactation, mother's age, area of residence, cadmium concentration, and other information were extracted. Results: 67 studies were selected and included in this systematic review. Some concentrations greatly exceed the WHO limit; however, about 50% of the studies reported cadmium concentrations below 1 µg/l (the WHO recommendation). Many factors have been implicated in breast-milk contamination by cadmium, such as lactation stage, smoking, diet, supplement intake, interaction with other mineral elements, maternal age, parity, and other parameters. Conclusion: Breast milk is a pathway of maternal excretion of cadmium. It is also a biological indicator of the degree of environmental pollution and of the cadmium exposure of the lactating woman and the nourished infant. Therefore, preventive measures and continuous monitoring are necessary.

Keywords: breast milk, cadmium level, factors, systematic review

Procedia PDF Downloads 524
26143 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy

Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu

Abstract:

The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on the reliability analysis of onboard sensors to evaluate their performance in terms of location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine reliability and, in turn, to understand improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, which is the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and by identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is utilized to determine whether the data exhibit an infant stage or have transitioned into the operational phase; the shape parameter beta plays a crucial role in identifying this stage. Determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, which can significantly increase the total failure rate. To address this, an approach utilizing the well-established statistical Laplace test is applied to infer the behavior of the sensors and to accurately ascertain the duration of the different phases in the lifetime and the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data, and whether the thresholds for the infant period and the wear-out phase are accurately estimated, by validating the data in the individual phases with Weibull distribution curve-fitting analysis. Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the commencement of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications.
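
A minimal sketch of the two statistical steps described: the Laplace trend test on (hypothetical) failure times, followed by a Weibull fit whose shape parameter beta separates infant mortality, random failures, and wear-out:

```python
import numpy as np
from scipy.stats import weibull_min

# Hypothetical failure times (hours) of one sensor's accuracy exceedances
t = np.array([120.0, 340.0, 610.0, 980.0, 1450.0, 2100.0, 2900.0])
T = 3000.0  # end of the observation window (time-truncated data)

# Laplace trend test: U ~ N(0,1) under a homogeneous Poisson process.
# U << 0 suggests improvement (infant mortality), U >> 0 suggests wear-out.
n = len(t)
u = (t.mean() - T / 2) / (T * np.sqrt(1.0 / (12.0 * n)))
print(f"Laplace statistic U = {u:.2f}")

# Weibull fit on inter-failure times for the phase identified as operational
intervals = np.diff(np.concatenate([[0.0], t]))
shape, loc, scale = weibull_min.fit(intervals, floc=0)
print(f"beta = {shape:.2f} (<1 infant, ~1 random, >1 wear-out), "
      f"eta = {scale:.0f} h")
```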

Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis

Procedia PDF Downloads 65
26142 Evaluation of Liquid Fermentation Strategies to Obtain a Biofertilizer Based on Rhizobium sp.

Authors: Andres Diaz Garcia, Ana Maria Ceballos Rojas, Duvan Albeiro Millan Montano

Abstract:

This paper describes the initial technological development stages, in the area of liquid fermentation, required to produce the quantities of biomass of the biofertilizer microorganism Rhizobium sp. strain B02 needed for the application of the downstream unit stages at laboratory scale. In the first stage, the adjustment and standardization of the fermentation process in conventional batch mode were carried out. In the second stage, various fed-batch and continuous fermentation strategies were evaluated in a 10 L bioreactor in order to optimize the yields in concentration (colony-forming units/ml·h) and biomass (g/l·h) and to make the application of downstream unit operations feasible. The growth kinetics, the evolution of dissolved oxygen, and the pH profile generated by each of the strategies were monitored and used to make sequential adjustments. Once each fermentation was finished, the final concentration and viability of the obtained biomass were determined, and performance parameters were calculated in order to select the optimal operating conditions that significantly improved the baseline results. Under the conditions adjusted and standardized in batch mode, concentrations of 6.67E9 CFU/ml were reached after 27 hours of fermentation, with a subsequent noticeable decrease associated with a basification of the culture medium. By applying fed-batch and continuous strategies, significant increases in yields were achieved, but at similar concentration levels, which led to the design of several production scenarios based on the availability of equipment usage time and the required batch volume.
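
A minimal sketch of the kind of kinetic reasoning behind comparing batch and fed-batch runs, simulating Monod growth in fed-batch mode with illustrative parameters (not the strain's measured kinetics):

```python
import numpy as np
from scipy.integrate import solve_ivp

def fed_batch(t, s, mu_max=0.35, ks=0.5, yxs=0.4, f=0.05, sf=100.0):
    """Monod growth in fed-batch mode: X biomass (g/l), S substrate (g/l),
    V volume (l). Feed at rate f (l/h) with substrate concentration sf."""
    x, sub, v = s
    mu = mu_max * sub / (ks + sub)        # Monod specific growth rate (1/h)
    dx = mu * x - (f / v) * x             # growth minus dilution by the feed
    ds = -(mu / yxs) * x + (f / v) * (sf - sub)
    dv = f
    return [dx, ds, dv]

# 27 h run starting at 0.1 g/l biomass, 20 g/l substrate, 5 l working volume
sol = solve_ivp(fed_batch, (0, 27), [0.1, 20.0, 5.0], max_step=0.1)
x_final, t_final = sol.y[0, -1], sol.t[-1]
print(f"biomass {x_final:.1f} g/l; volumetric productivity "
      f"{x_final / t_final:.2f} g/(l*h)")
```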

Keywords: biofertilizer, liquid fermentation, Rhizobium sp., standardization of processes

Procedia PDF Downloads 176
26141 Approximation by Generalized Lupaş-Durrmeyer Operators with Two Parameters α and β

Authors: Preeti Sharma

Abstract:

This paper deals with the Stancu-type generalization of Lupaş-Durrmeyer operators. We establish some direct results in the polynomial weighted space of continuous functions defined on the interval [0, 1]. A Voronovskaja-type theorem is also studied.
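
For reference, a Stancu-type generalization conventionally shifts the evaluation nodes by the two parameters, and the direct results are stated via the modulus of continuity; a sketch assuming the standard conventions (the paper's exact operator definition may differ):

```latex
% Stancu-type node shift with parameters 0 <= alpha <= beta
% (standard convention; the paper's exact operator may differ)
\[
  \frac{k}{n} \;\longmapsto\; \frac{k+\alpha}{n+\beta},
  \qquad 0 \le \alpha \le \beta,
\]
% Modulus of continuity used to state rates of convergence on [0,1]
\[
  \omega(f,\delta) \;=\; \sup_{\substack{x,y\in[0,1]\\ |x-y|\le\delta}}
  \lvert f(x)-f(y)\rvert .
\]
```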

Keywords: Lupas-Durrmeyer operators, Polya distribution, weighted approximation, rate of convergence, modulus of continuity

Procedia PDF Downloads 345