Search results for: user attributes analysis
27145 Morphological Analysis of Manipuri Language: Wahei-Neinarol
Authors: Y. Bablu Singh, B. S. Purkayashtha, Chungkham Yashawanta Singh
Abstract:
Morphological analysis forms the basic foundation of NLP applications, including syntax parsing, Machine Translation (MT), Information Retrieval (IR) and automatic indexing, in all languages. It is a field of linguistics that provides valuable information for computer-based linguistic tasks such as lemmatization and the study of the internal structure of words. Computational morphology is the application of morphological rules in the field of computational linguistics; it is an emerging area in AI that studies the structure of words, which are formed by combining smaller units of linguistic information, called morphemes: the building blocks of words. Morphological analysis provides information about the semantic and syntactic role of a word in a sentence. It analyzes Manipuri word forms and produces the grammatical information associated with the words. The morphological analyzer for Manipuri has been tested on 3500 Manipuri words in Shakti Standard Format (SSF) using Meitei Mayek as the source script; an accuracy of 80% has been obtained on a manual check. Keywords: morphological analysis, machine translation, computational morphology, information retrieval, SSF
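As a rough illustration of the kind of rule-based analysis this abstract describes (not taken from the paper), the minimal Python sketch below segments a word into a stem and suffixes by longest-match suffix stripping; the suffix inventory and feature labels are hypothetical placeholders, not actual Manipuri morphology.

# Minimal sketch of rule-based suffix stripping for morphological analysis.
# The suffix table is a hypothetical placeholder; a real analyzer for Manipuri
# would use the actual morpheme inventory and Meitei Mayek orthography.
SUFFIXES = {
    "da": "LOC",   # hypothetical locative marker
    "na": "NOM",   # hypothetical nominative marker
    "gi": "GEN",   # hypothetical genitive marker
}

def analyze(word, suffixes=SUFFIXES):
    """Return (stem, [grammatical features]) by repeated longest-match stripping."""
    features = []
    changed = True
    while changed:
        changed = False
        for suf in sorted(suffixes, key=len, reverse=True):
            if word.endswith(suf) and len(word) > len(suf):
                features.append(suffixes[suf])
                word = word[: -len(suf)]
                changed = True
                break
    return word, list(reversed(features))

print(analyze("imagida"))  # hypothetical form -> ('ima', ['GEN', 'LOC'])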
Procedia PDF Downloads 327
27144 Channel Estimation Using Deep Learning for Reconfigurable Intelligent Surfaces-Assisted Millimeter Wave Systems
Authors: Ting Gao, Mingyue He
Abstract:
Reconfigurable intelligent surfaces (RISs) are expected to be an important part of next-generation wireless communication networks due to their potential to reduce the hardware cost and energy consumption of millimeter Wave (mmWave) massive multiple-input multiple-output (MIMO) technology. However, owing to the lack of signal processing abilities of the RIS, the perfect channel state information (CSI) in RIS-assisted communication systems is difficult to acquire. In this paper, the uplink channel estimation for mmWave systems with a hybrid active/passive RIS architecture is studied. Specifically, a deep learning-based estimation scheme is proposed to estimate the channel between the RIS and the user. In particular, the sparse structure of the mmWave channel is exploited to formulate the channel estimation as a sparse reconstruction problem. To this end, the proposed approach is derived to obtain the distribution of non-zero entries in a sparse channel. After that, the channel is reconstructed by utilizing the least-squares (LS) algorithm and compressed sensing (CS) theory. The simulation results demonstrate that the proposed channel estimation scheme is superior to existing solutions even in low signal-to-noise ratio (SNR) environments.Keywords: channel estimation, reconfigurable intelligent surface, wireless communication, deep learning
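A minimal numerical sketch of the final reconstruction step mentioned above: given an estimate of where the non-zero channel entries lie, the channel is recovered by least squares on that support. The deep-learning stage that actually estimates the support in the paper is not reproduced here; a simple correlation-based proxy stands in for it, and all dimensions are illustrative.

import numpy as np

# Sketch: reconstruct a sparse channel h from measurements y = A @ h + n,
# assuming the support (positions of non-zero entries) has been estimated
# (in the paper this comes from the deep learning model, not from correlation).
rng = np.random.default_rng(0)
N, M, K = 64, 32, 4                     # channel length, measurements, sparsity (illustrative)
A = rng.standard_normal((M, N)) / np.sqrt(M)
h_true = np.zeros(N)
support_true = rng.choice(N, K, replace=False)
h_true[support_true] = rng.standard_normal(K)
y = A @ h_true + 0.01 * rng.standard_normal(M)

# Crude stand-in for the learned support: indices with the largest correlation |A^T y|
support_est = np.argsort(np.abs(A.T @ y))[-K:]

# Least-squares estimate restricted to the estimated support
h_est = np.zeros(N)
h_est[support_est], *_ = np.linalg.lstsq(A[:, support_est], y, rcond=None)

print("NMSE:", np.linalg.norm(h_est - h_true) ** 2 / np.linalg.norm(h_true) ** 2)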
Procedia PDF Downloads 156
27143 Text Analysis to Support Structuring and Modelling a Public Policy Problem-Outline of an Algorithm to Extract Inferences from Textual Data
Authors: Claudia Ehrentraut, Osama Ibrahim, Hercules Dalianis
Abstract:
Policy making situations are real-world problems that exhibit complexity in that they are composed of many interrelated problems and issues. To be effective, policies must holistically address the complexity of the situation rather than propose solutions to single problems. Formulating and understanding the situation and its complex dynamics, therefore, is a key to finding holistic solutions. Analysis of text based information on the policy problem, using Natural Language Processing (NLP) and Text analysis techniques, can support modelling of public policy problem situations in a more objective way based on domain experts knowledge and scientific evidence. The objective behind this study is to support modelling of public policy problem situations, using text analysis of verbal descriptions of the problem. We propose a formal methodology for analysis of qualitative data from multiple information sources on a policy problem to construct a causal diagram of the problem. The analysis process aims at identifying key variables, linking them by cause-effect relationships and mapping that structure into a graphical representation that is adequate for designing action alternatives, i.e., policy options. This study describes the outline of an algorithm used to automate the initial step of a larger methodological approach, which is so far done manually. In this initial step, inferences about key variables and their interrelationships are extracted from textual data to support a better problem structuring. A small prototype for this step is also presented.Keywords: public policy, problem structuring, qualitative analysis, natural language processing, algorithm, inference extraction
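To make the idea of extracting cause-effect inferences from text more concrete, here is a minimal sketch of pattern-based extraction. It is not the authors' algorithm; the cue phrases and the sample sentence are illustrative only, and a real pipeline would add proper NLP preprocessing.

import re

# Minimal sketch of pattern-based cause-effect extraction from policy text.
# Cue phrases are illustrative; the paper's algorithm and NLP pipeline are richer.
CAUSAL_PATTERNS = [
    r"(?P<cause>[\w\s]+?)\s+(?:leads to|results in|causes)\s+(?P<effect>[\w\s]+)",
    r"(?P<effect>[\w\s]+?)\s+(?:is caused by|results from)\s+(?P<cause>[\w\s]+)",
]

def extract_inferences(text):
    """Return (cause, effect) pairs found by simple cue-phrase matching."""
    pairs = []
    for sentence in re.split(r"[.;]", text):
        for pattern in CAUSAL_PATTERNS:
            m = re.search(pattern, sentence, flags=re.IGNORECASE)
            if m:
                pairs.append((m.group("cause").strip(), m.group("effect").strip()))
    return pairs

sample = "Unemployment leads to poverty. Social unrest is caused by poverty."
print(extract_inferences(sample))
# [('Unemployment', 'poverty'), ('poverty', 'Social unrest')]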
Procedia PDF Downloads 592
27142 Error Analysis of English Inflection among Thai University Students
Authors: Suwaree Yordchim, Toby J. Gibbs
Abstract:
The linguistic competence of Thai university students majoring in Business English was examined with respect to their knowledge of English inflection and other linguistic elements. Error analysis was applied to the test results. Levels of errors in inflection, tense and other linguistic elements were shown to be significantly high for all noun, verb and adjective inflections. Findings suggest that students do not gain linguistic competence in their use of English inflection because of interlanguage interference. Implications for curriculum reform and the treatment of errors in the classroom are discussed. Keywords: interlanguage, error analysis, inflection, second language acquisition, Thai students
Procedia PDF Downloads 470
27141 A Comparation Analysis of Islamic Bank Efficiency in the United Kingdom and Indonesia during Eurozone Crisis Using Data Envelopment Analysis
Authors: Nisful Laila, Fatin Fadhilah Hasib, Puji Sucia Sukmaningrum, Achsania Hendratmi
Abstract:
The purpose of this study is to determine and compare the level of efficiency of Islamic banks in Indonesia and the United Kingdom during the eurozone sovereign debt crisis. This study uses a quantitative non-parametric approach, Data Envelopment Analysis (DEA) under the VRS assumption, together with the Mann-Whitney U test as a statistical tool. The sample comprises 11 Islamic banks in Indonesia and 4 Islamic banks in the United Kingdom. This research used the mediating approach. The input variables consist of total deposits, assets, and the cost of labour; the output variables consist of financing and profit/loss. This study shows that the efficiency of Islamic banks in Indonesia and the United Kingdom varied and fluctuated during the observation period. There is no significant difference in the efficiency performance of Islamic banks in Indonesia and the United Kingdom. Keywords: data envelopment analysis, efficiency, eurozone crisis, islamic bank
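A minimal sketch of the statistical comparison step described above: given DEA efficiency scores for the two groups of banks, the Mann-Whitney U test checks whether the two distributions differ. The scores below are illustrative numbers, not the study's data.

from scipy.stats import mannwhitneyu

# Illustrative DEA-VRS efficiency scores (not the study's data)
indonesia_scores = [0.92, 0.88, 1.00, 0.75, 0.81, 0.95, 0.69, 1.00, 0.84, 0.90, 0.78]
uk_scores = [0.85, 1.00, 0.72, 0.91]

stat, p_value = mannwhitneyu(indonesia_scores, uk_scores, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
# p > 0.05 would be consistent with the finding of no significant difference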
Procedia PDF Downloads 327
27140 Hydroclean Smartbin Solution for Plastic Pollution Crisis
Authors: Anish Bhargava
Abstract:
By 2050, there will be more plastic than fish in our oceans. 51 trillion micro-plastics pollute our waters and contaminate the food on our plates, increasing the risk of tumours and diseases such as cancer. Our product is a solution to the ever-growing problem of plastic pollution. We call it the SmartBin. The SmartBin is a cylindrical device which will float just below the surface of the water, able to move with the aid of 4 water thrusters situated on the sides. As it floats, our SmartBin will suck water into itself and pump it out through the bottom. All waste is collected into a reusable filter including microplastics measuring down to 1.5mm. A speaker emitting sound at a frequency of 9 hertz ensures marine life stays away from the SmartBin. Featured along with our product is a smartphone app which will enable the user to designate an area for the SmartBin to cover on a satellite image. The SmartBin will then return to its start position near the shore, configured through the app. As global pressure to tackle water pollution continues to increase, environmental spending increases too. As our product provides an effective solution to this issue, we can seize the opportunity and scale our company. Our product is unparalleled. It can move at a high speed, covering a wide area rather than being restricted to one position. We target not only oceans and sea-shores, but also rivers, lakes, reservoirs and canals, as they are much easier to access and control.Keywords: water, plastic, pollution, solution, hydroclean, smartbin, cleanup
Procedia PDF Downloads 209
27139 Efficient Wind Fragility Analysis of Concrete Chimney under Stochastic Extreme Wind Incorporating Temperature Effects
Authors: Soumya Bhattacharjya, Avinandan Sahoo, Gaurav Datta
Abstract:
Wind fragility analysis of chimneys is often carried out disregarding the temperature effect. However, the combined effect of wind and temperature is the most critical limit state for chimney design. Hence, in the present paper, an efficient fragility analysis of a concrete chimney is explored under the combined wind and temperature effect. Wind time histories are generated from Davenport's power spectral density function using the weighted amplitude wave superposition technique. Fragility analysis is often carried out in a full Monte Carlo simulation framework, which requires extensive computational time. Thus, in the present paper, an efficient adaptive metamodelling technique is adopted to judiciously approximate the limit state function, which is subsequently used in the simulation framework. This saves substantial computational time and makes the approach computationally efficient. Uncertainty in wind speed, wind-load-related parameters, and resistance-related parameters is considered. The results of the full simulation approach, the conventional metamodelling approach and the proposed adaptive metamodelling approach are compared. The effect of disregarding temperature in wind fragility analysis is highlighted. Keywords: adaptive metamodelling technique, concrete chimney, fragility analysis, stochastic extreme wind load, temperature effect
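A minimal sketch of the wind time-history generation step mentioned above: a fluctuating wind-speed record is synthesized by spectral (weighted amplitude wave) superposition from a Davenport-type along-wind spectrum. All parameter values and the record length are illustrative assumptions, not the paper's inputs.

import numpy as np

# Sketch: v(t) = U10 + sum_i sqrt(2*S(f_i)*df) * cos(2*pi*f_i*t + phi_i)
# with S(f) a Davenport-type spectrum. Parameter values are illustrative only.
U10 = 30.0          # mean wind speed at 10 m [m/s] (assumed)
kappa = 0.005       # surface drag coefficient (assumed)
freqs = np.linspace(0.005, 2.0, 400)      # frequency grid [Hz]
df = freqs[1] - freqs[0]

x = 1200.0 * freqs / U10
S = 4.0 * kappa * U10**2 * x**2 / (freqs * (1.0 + x**2) ** (4.0 / 3.0))  # PSD

rng = np.random.default_rng(1)
phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
t = np.arange(0.0, 600.0, 0.25)           # 10-minute record sampled at 4 Hz
v_fluct = np.sqrt(2.0 * S * df) @ np.cos(2.0 * np.pi * np.outer(freqs, t) + phases[:, None])
v_total = U10 + v_fluct                   # total wind speed time history
print(v_total.shape, v_total.mean())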
Procedia PDF Downloads 216
27138 Traditional Chinese Medicine Treatment for Coronary Heart Disease: a Meta-Analysis
Abstract:
Traditional Chinese medicine has been used in the treatment of coronary heart disease (CHD) for centuries, and in recent years, the research data on the efficacy of traditional Chinese medicine through clinical trials has gradually increased to explore its real efficacy and internal pharmacology. However, due to the complexity of traditional Chinese medicine prescriptions, the efficacy of each component is difficult to clarify, and pharmacological research is challenging. This study aims to systematically review and clarify the clinical efficacy of traditional Chinese medicine in the treatment of coronary heart disease through a meta-analysis. Based on PubMed, CNKI database, Wanfang data, and other databases, eleven randomized controlled trials and 1091 CHD subjects were included. Two researchers conducted a systematic review of the papers and conducted a meta-analysis supporting the positive therapeutic effect of traditional Chinese medicine in the treatment of CHD.Keywords: coronary heart disease, Chinese medicine, treatment, meta-analysis
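To illustrate the pooling step of a meta-analysis like the one described here, the sketch below combines per-trial effect estimates by fixed-effect (inverse-variance) weighting. The effect sizes and standard errors are illustrative placeholders, not the data of the eleven included trials.

import numpy as np

# Minimal sketch of fixed-effect (inverse-variance) pooling of trial effect sizes.
# Effect estimates and standard errors below are illustrative, not the trial data.
effects = np.array([0.35, 0.20, 0.50, 0.10, 0.42])   # e.g. log risk ratios per trial
se = np.array([0.15, 0.10, 0.20, 0.12, 0.18])        # their standard errors

weights = 1.0 / se**2
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.3f} (95% CI {ci_low:.3f} to {ci_high:.3f})")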
Procedia PDF Downloads 128
27137 Hand Motion Tracking as a Human Computer Interaction for People with Cerebral Palsy
Authors: Ana Teixeira, Joao Orvalho
Abstract:
This paper describes experiments using Scratch games, carried out by students of the Master in Human-Computer Interaction (HCI) of IPC Coimbra, to check the feasibility of employing the gestures of users with cerebral palsy as an alternative way of interacting with a computer. The main focus of this work is to study the usability of a web camera as a motion tracking device to achieve virtual human-computer interaction by individuals with CP. An approach to human-computer interaction is presented in which individuals with cerebral palsy react to and interact with a Scratch game through the use of a webcam as an external interaction device. Motion tracking interaction is an emerging technology that is becoming more useful, effective and affordable. However, it raises new questions from the HCI viewpoint, for example, which environments are most suitable for interaction by users with disabilities. In our case, we put emphasis on the accessibility and usability aspects of such interaction devices to meet the special needs of people with disabilities, and specifically people with CP. Although our work has just started, preliminary results show that, in general, computer vision interaction systems are very useful; in some cases, these systems are the only way by which some people can interact with a computer. The purpose of the experiments was to verify two hypotheses: 1) people with cerebral palsy can interact with a computer using their natural gestures, and 2) Scratch games can be a research tool in experiments with disabled young people. A game in Scratch with three levels was created to be played through the use of a webcam. This device permits the detection of certain key points of the user's body, which allows the head, arms and especially the hands to be taken as the most important aspects of recognition. Tests with 5 individuals of different age and gender were carried out over 3 days in sessions of 30 minutes with each participant. For a more extensive and reliable statistical analysis, the number of both participants and repetitions should be increased in further investigations. However, already at this stage of research, it is possible to draw some conclusions. The first, and most important, is that simple Scratch games on the computer can be a research tool for investigating the interaction with a computer performed by young persons with CP using intentional gestures; measurements performed with the assistance of games are attractive for young disabled users. The second important conclusion is that they are able to play Scratch games using their gestures. Therefore, the proposed interaction method is promising for them as a human-computer interface. In the future, we plan to develop multimodal interfaces that combine various computer vision devices with other input devices, improve the existing systems to better accommodate the special needs of individuals, and perform experiments on a larger number of participants. Keywords: motion tracking, cerebral palsy, rehabilitation, HCI
Procedia PDF Downloads 237
27136 An Analysis of Telugu Proverbs in the Light of Endangerment
Abstract:
The main goal of this paper is to reflect on the overwhelmingly rich folklore of the Telugu people through their proverbs, which are assumed to be in a state of endangerment. In order to support the claim that Telugu proverbs are endangered, we had to delve deeper. We found hardly two or three papers related to Telugu proverbs. So, although the process of sorting out the different Telugu proverbs, translating them, and so on was wearying, we found it necessary to conduct a survey in the form of a questionnaire and draw conclusions so that we could bring this issue to the readers' attention. We began with the basic assumption that the older generation may have a wider knowledge of their folklore than the younger generation. The results obtained are quite remarkable and strengthened our assumptions. Statistical analysis was adopted for the quantitative analysis. Through this paper, we hope to kindle cultural awareness among youngsters regarding the use of one's own mother tongue. Keywords: sociolinguistics, Telugu proverbs, folklore, endangerment
Procedia PDF Downloads 216
27135 Analysis of Vibratory Signals Based on Local Mean Decomposition (LMD) for Rolling Bearing Fault Diagnosis
Authors: Toufik Bensana, Medkour Mihoub, Slimane Mekhilef
Abstract:
The use of vibration analysis has been established as the most common and reliable method of analysis in the field of condition monitoring and diagnostics of rotating machinery. Rolling bearings are used across a broad range of rotary machines and play a crucial role in the modern manufacturing industry. Unfortunately, the vibration signals collected from a faulty bearing are generally nonstationary, nonlinear and contaminated by strong noise, so it is essential to extract the fault features correctly. In this paper, a novel numerical analysis method based on local mean decomposition (LMD) is proposed. LMD decomposes the signal into a series of product functions (PFs), each of which is the product of an envelope signal and a purely frequency-modulated (FM) signal. The envelope of a PF is the instantaneous amplitude (IA), and the derivative of the unwrapped phase of the purely frequency-modulated signal is the instantaneous frequency (IF). After that, the fault characteristic frequency of the roller bearing can be extracted by performing spectrum analysis on the instantaneous amplitude of the PF component containing the dominant fault information. The results show the effectiveness of the proposed technique in fault detection and diagnosis of rolling element bearings. Keywords: fault diagnosis, rolling element bearing, local mean decomposition, condition monitoring
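A minimal sketch (not the authors' code) of the final diagnosis step described above: taking the spectrum of the instantaneous amplitude of the dominant PF to reveal the fault characteristic frequency. The LMD sifting itself is omitted; a synthetic amplitude-modulated PF is used, and its envelope is approximated with a Hilbert transform as a stand-in for the IA that LMD produces directly. Frequencies and sampling rate are illustrative assumptions.

import numpy as np
from scipy.signal import hilbert

# Sketch: envelope (instantaneous amplitude) spectrum of a synthetic PF component.
fs = 12000.0                          # sampling rate [Hz] (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
f_fault, f_resonance = 105.0, 3000.0  # illustrative fault and resonance frequencies
pf = (1.0 + 0.6 * np.cos(2 * np.pi * f_fault * t)) * np.cos(2 * np.pi * f_resonance * t)

ia = np.abs(hilbert(pf))              # instantaneous amplitude (envelope), approximated
spectrum = np.abs(np.fft.rfft(ia - ia.mean()))
freqs = np.fft.rfftfreq(ia.size, 1.0 / fs)
print("dominant envelope frequency ~", freqs[np.argmax(spectrum)], "Hz")  # ≈ f_fault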
Procedia PDF Downloads 392
27134 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model
Authors: Chaudhuri Manoj Kumar Swain, Susmita Das
Abstract:
This paper explores a detailed procedure of predicting a path loss (PL) model and its application in estimating the coverage probability in a WiMAX network. For this a hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis, and regression analysis are the phases of operations incorporated in this approach and the importance of each of these phases has been discussed properly. The procedure of collecting data such as received signal strength indicator (RSSI) through experimental set up is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with regression technique. Furthermore, with the aid of the predicted PL model, essential parameters such as PL exponent as well as the coverage probability of the network are evaluated. This research work may assist in the process of deployment and optimisation of any cellular network significantly.Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis
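To make the regression step concrete, the sketch below fits a log-distance path loss model, PL(d) = PL(d0) + 10·n·log10(d/d0), to measured RSSI by linear regression and recovers the path loss exponent n. The distances, RSSI values and transmit power are illustrative assumptions, not the paper's measurements.

import numpy as np

# Sketch of fitting a log-distance path loss model by regression on measured RSSI.
d0 = 100.0                                            # reference distance [m] (assumed)
d = np.array([100, 200, 400, 600, 800, 1000, 1500])   # Tx-Rx distances [m] (illustrative)
rssi = np.array([-62, -68, -75, -79, -82, -85, -90])  # measured RSSI [dBm] (illustrative)

tx_power = 43.0                                       # assumed effective transmit power [dBm]
pl_measured = tx_power - rssi                         # path loss [dB]

X = 10.0 * np.log10(d / d0)
n, pl_d0 = np.polyfit(X, pl_measured, 1)              # slope = path loss exponent n
print(f"path loss exponent n = {n:.2f}, PL(d0) = {pl_d0:.1f} dB")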
Procedia PDF Downloads 181
27133 Keyword Network Analysis on the Research Trends of Life-Long Education for People with Disabilities in Korea
Authors: Jakyoung Kim, Sungwook Jang
Abstract:
The purpose of this study is to examine the research trends of life-long education for people with disabilities using keyword network analysis. For this purpose, 151 papers were selected from 594 papers retrieved from the Korean Education and Research Information Service using keywords such as 'people with disabilities' and 'life-long education'. The keyword network was constructed by extracting and coding the keywords used in the titles of the selected papers. The frequency of the extracted keywords, their degree centrality, and their betweenness centrality were analyzed from the keyword network. The results of the keyword network analysis are as follows. First, the main keywords that appeared frequently in studies of life-long education for people with disabilities were 'people with disabilities', 'life-long education', 'developmental disabilities', 'current situations', and 'development'; research trends in life-long education for people with disabilities are focused on the current status of life-long education and on program development. Second, the keyword network analysis and visualization showed that keywords with a high frequency of occurrence also generally have high degree centrality and betweenness centrality. The keyword network diagram confirmed that research trends in life-long education for people with disabilities are centered on six prominent keywords. Based on these results, it is argued that life-long education for people with disabilities needs to expand its subjects and supporting areas in the future, and that research needs to be further extended into more detailed and specific areas. Keywords: life-long education, people with disabilities, research trends, keyword network analysis
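A minimal sketch of a keyword co-occurrence network with degree and betweenness centrality, using the networkx library (an assumed tool, not named in the paper). Each paper contributes edges between the keywords appearing together in its title; the keyword lists below are illustrative, not the coded dataset.

import itertools
import networkx as nx

# Sketch of a keyword co-occurrence network and its centrality measures.
papers = [
    ["people with disabilities", "life-long education", "current situations"],
    ["people with disabilities", "life-long education", "development"],
    ["developmental disabilities", "life-long education", "development"],
]

G = nx.Graph()
for kw in papers:
    for a, b in itertools.combinations(kw, 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
for node in G.nodes:
    print(f"{node}: degree={degree[node]:.2f}, betweenness={betweenness[node]:.2f}")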
Procedia PDF Downloads 341
27132 Chemical, Biochemical and Sensory Evaluation of a Quadrimix Complementary Food Developed from Sorghum, Groundnut, Crayfish and Pawpaw Blends
Authors: Ogechi Nzeagwu, Assumpta Osuagwu, Charlse Nkwoala
Abstract:
Malnutrition in infants due to poverty, poor feeding practices, and high cost of commercial complementary foods among others is a concern in developing countries. The study evaluated the proximate, vitamin and mineral compositions, antinutrients and functional properties, biochemical, haematological and sensory evaluation of complementary food made from sorghum, groundnut, crayfish and paw-paw flour blends using standard procedures. The blends were formulated on protein requirement of infants (18 g/day) using Nutrisurvey linear programming software in ratio of sorghum(S), groundnut(G), crayfish(C) and pawpaw(P) flours as 50:25:10:15(SGCP1), 60:20:10:10 (SGCP2), 60:15:15:10 (SGCP3) and 60:10:20:10 (SGCP4). Plain-pap (fermented maize flour)(TCF) and cerelac (commercial complementary food) served as basal and control diets. Thirty weanling male albino rats aged 28-35 days weighing 33-60 g were purchased and used for the study. The rats after acclimatization were fed with gruel produced with the experimental diets and the control with water ad libitum daily for 35days. Effect of the blends on lipid profile, blood glucose, haematological (RBC, HB, PCV, MCV), liver and kidney function and weight gain of the rats were assessed. Acceptability of the gruel was conducted at the end of rat feeding on forty mothers of infants’ ≥ 6 months who gave their informed consent to participate using a 9 point hedonic scale. Data was analyzed for means and standard deviation, analysis of variance and means were separated using Duncan multiple range test and significance judged at 0.05, all using SPSS version 22.0. The results indicated that crude protein, fibre, ash and carbohydrate of the formulated diets were either comparable or higher than values in cerelac. The formulated diets (SGCP1- SGCP4) were significantly (P>0.05) higher in vitamin A and thiamin compared to cerelac. The iron content of the formulated diets SGCP1- SGCP4 (4.23-6.36 mg/100) were within the recommended iron intake of infants (0.55 mg/day). Phytate (1.56-2.55 mg/100g) and oxalate (0.23-0.35 mg/100g) contents of the formulated diets were within the permissible limits of 0-5%. In functional properties, bulk density, swelling index, % dispersibility and water absorption capacity significantly (P<0.05) increased and compared favourably with cerelac. The essential amino acids of the formulated blends were within the amino acid profile of the FAO/WHO/UNU reference protein for children 0.5 -2 years of age. Urea concentration of rats fed with SGCP1-SGCP4 (19.48 mmol/L),(23.76 mmol/L),(24.07 mmol/L),(23.65 mmol/L) respectively was significantly higher than that of rat fed cerelac (16.98 mmol/L); however, plain pap had the least value (9.15 mmol/L). Rats fed with SGCP1-SGCP4 (116 mg/dl), (119 mg/dl), (115 mg/dl), (117 mg/dl) respectively had significantly higher glucose levels those fed with cerelac (108 mg/dl). Liver function parameters (AST, ALP and ALT), lipid profile (triglyceride, HDL, LDL, VLDL) and hematological parameters of rats fed with formulated diets were within normal range. Rats fed SGCP1 gained more weight (90.45 g) than other rats fed with SGCP2-SGCP4 (71.65 g, 79.76 g, 75.68 g), TCF (20.13 g) and cerelac (59.06 g). In all the sensory attributes, the control was preferred with respect to the formulated diets. The formulated diets were generally adequate and may likely have potentials to meet nutrient requirements of infants as complementary food.Keywords: biochemical, chemical evaluation, complementary food, quadrimix
Procedia PDF Downloads 175
27131 A Technique for Image Segmentation Using K-Means Clustering Classification
Authors: Sadia Basar, Naila Habib, Awais Adnan
Abstract:
The paper presents a technique for image segmentation using K-means clustering classification. Previously presented algorithms were application-specific; however, they missed neighboring information and required high-speed computing machines to run the segmentation. Clustering is the process of partitioning a group of data points into a small number of clusters. The proposed method is a content-aware, feature-extraction-based method that is able to run on low-end machines; it is a simple algorithm, requires only low-quality streaming, is efficient, and can be used for security purposes. It has the capability to highlight the boundary and the object. First, the user enters the data as the input representation. In the next step, the digital image is partitioned into groups of clusters. The clusters are divided into many regions: items of the same category with the same features are assembled within one group, and different clusters are placed in other groups. Finally, the clusters are combined with respect to similar features and then represented in the form of segments. The clustered image gives a clear representation of the digital image, highlighting its regions and boundaries. At last, the final image is presented in the form of segments, with all colors of the image separated into clusters. Keywords: clustering, image segmentation, K-means function, local and global minimum, region
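A minimal sketch of colour-based K-means segmentation: every pixel is clustered by its RGB value and replaced by its cluster centre. The libraries (scikit-learn, Pillow), the file name "input.jpg" and the choice of K are assumptions for illustration; the content-aware features described in the paper are not reproduced.

import numpy as np
from sklearn.cluster import KMeans
from PIL import Image

# Sketch of K-means colour segmentation of an image (file name is hypothetical).
img = np.asarray(Image.open("input.jpg").convert("RGB"), dtype=np.float32)
h, w, _ = img.shape

k = 4                                     # number of clusters/segments (assumed)
pixels = img.reshape(-1, 3)
kmeans = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)

# Replace each pixel with its cluster centre to form the segmented image
segmented = kmeans.cluster_centers_[kmeans.labels_].reshape(h, w, 3).astype(np.uint8)
Image.fromarray(segmented).save("segmented.jpg")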
Procedia PDF Downloads 379
27130 Technological Measures to Reduce the Environmental Impact of Swimming Pools
Authors: Fátima Farinha, Miguel J. Oliveira, Gina Matias, Armando Inverno, Jânio Monteiro, Cristiano Cabrita
Abstract:
In the last decades, the construction of swimming pools for recreational activities has grown exponentially in southern Europe. Swimming pools are used both for private use in villas and for collective use in hotels or condominiums. However, they have a high environmental impact, mainly in terms of water and energy consumption, being used for a short period of time, depending significantly on favorable atmospheric conditions. Contrary to what would be expected, not enough research has been conducted to reduce the negative impact of this equipment. In this context, this work proposes and analyses technological measures to reduce the environmental impacts of swimming pools, such as thermal insulation of the tank, water balance in order to detect leaks and optimize the backwash process, integration of renewable energy generation, and a smart control system that meets the requirements of the user. The work was developed within the scope of the Ecopool+++ project, which aims to create innovative heated pools with reduced thermal losses and integration of SMART energy plus water management systems. The project is in the final phase of its development, with very encouraging results.Keywords: swimming pools, sustainability, thermal losses, water management system
Procedia PDF Downloads 108
27129 Analysis of Bored Piles with and without Geogrid in a Selected Area in Kocaeli/Turkey
Authors: Utkan Mutman, Cihan Dirlik
Abstract:
At a selected wastewater treatment site in the Kocaeli district of Turkey, bored piling was carried out in order to improve the ground under the aeration basin. In this study, the degree of ground improvement achieved after the bored piling carried out in the field was investigated. In this context, the ground conditions before and after the improvement were examined, and solution values were obtained by finite element analysis using the Plaxis program. The behaviour of the aeration basin, which supports the treatment process, is influenced by whether or not geogrid is used on the ground. On the improved ground, pile continuity and pile load tests were performed in order to check the constructed bored piles. Taking into consideration both the field data and the dynamic loads on the aeration basin, an analysis was carried out in the Plaxis program, and the data obtained from the analysis were compared with the data obtained in the field. Keywords: geogrid, bored pile, soil improvement, plaxis
Procedia PDF Downloads 269
27128 Detecting Model Financial Statement Fraud by Auditor Industry Specialization with Fraud Triangle Analysis
Authors: Reskino Resky
Abstract:
This research aims to create a model for detecting financial statement fraud. It examines the relationship between the fraud triangle variables, auditor industry specialization, and financial statement fraud. The sample consists of companies listed on the Indonesian Stock Exchange that received sanctions or were involved in cases handled by the Financial Services Authority in 2011-2013, comprising 30 fraud companies and 30 non-fraud companies. The sample was determined by purposive sampling with judgement sampling, while the data processing methods used were the Mann-Whitney U test and discriminant analysis. Two of the five variables could be processed with discriminant analysis. The results show that financial targets can detect financial statement fraud, while financial stability cannot. Keywords: fraud triangle analysis, financial targets, financial stability, auditor industry specialization, financial statement fraud
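A minimal sketch of the discriminant-analysis step: classifying firms as fraud or non-fraud from two numeric proxies. The two feature columns merely stand in for variables such as "financial targets" and "financial stability", and all numbers are synthetic, not the study's data; scikit-learn is an assumed tool.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Sketch: linear discriminant analysis on synthetic fraud / non-fraud samples.
rng = np.random.default_rng(0)
fraud = np.column_stack([rng.normal(0.12, 0.03, 30), rng.normal(0.05, 0.04, 30)])
clean = np.column_stack([rng.normal(0.07, 0.03, 30), rng.normal(0.06, 0.04, 30)])

X = np.vstack([fraud, clean])
y = np.array([1] * 30 + [0] * 30)          # 1 = fraud company, 0 = non-fraud company

lda = LinearDiscriminantAnalysis().fit(X, y)
print("discriminant coefficients:", lda.coef_)
print("training accuracy:", lda.score(X, y))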
Procedia PDF Downloads 463
27127 Automatic Segmentation of Lung Pleura Based On Curvature Analysis
Authors: Sasidhar B., Bhaskar Rao N., Ramesh Babu D. R., Ravi Shankar M.
Abstract:
Segmentation of lung pleura is a preprocessing step in Computer-Aided Diagnosis (CAD) which helps in reducing false positives in detection of lung cancer. The existing methods fail in extraction of lung regions with the nodules at the pleura of the lungs. In this paper, a new method is proposed which segments lung regions with nodules at the pleura of the lungs based on curvature analysis and morphological operators. The proposed algorithm is tested on 06 patient’s dataset which consists of 60 images of Lung Image Database Consortium (LIDC) and the results are found to be satisfactory with 98.3% average overlap measure (AΩ).Keywords: curvature analysis, image segmentation, morphological operators, thresholding
Procedia PDF Downloads 598
27126 The Role of Data Protection Officer in Managing Individual Data: Issues and Challenges
Authors: Nazura Abdul Manap, Siti Nur Farah Atiqah Salleh
Abstract:
For decades, the misuse of personal data has been a critical issue. Malaysia has accepted responsibility by implementing the Malaysian Personal Data Protection Act 2010 to secure personal data (PDPA 2010). After more than a decade, this legislation is set to be revised by the current PDPA 2023 Amendment Bill to align with the world's key personal data protection regulations, such as the European Union General Data Protection Regulations (GDPR). Among the other suggested adjustments is the Data User's appointment of a Data Protection Officer (DPO) to ensure the commercial entity's compliance with the PDPA 2010 criteria. The change is expected to be enacted in parliament fairly soon; nevertheless, based on the experience of the Personal Data Protection Department (PDPD) in implementing the Act, it is projected that there will be a slew of additional concerns associated with the DPO mandate. Consequently, the goal of this article is to highlight the issues that the DPO will encounter and how the Personal Data Protection Department should respond to this subject. The study result was produced using a qualitative technique based on an examination of the current literature. This research reveals that there are probable obstacles experienced by the DPO, and thus, there should be a definite, clear guideline in place to aid DPO in executing their tasks. It is argued that appointing a DPO is a wise measure in ensuring that the legal data security requirements are met.Keywords: guideline, law, data protection officer, personal data
Procedia PDF Downloads 79
27125 Sustainability of High-Rise Affordable Housing: Critical Issues in Applying Green Building Rating Tools
Authors: Poh Im. Lim, Hillary Yee Qin. Tan
Abstract:
Nowadays, going green has become a trend and is being emphasized in the construction industry. In Malaysia, several green rating tools are available in the industry, and among these, GBI and GreenRE are considered the most common tools adopted for residential buildings. However, being green does not in itself make something sustainable. Being sustainable means taking economic, environmental and social aspects into consideration. This is particularly essential in the affordable housing sector, as the end-users belong to lower-income groups and place importance on many socio-economic needs beyond the environmental criteria. This paper discusses the arguments for proposing a sustainability framework that is tailor-made for high-rise affordable housing. In-depth interviews and observation mapping methods were used to gather inputs from the end-users, non-governmental organisations (NGOs) and professionals. A 'bottom-up' approach was applied in this research to show the significance of participation by the local community in the decision-making process. The proposed sustainability framework illustrates the discrepancies between user priorities and what the industry is providing. The outcome of this research suggests that integrating sustainability into high-rise affordable housing is achievable and beneficial to the industry, society, and the environment. Keywords: green building rating tools, high-rise affordable housing, sustainability framework, sustainable development
Procedia PDF Downloads 143
27124 Data Augmentation for Automatic Graphical User Interface Generation Based on Generative Adversarial Network
Authors: Xulu Yao, Moi Hoon Yap, Yanlong Zhang
Abstract:
As a branch of artificial neural networks, deep learning is widely used in the field of image recognition, but a lack of data leads to imperfect model learning. By analysing the data scale requirements of deep learning and aiming at the application of GUI generation, it is found that collecting a GUI dataset is a time-consuming and labor-intensive project that struggles to meet the needs of current deep learning networks. To solve this problem, this paper proposes a semi-supervised deep learning model that relies on the original small-scale dataset to produce a large amount of reliable data. By combining a recurrent neural network with a generative adversarial network, the recurrent neural network can learn the sequence relationships and characteristics of the data, allowing the generative adversarial network to generate reasonable data and thereby expand the Rico dataset. Relying on this network structure, the characteristics of the collected data can be well analysed, and a large amount of reasonable data can be generated according to these characteristics. After data processing, a reliable dataset for model training can be formed, which alleviates the problem of dataset shortage in deep learning. Keywords: GUI, deep learning, GAN, data augmentation
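A compact sketch of the adversarial training idea behind GAN-based data augmentation: a generator learns to produce samples resembling a small real dataset, after which it can emit additional synthetic samples. PyTorch is an assumed framework; the toy vector data, layer sizes and training length are illustrative, and the paper's RNN-based GUI-sequence model is not reproduced here.

import torch
from torch import nn

# Minimal GAN sketch for data augmentation on toy feature vectors.
torch.manual_seed(0)
real_data = torch.randn(500, 16) * 0.5 + 1.0        # stand-in for the small real dataset
noise_dim, data_dim = 8, 16

G = nn.Sequential(nn.Linear(noise_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = real_data[torch.randint(0, 500, (64,))]
    fake = G(torch.randn(64, noise_dim))
    # discriminator update: real -> 1, fake -> 0
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # generator update: try to make the discriminator label fakes as real
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

augmented = G(torch.randn(1000, noise_dim)).detach()  # synthetic samples to extend the dataset
print(augmented.shape, augmented.mean().item())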
Procedia PDF Downloads 187
27123 Using Emerging Hot Spot Analysis to Analyze Overall Effectiveness of Policing Policy and Strategy in Chicago
Authors: Tyler Gill, Sophia Daniels
Abstract:
The paper examines how accounting for the spatial-temporal constraints of data can help inform policymakers and law enforcement officials. The authors utilize Chicago crime data from 2006-2016 to demonstrate that the Emerging Hot Spot Analysis tool is an ideal hot spot clustering approach for analyzing crime data. Traditional approaches include density maps or creating a spatial weights matrix to include the spatial-temporal constraints. This new approach utilizes a space-time implementation of the Getis-Ord Gi* statistic to visualize the data more quickly and support better decisions. The research helps complement socio-cultural research in finding key patterns that can frame future policies and evaluate the implementation of prior strategies. Through this analysis, homicide trends and patterns are found more effectively, and recommendations for use by non-traditional users of GIS are offered for real-life implementation. Keywords: crime mapping, emerging hot spot analysis, Getis-Ord Gi*, spatial-temporal analysis
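A minimal sketch of the Getis-Ord Gi* statistic that underlies the tool, computed on a toy set of locations with incident counts and a simple fixed-distance binary weighting. The coordinates, counts and distance threshold are illustrative; the emerging hot spot tool applies the same statistic within a space-time cube rather than a single snapshot.

import numpy as np

# Sketch of the Getis-Ord Gi* statistic on toy crime counts.
coords = np.array([[0, 0], [0, 1], [1, 0], [1, 1], [5, 5], [5, 6], [6, 5], [6, 6]], float)
counts = np.array([2, 3, 2, 4, 15, 18, 14, 20], float)       # incidents per location
n = counts.size

dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
W = (dist <= 1.5).astype(float)                               # binary weights, incl. self

x_bar = counts.mean()
S = np.sqrt((counts**2).mean() - x_bar**2)

gi_star = np.empty(n)
for i in range(n):
    wi = W[i]
    num = wi @ counts - x_bar * wi.sum()
    den = S * np.sqrt((n * (wi**2).sum() - wi.sum() ** 2) / (n - 1))
    gi_star[i] = num / den                                    # z-score: large positive = hot spot

print(np.round(gi_star, 2))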
Procedia PDF Downloads 247
27122 The Potential Benefits of Multimedia Information Representation in Enhancing Students’ Critical Thinking and History Reasoning
Authors: Ang Ling Weay, Mona Masood
Abstract:
This paper discusses the potential benefits of interactive multimedia information representation in enhancing critical thinking, aligned with history reasoning, among secondary school students learning history in Malaysia. Two modes of multimedia information representation were implemented: chronological and thematic information representation. A qualitative study using unstructured interviews was conducted with two history teachers, one history education lecturer, two i-Think experts and programme trainers, and five Form 4 secondary school students. The interviews were used to elicit their opinions on the implementation of thinking maps and interactive multimedia information representation in history learning. The key elements of interactive multimedia (e.g., multiple media, user control, interactivity, and the use of timelines and concept maps) were then considered to improve the learning process. Findings of the preliminary investigation reveal that interactive multimedia information representations have potential benefits as an instructional resource for enhancing students’ higher-order thinking skills (HOTS). This paper concludes by giving suggestions for future work. Keywords: multimedia information representation, critical thinking, history reasoning, chronological and thematic information representation
Procedia PDF Downloads 352
27121 Using Risk Management Indicators in Decision Tree Analysis
Authors: Adel Ali Elshaibani
Abstract:
Risk management indicators augment the reporting infrastructure, particularly for the board and senior management, to identify, monitor, and manage risks. This enhancement facilitates improved decision-making throughout the banking organization. Decision tree analysis is a tool that visually outlines potential outcomes, costs, and consequences of complex decisions. It is particularly beneficial for analyzing quantitative data and making decisions based on numerical values. By calculating the expected value of each outcome, decision tree analysis can help assess the best course of action. In the context of banking, decision tree analysis can assist lenders in evaluating a customer’s creditworthiness, thereby preventing losses. However, applying these tools in developing countries may face several limitations, such as data availability, lack of technological infrastructure and resources, lack of skilled professionals, cultural factors, and cost. Moreover, decision trees can create overly complex models that do not generalize well to new data, known as overfitting. They can also be sensitive to small changes in the data, which can result in different tree structures and can become computationally expensive when dealing with large datasets. In conclusion, while risk management indicators and decision tree analysis are beneficial for decision-making in banks, their effectiveness is contingent upon how they are implemented and utilized by the board of directors, especially in the context of developing countries. It’s important to consider these limitations when planning to implement these tools in developing countries.Keywords: risk management indicators, decision tree analysis, developing countries, board of directors, bank performance, risk management strategy, banking institutions
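To illustrate the expected-value calculation that decision tree analysis relies on, the sketch below evaluates a simple lending decision. The probabilities and cash flows are hypothetical, chosen only to show how each alternative's expected value is computed and compared.

# Sketch of expected-value analysis on a simple decision tree for a lending decision.
alternatives = {
    "approve loan": [
        {"outcome": "repaid in full", "probability": 0.90, "payoff": 12_000},
        {"outcome": "default",        "probability": 0.10, "payoff": -80_000},
    ],
    "reject loan": [
        {"outcome": "no exposure",    "probability": 1.00, "payoff": 0},
    ],
}

def expected_value(branches):
    return sum(b["probability"] * b["payoff"] for b in branches)

for name, branches in alternatives.items():
    print(f"{name}: expected value = {expected_value(branches):,.0f}")

best = max(alternatives, key=lambda k: expected_value(alternatives[k]))
print("best alternative by expected value:", best)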
Procedia PDF Downloads 63
27120 Factors Influencing Site Overhead Cost of Construction Projects in Egypt: A Comparative Analysis
Authors: Aya Effat, Ossama A. Hosny, Elkhayam M. Dorra
Abstract:
Estimating costs is a crucial step in construction management and should be completed at the beginning of every project to establish the project's budget. The precision of the cost estimate plays a significant role in the success of construction projects as it allows project managers to effectively manage the project's costs. Site overhead costs constitute a significant portion of construction project budgets, necessitating accurate prediction and management. These costs are influenced by a multitude of factors, requiring a thorough examination and analysis to understand their relative importance and impact. Thus, the main aim of this research is to enhance the contractor’s ability to predict and manage site overheads by identifying and analyzing the main factors influencing the site overheads costs in the Egyptian construction industry. Through a comprehensive literature review, key factors were first identified and subsequently validated using a thorough comparative analysis of data from 55 real-life construction projects. Through this comparative analysis, the relationship between each factor and site overheads percentage as well as each site overheads subcategory and each project construction phase was identified and examined. Furthermore, correlation analysis was done to check for multicollinearity and identify factors with the highest impact. The findings of this research offer valuable insights into the key drivers of site overhead costs in the Egyptian construction industry. By understanding these factors, construction professionals can make informed decisions regarding the estimation and management of site overhead costs.Keywords: comparative analysis, cost estimation, construction management, site overheads
Procedia PDF Downloads 27
27119 RAPD Analysis of Genetic Diversity of Castor Bean
Authors: M. Vivodík, Ž. Balážová, Z. Gálová
Abstract:
The aim of this work was to detect genetic variability among a set of 40 castor genotypes using 8 RAPD markers. Amplification of the genomic DNA of the 40 genotypes using RAPD analysis yielded 66 fragments, with an average of 8.25 polymorphic fragments per primer. The number of amplified fragments ranged from 3 to 13, with amplicon sizes ranging from 100 to 1200 bp. Polymorphic information content (PIC) values ranged from 0.556 to 0.895, with an average of 0.784, and diversity index (DI) values ranged from 0.621 to 0.896, with an average of 0.798. A dendrogram based on hierarchical cluster analysis using the UPGMA algorithm was prepared; the analyzed genotypes were grouped into two main clusters, and only two genotypes could not be distinguished. Knowledge of the genetic diversity of castor can be used in future breeding programs for increased oil production for industrial uses. Keywords: dendrogram, polymorphism, RAPD technique, Ricinus communis L.
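For readers unfamiliar with the two marker indices reported above, the sketch below computes them under commonly used conventions (gene diversity DI = 1 − Σp², and the Botstein form of PIC). The band frequencies are illustrative, and the paper does not state which exact formula it applied, so this is only one plausible reading.

import numpy as np

# Sketch of two marker-informativeness indices under common conventions.
def diversity_index(p):
    """Gene diversity / DI = 1 - sum(p_i^2)."""
    p = np.asarray(p, float)
    return 1.0 - np.sum(p**2)

def pic(p):
    """Polymorphic information content (Botstein et al. form)."""
    p = np.asarray(p, float)
    s = np.sum(p**2)
    cross = sum(2 * p[i]**2 * p[j]**2 for i in range(len(p)) for j in range(i + 1, len(p)))
    return 1.0 - s - cross

freqs = [0.40, 0.30, 0.20, 0.10]          # illustrative band frequencies at one locus
print(f"DI  = {diversity_index(freqs):.3f}")
print(f"PIC = {pic(freqs):.3f}")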
Procedia PDF Downloads 475
27118 User-Friendly Task Creation Using a CAD Integrated Robotic System on a Real Workcell
Authors: Alireza Changizi, Arash Rezaei, Jamal Muhammad, Jyrki Latokartano, Minna Lanz
Abstract:
Offline programming (OLP) is a relatively new, simulation-based method of robot programming that is now widely used in industry: it produces the robot motion code from a virtual world built in simulation software. In this project, Delmia V5 is used as the simulation software. First, the work cell components were modelled in Catia V5, imported into a process file in Delmia, and placed roughly to form the virtual work cell. Then the robot was added to the work cell from the Delmia library. The work cell was calibrated against the real-world work cell to obtain accurate code. Tool calibration is the first step of the calibration scheme; the work cell equipment can then be calibrated using the 6-point calibration method. Finally, the generated code needs to be post-processed to match the instruction set of the related controller. At the last stage, the I/O signals were set to accomplish robot cooperation and synchronize their motion. The pros and cons are also discussed; the presented results show the feasibility of the method and its effect on production line efficiency. Finally, the positive and negative points of the implementation are discussed. Keywords: robotic, automated, production, offline programming, CAD
Procedia PDF Downloads 388
27117 Linking Excellence in Biomedical Knowledge and Computational Intelligence Research for Personalized Management of Cardiovascular Diseases within Personal Health Care
Authors: T. Rocha, P. Carvalho, S. Paredes, J. Henriques, A. Bianchi, V. Traver, A. Martinez
Abstract:
The main goal of the LiNK project is to join competences in intelligent processing in order to create a research ecosystem that addresses two central scientific and technical challenges for personal health care (PHC) deployment: i) how to merge clinical evidence knowledge into computational decision support systems for PHC management and ii) how to achieve personalized services, i.e., solutions adapted to specific user needs and characteristics. The final goal of one of the work packages (WP2), designated Sustainable Linking and Synergies for Excellence, is the definition, implementation and coordination of the activities necessary to create and strengthen durable links between the LiNK partners. This work focuses on the strategy that has been followed to define the Research Tracks (RTs), which will support a set of actions to be pursued throughout the LiNK project. These include common research activities, knowledge transfer among the researchers of the consortium, and PhD student and post-doc co-advisement. Moreover, the RTs will establish the basis for the definition of concepts and their evolution into project proposals. Keywords: LiNK Twin European Project, personal health care, cardiovascular diseases, research tracks
Procedia PDF Downloads 217
27116 Genre Analysis of Postgraduate Theses and Dissertations: Case of Statement of the Problem
Authors: H. Mashhady, H. A. Manzoori, M. Doosti, M. Fatollahi
Abstract:
This study reports descriptive research in the form of a genre analysis of postgraduates' theses and dissertations at three Iranian universities: Ferdowsi, Tehran, and Tarbiat Modares. The researchers sought to depict the generic structure of the “statement of the problem” section of PhD dissertations and MA theses. Moreover, the researchers wanted to find any probable variation based on the year to which the dissertations belonged, to see whether genre consciousness had developed among Iranian postgraduates. To obtain data, the “statement of the problem” sections of 90 PhD dissertations and MA theses in Teaching English as a Foreign Language (TEFL) from 2001 to 2013 at the above-mentioned universities were selected. Frequency counts were employed as the quantitative method of data analysis, while genre analysis was used as the qualitative method. Inter-rater reliability was found to be about 0.93. Results revealed that students at different degree levels at each of these universities used various generic structures for writing the “statement of the problem”. Moreover, a comparison of different time periods (2001-2006 and 2007-2013) revealed that postgraduates in the second period, regardless of their degree and university, employed more similar generic structures, which can be optimistically attributed to a general rise in genre awareness. Keywords: genre, genre analysis, Ph.D. and MA dissertations, statement of the problem, generic structure
Procedia PDF Downloads 670