Search results for: data source
26498 Exploring the Intersection Between the General Data Protection Regulation and the Artificial Intelligence Act
Authors: Maria Jędrzejczak, Patryk Pieniążek
Abstract:
The European legal reality is on the eve of significant change. In European Union law, there is talk of a “fourth industrial revolution”, driven by massive data resources linked to powerful algorithms and powerful computing capacity. This is closely linked to technological developments in artificial intelligence, which has prompted analysis covering the legal environment as well as the economic and social impact, also from an ethical perspective. The discussion on the regulation of artificial intelligence is one of the most serious currently held at both European Union and Member State level. The literature expects legal solutions to guarantee security for fundamental rights, including privacy, in artificial intelligence systems. There is no doubt that personal data have been increasingly processed in recent years, and artificial intelligence could not function without processing large amounts of data (both personal and non-personal). The main driving forces behind the current development of artificial intelligence are advances in computing and the increasing availability of data. High-quality data are crucial to the effectiveness of many artificial intelligence systems, particularly those using techniques involving model training. The use of computers and artificial intelligence technology increases the speed and efficiency of the actions taken, but also creates security risks of unprecedented magnitude for the data processed. The proposed regulation in the field of artificial intelligence therefore requires analysis in terms of its impact on the regulation on personal data protection. It is necessary to determine the mutual relationship between these regulations and which areas of the personal data protection regulation are particularly important for processing personal data in artificial intelligence systems. 
The adopted axis of considerations is a preliminary assessment of two issues: 1) which data protection principles should apply, in particular, when processing personal data in artificial intelligence systems, and 2) how liability for personal data breaches in such systems is regulated. The need to change the regulations regarding the rights and obligations of data subjects and of entities processing personal data cannot be excluded. It is possible that changes will be required in the provisions regarding the assignment of liability for a breach of personal data processed in artificial intelligence systems. The research process concerns the identification of areas of personal data protection that are particularly important (and may require re-regulation) due to the introduction of the proposed legal regulation on artificial intelligence. The main question the authors want to answer is how the European Union regulation against data protection breaches in artificial intelligence systems is shaping up. The answer will include examples illustrating the practical implications of these legal regulations.
Keywords: data protection law, personal data, AI law, personal data breach
Procedia PDF Downloads 65
26497 A Method for Identifying Unusual Transactions in E-commerce Through Extended Data Flow Conformance Checking
Authors: Handie Pramana Putra, Ani Dijah Rahajoe
Abstract:
The proliferation of smart devices and advancements in mobile communication technologies have permeated various facets of life with the widespread influence of e-commerce. Detecting abnormal transactions holds paramount significance in this realm due to the potential for substantial financial losses. Moreover, the fusion of data flow and control flow assumes a critical role in the exploration of process modeling and data analysis, contributing significantly to the accuracy and security of business processes. This paper introduces an alternative approach to identify abnormal transactions through a model that integrates both data and control flows. Referred to as the Extended Data Petri net (DPNE), our model encapsulates the entire process, encompassing user login to the e-commerce platform and concluding with the payment stage, including the mobile transaction process. We scrutinize the model's structure, formulate an algorithm for detecting anomalies in pertinent data, and elucidate the rationale and efficacy of the comprehensive system model. A case study validates the responsive performance of each system component, demonstrating the system's adeptness in evaluating every activity within mobile transactions. Ultimately, the results of anomaly detection are derived through a thorough and comprehensive analysis.
Keywords: database, data analysis, DPNE, extended data flow, e-commerce
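The abstract's DPNE model is not reproduced here, but the core conformance-checking idea it builds on can be sketched: a transaction is flagged when its control flow deviates from the allowed login-to-payment process, or when its data values violate a data-flow constraint. All activity names and the amount threshold below are invented for illustration, not taken from the paper.

```python
# Illustrative conformance check combining control flow and a data-flow rule.
# The real DPNE model is far richer; this only shows the flagging principle.

ALLOWED_NEXT = {          # simplified control-flow relation of the process
    "login": {"browse"},
    "browse": {"browse", "add_to_cart"},
    "add_to_cart": {"checkout"},
    "checkout": {"payment"},
    "payment": set(),     # terminal activity
}

def is_conformant(trace, amount, max_amount=10_000):
    """True if the event trace follows ALLOWED_NEXT and the amount is plausible."""
    if not trace or trace[0] != "login" or trace[-1] != "payment":
        return False
    for cur, nxt in zip(trace, trace[1:]):
        if nxt not in ALLOWED_NEXT.get(cur, set()):
            return False
    return 0 < amount <= max_amount   # data-flow conformance

normal = ["login", "browse", "add_to_cart", "checkout", "payment"]
skipped = ["login", "payment"]        # checkout skipped: suspicious
print(is_conformant(normal, 120.0))   # True
print(is_conformant(skipped, 120.0))  # False
```

Non-conformant traces (or conformant traces with out-of-bounds data) would then be handed to the anomaly-analysis stage.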
Procedia PDF Downloads 56
26496 Energy Options and Environmental Impacts of Carbon Dioxide Utilization Pathways
Authors: Evar C. Umeozor, Experience I. Nduagu, Ian D. Gates
Abstract:
The energy requirements of carbon dioxide utilization (CDU) technologies/processes are diverse, and so are their environmental footprints. This paper explores the energy and environmental impacts of systems for CO₂ conversion to fuels, chemicals, and materials. The energy needs of the technologies and processes deployable in CO₂ conversion systems are met by one or a combination of hydrogen (chemical), electricity, heat, and light. Likewise, the environmental footprint of any CO₂ utilization pathway depends on the systems involved. So far, evaluation of CDU systems has been constrained to a particular energy source/type or to a subset of the overall system needed to make CDU possible. This introduces limitations to the general understanding of the energy and environmental implications of CDU, which has led to various pitfalls in past studies. A CDU system has an energy source, CO₂ supply, and conversion units. We apply a holistic approach that considers the impacts of all components in the process, including various sources of energy, CO₂ feedstock, and conversion technologies. The electricity sources include nuclear power, renewables (wind and solar PV), gas turbine, and coal. Heat is supplied from either electricity or natural gas, and hydrogen is produced from either steam methane reforming or electrolysis. The CO₂ capture unit uses either direct air capture or post-combustion capture via amine scrubbing; where applicable, integrated configurations of the CDU system are explored. We demonstrate how the overall energy and environmental impacts of each utilization pathway are obtained by aggregating the values for all components involved. Proper accounting of the energy and emission intensities of CDU must incorporate total balances for the utilization process and differences in timescales between alternative conversion pathways. 
Our results highlight opportunities for the use of clean energy sources, direct air capture, and a number of promising CO₂ conversion pathways for producing methanol, ethanol, synfuel, urea, and polymer materials.
Keywords: carbon dioxide utilization, processes, energy options, environmental impacts
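The aggregation step described above (overall impacts obtained by summing the values for all components of a pathway) can be sketched as follows. The component names and intensity figures are illustrative placeholders, not results from the study.

```python
# Hedged sketch: aggregate the energy demand and emissions of a CO2
# utilization pathway from its components, as the abstract describes.
# All numbers below are invented placeholders for illustration only.

def pathway_totals(components):
    """Sum energy (MJ) and emissions (kg CO2e) per tonne of product."""
    energy = sum(c["energy_MJ"] for c in components)
    emissions = sum(c["kgCO2e"] for c in components)
    return energy, emissions

methanol_route = [
    {"name": "direct air capture", "energy_MJ": 9000, "kgCO2e": 150},
    {"name": "H2 via electrolysis", "energy_MJ": 32000, "kgCO2e": 80},
    {"name": "CO2-to-methanol synthesis", "energy_MJ": 1500, "kgCO2e": 40},
]

energy, emissions = pathway_totals(methanol_route)
print(energy, emissions)  # 42500 270
```

Swapping a component (say, hydrogen from steam methane reforming instead of electrolysis) changes only one entry, which is what makes the holistic comparison across energy options tractable.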
Procedia PDF Downloads 147
26495 Sumac Sprouts: From in Vitro Seed Germination to Chemical Characterization
Authors: Leto Leandra, Guaitini Caterina, Agosti Anna, Del Vecchio Lorenzo, Guarrasi Valeria, Cirlini Martina, Chiancone Benedetta
Abstract:
To the best of our knowledge, this study represents the first attempt to investigate the in vitro germination response of Rhus coriaria L. and the chemical characterization of its sprouts. Rhus coriaria L., a species belonging to the Anacardiaceae family, is commonly called "sumac" and is cultivated, in different countries of the Mediterranean and Middle East regions, to produce a spice with a sour taste, obtained from its dried and ground fruits. Moreover, since ancient times, many beneficial properties have been attributed to this plant, which has been used, in the traditional medicine of several Asian countries, against various diseases, including liver and intestinal pathologies, ulcers, and various inflammatory states. In the recent past, sumac was cultivated in the Southern regions of Italy to treat leather, but its cultivation was abandoned, and currently, sumac plants grow spontaneously in marginal areas. Recently, in Italy, interest in this species has been growing again, thanks to its numerous properties; thus, it becomes imperative to deepen the knowledge of this plant. In this study, in order to set up an efficient in vitro seed germination protocol, sumac seeds collected from spontaneous plants grown in Sicily, an island in the South of Italy, were first subjected to different treatments, scarification (mechanical, physical, and chemical), cold stratification, and imbibition, to break their physical and physiological dormancy; then, the treated seeds were cultured in vitro on media with different gibberellic acid (GA3) concentrations. Results showed that, without any treatment, only 5% of in vitro sown seeds germinated, while the germination percentage increased up to 19% after mechanical scarification. A further significant improvement of germination percentages was recorded after physical scarification, with (40.5%) or without (36.5%) 8 weeks of cold stratification, especially when seeds were sown on gibberellin-enriched culture media. 
In vitro-derived sumac sprouts, at different developmental stages, were chemically characterized in terms of polyphenol and tannin content, as well as for their antioxidant activity, to evaluate this matrix as a potential novel food or as a source of bioactive compounds. Results evidenced how more developed sumac sprouts and, above all, their leaves are a rich source of polyphenols (78.4 mg GAE/g SS) and tannins (21.9 mg GAE/g SS), with marked antioxidant activity. The outcomes of this study will support the nursery sector and sumac growers in obtaining a higher number of plants in a shorter time; moreover, the sprout chemical characterization will contribute to the process of considering this matrix as a new source of bioactive compounds and tannins to be used in food and non-food sectors.
Keywords: bioactive compounds, germination pre-treatments, Rhus coriaria L., tissue culture
Procedia PDF Downloads 104
26493 Advanced Analytical Competency Is Necessary for Strategic Leadership to Achieve High-Quality Decision-Making
Authors: Amal Mohammed Alqahatni
Abstract:
This paper is a non-empirical analysis of the existing literature on digital leadership competency, data-driven organizations, and dealing with AI technology (big data). It provides insights into the importance of developing leaders' analytical skills and style so that they are more effective in high-quality decision-making in a data-driven organization and achieve creativity during the organization's digital transformation. Despite the enormous potential that big data has, there are not enough experts in the field. Many organizations have faced issues with leadership style, which was considered an obstacle to organizational improvement. The paper investigates the obstacles to leadership style in this context and the challenges leaders face in coaching and development. Leaders' lack of analytical skill with AI technology, such as big data tools, was noted, as was a lack of understanding of the value of that data, resulting in poor communication with others, especially in meetings where decisions should be made. By acknowledging the different dynamics of work competency and of organizational structure and culture, organizations can make the necessary adjustments to best support their leaders. This paper reviews prior research studies and applies what is known to assist with current obstacles. It addresses how analytical leadership will assist in overcoming challenges in a data-driven organization's work environment.
Keywords: digital leadership, big data, leadership style, digital leadership challenge
Procedia PDF Downloads 69
26492 Folk Media and Political Movement: A Case Study on the Bodos of North East India
Authors: Faguna Barmahalia
Abstract:
The politics of ethnic identity in north-east India is a well-known phenomenon. Ethnic assertion in this region is mostly linguistic and cultural in nature. Most ethnic groups in the north-east have been demanding either an autonomous or a separate state to maintain their socio-cultural identity. Even after Indian Independence, these ethnic groups feel that they have not developed. Despite having many natural resources, North East India has remained backward in economic, educational, and political terms. In this scenario, many educated, middle-class elites have worked for the all-round development of their communities. The Bodos are one of the major tribes of North East India. In Assam, the Bodos consider themselves exploited and suppressed by the Assamese Hindu society. Consequently, a socio-cultural identity movement has emerged among the Bodos. The main aims of the study are: i. to focus on how the Bodos of Assam are using folk media in their political movement, and ii. to analyse the role of folklore in serving ethnic unity and nationalism among the Bodos. Methodology: The study is based on primary and secondary sources. Interview and observation methods were used for collecting the primary data. For secondary sources, printed books, magazines, and other materials published by distinguished publishers, as well as websites, have been used.
Keywords: media, culture, nationalism, politics
Procedia PDF Downloads 222
26491 Analysis of Operating Speed on Four-Lane Divided Highways under Mixed Traffic Conditions
Authors: Chaitanya Varma, Arpan Mehar
Abstract:
The present study demonstrates the procedure to analyse speed data collected on various four-lane divided sections in India. Field data for the study were collected at different straight and curved sections on rural highways with the help of a radar speed gun and a video camera. The data collected at the sections were analysed, and parameters pertaining to speed distributions were estimated. Different statistical distributions were fitted to vehicle-type speed data and to mixed traffic speed data. It was found that vehicle-type speed data follow either the normal or the log-normal distribution, whereas mixed traffic speed data follow more than one type of statistical distribution. The most common fits observed for mixed traffic speed data were the Beta and Weibull distributions. Separate operating speed models based on traffic and roadway geometric parameters were proposed in the present study, incorporating traffic parameters and curve geometry parameters. Two different operating speed models were proposed, with variables 1/R and ln(R), and were found to be realistic over different ranges of curve radius. The models developed in the present study are simple and realistic and can be used for forecasting operating speed on four-lane highways.
Keywords: highway, mixed traffic flow, modeling, operating speed
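As a sketch of the two proposed model forms, the snippet below fits V = a + b·(1/R) and V = a + b·ln(R) by ordinary least squares to synthetic data, since the study's field data are not available here; the generating coefficients are arbitrary.

```python
import numpy as np

# Illustrative fit of the two curve-radius model forms from the abstract.
# Synthetic data only: speeds are generated from an assumed 1/R relation.

rng = np.random.default_rng(0)
R = rng.uniform(100.0, 1000.0, 200)            # curve radius, m
v85 = 95.0 - 4000.0 / R                        # synthetic operating speed, km/h

def fit_linear(x, y):
    """Least-squares fit of y = a + b*x; returns (a, b)."""
    A = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

a_inv, b_inv = fit_linear(1.0 / R, v85)        # V = a + b*(1/R)
a_log, b_log = fit_linear(np.log(R), v85)      # V = a + b*ln(R)
print(round(a_inv, 1), round(b_inv, 1))        # recovers 95.0 -4000.0
```

On real field data the two forms would give different residuals over different radius ranges, which is presumably why the study found each realistic for a different range of R.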
Procedia PDF Downloads 376
26490 Evaluation of Food Safety Management in Central Elementary School Canteens in Tuguegarao City, Philippines
Authors: Lea B. Milan
Abstract:
This descriptive study evaluated the existing food safety management in the Central Elementary School Canteens of Region 3. It made use of survey questionnaires, interview guide questions, and a validated knowledge test on food safety for data gathering. Results of the study revealed that school principals and canteen managers shared responsibilities in the food safety management of school canteens. It also showed that the schools applied different methods of communication, monitoring, and evaluation of food safety management. The study further revealed that monitoring and evaluation of food safety compliance are not being practiced in all elementary schools in the region. The study also showed that school canteens in Region 3 do not have the thermometers and timers needed to properly monitor foods during storage, preparation, and serving. It was also found that canteen personnel lack basic knowledge and training on food safety. Potential sources of physical, chemical, and biological hazards that could contaminate foods were also found in the canteen facilities of the elementary schools in the region. Moreover, the evaluation showed that the existing implementation of food safety management in the Central Elementary School Canteens of Region 3 was below the expected level, and the appreciation and advocacy of food safety management in the school canteens of Region 3 still need to be strengthened.
Keywords: food safety management, food safety school catering, food safety, school food safety management
Procedia PDF Downloads 376
26489 Accurate HLA Typing at High-Digit Resolution from NGS Data
Authors: Yazhi Huang, Jing Yang, Dingge Ying, Yan Zhang, Vorasuk Shotelersuk, Nattiya Hirankarn, Pak Chung Sham, Yu Lung Lau, Wanling Yang
Abstract:
Human leukocyte antigen (HLA) typing from next generation sequencing (NGS) data has the potential for applications in clinical laboratories and population genetic studies. Here we introduce a novel technique for HLA typing from NGS data based on read-mapping using a comprehensive reference panel containing all known HLA alleles and de novo assembly of the gene-specific short reads. An accurate HLA typing at high-digit resolution was achieved when it was tested on publicly available NGS data, outperforming other newly-developed tools such as HLAminer and PHLAT.
Keywords: human leukocyte antigens, next generation sequencing, whole exome sequencing, HLA typing
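A minimal sketch of the read-mapping idea (scoring each reference allele by read support and reporting the best-supported one) follows. The allele sequences and reads are toy stand-ins, not real HLA data, and the published method additionally uses de novo assembly of gene-specific reads.

```python
# Toy version of read-mapping-based allele typing: score each known allele
# by the number of reads that map to it (here: exact substring match).
# Sequences below are invented illustrations, not real HLA alleles.

alleles = {
    "HLA-A*01:01": "ATGGCCGTCATGGCGCCCCGAACCCTCCTC",
    "HLA-A*02:01": "ATGGCCGTCATGGCGCCCCGAACCGTCCTC",
}
reads = ["CGAACCGTC", "GGCGCCCCGAACCGT", "ATGGCCGTC"]

def type_hla(alleles, reads):
    """Return (best-supported allele, per-allele read counts)."""
    scores = {name: sum(r in seq for r in reads) for name, seq in alleles.items()}
    return max(scores, key=scores.get), scores

best, scores = type_hla(alleles, reads)
print(best)  # HLA-A*02:01: more reads support the G variant
```

Real tools align against thousands of highly similar alleles, so they must handle mismatches, paired ends, and ambiguous mappings; this sketch only shows why a comprehensive reference panel drives resolution.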
Procedia PDF Downloads 663
26488 Early Childhood Education: Teachers' Ability to Assess
Authors: Ade Dwi Utami
Abstract:
Pedagogic competence is the basic competence of teachers to perform their tasks as educators. The ability to assess has become one of the demands within teachers' pedagogic competence. Teachers' ability to assess is related to curriculum instructions and applications. This research is aimed at obtaining data concerning teachers' ability to assess, comprising understanding assessment; determining assessment types, tools, and procedures; conducting the assessment process; and using assessment result information. It uses a mixed-method explanatory design in which qualitative data are used to verify the quantitative data obtained through a survey. Quantitative data were collected by test, whereas qualitative data were collected by observation, interview, and documentation. The analyzed data were then processed through a proportion study technique to be categorized into high, medium, and low. The result of the research shows that teachers' ability to assess can be grouped into three categories: 2% high, 4% medium, and 94% low. The data show that teachers' ability to assess is still relatively low. Teachers lack knowledge and comprehension in assessment application. This statement is verified by the qualitative data, which show that teachers did not state which aspect was assessed in learning, did not record children's behavior, and did not use the assessment results as a consideration in designing a program. Teachers have assessment documents, yet these only serve as a means of completing teachers' administrative requirements for the certification program. Thus, assessment documents were not used on the basis of acquired knowledge. This condition should become a consideration for teacher education institutions and the government in improving teachers' pedagogic competence, including the ability to assess.
Keywords: assessment, early childhood education, pedagogic competence, teachers
Procedia PDF Downloads 246
26487 Territorial Brand as a Means of Structuring the French Wood Industry
Authors: Laetitia Dari
Abstract:
The brand constitutes a source of differentiation between competitors. It highlights specific characteristics that create value for the enterprise. Today the concept of a brand is not just about the product but can concern territories. The competition between territories, due to tourism, research, jobs, etc., leads territories to develop territorial brands to bring out their identity and specificity. Some territorial brands are based on natural resources or products characteristic of a territory. In the French wood sector, we can observe the emergence of many territorial brands. Supported by the inter-professional organization, these brands have the main objective of showcasing wood as a source of solutions at the local level in terms of construction and energy. The implementation of these collective projects raises the question of the way in which relations between companies are structured and animated. The central question of our work is to understand how the territorial brand promotes the structuring of a sector and the construction of collective relations between actors. In other words, we are interested in the conditions for the emergence of the territorial brand and the way in which it will be a means of mobilizing the actors around a common project. The objectives of the research are (1) to understand in which context a territorial brand emerges, (2) to analyze the way in which the territorial brand structures the collective relations between actors, (3) to give entry keys to the actors to successfully develop this type of project. Thus, our research is based on a qualitative methodology with semi-structured interviews conducted with the main territorial brands in France. The research will answer various academic and empirical questions. From an academic point of view, it brings elements of understanding to the construction of a collective project and to the way in which governance operates. 
From an empirical point of view, the interest of our work is to bring out the key success factors in the development of a territorial brand and to show how the brand can become an element of value for a territory.
Keywords: brand, marketing, strategy, territory, third party stakeholder, wood
Procedia PDF Downloads 67
26486 Statistical Analysis for Overdispersed Medical Count Data
Authors: Y. N. Phang, E. F. Loh
Abstract:
Many researchers have suggested the use of zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models for modeling over-dispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. Studies indicate that ZIP and ZINB consistently provide a better fit than the standard Poisson and negative binomial models for such data. In this study, we propose the use of zero-inflated inverse trinomial (ZIIT), zero-inflated Poisson inverse Gaussian (ZIPIG), and zero-inflated strict arcsine (ZISA) models for modeling over-dispersed medical count data. These proposed models are not widely used by researchers, especially in the medical field. The results show that these three models can serve as alternatives for modeling over-dispersed medical count data, as supported by their application to a real-life medical data set. The inverse trinomial, Poisson inverse Gaussian, and strict arcsine are discrete distributions with a cubic variance function of the mean; therefore, ZIIT, ZIPIG, and ZISA are able to accommodate data with excess zeros and very heavy tails. They are recommended for modeling over-dispersed medical count data when ZIP and ZINB are inadequate.
Keywords: zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson's goodness of fit
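The zero-inflation mechanism shared by all of these models can be illustrated with the simplest case, ZIP: with probability π the count is a structural zero, otherwise it is drawn from a Poisson distribution. The parameter values below are arbitrary illustrations.

```python
import math

# Sketch of the zero-inflation idea behind ZIP (and, analogously, ZIPIG,
# ZIIT, and ZISA): a mixture of a point mass at zero and a count model.
# pi and lam below are arbitrary illustrative values.

def zip_pmf(k, pi, lam):
    """P(X = k) under a zero-inflated Poisson(pi, lam)."""
    pois = math.exp(-lam) * lam**k / math.factorial(k)
    return pi * (k == 0) + (1 - pi) * pois

pi, lam = 0.3, 2.5
mean = (1 - pi) * lam                    # E[X]
var = (1 - pi) * lam * (1 + pi * lam)    # Var[X] for ZIP
print(round(zip_pmf(0, pi, lam), 4))     # zero probability inflated vs Poisson
print(mean < var)                        # True: overdispersion
```

The heavier-tailed base distributions proposed in the paper replace the Poisson component while keeping this same mixture structure, which is why they can absorb both excess zeros and heavy tails.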
Procedia PDF Downloads 543
26485 Monotone Rational Trigonometric Interpolation
Authors: Uzma Bashir, Jamaludin Md. Ali
Abstract:
This study is concerned with the visualization of monotone data using a piecewise C1 rational trigonometric interpolating scheme. Four positive shape parameters are incorporated in the structure of the rational trigonometric spline. Conditions on two of these parameters are derived to attain the monotonicity of monotone data, and the other two are left free. Figures are used widely to exhibit that the proposed scheme produces graphically smooth monotone curves.
Keywords: trigonometric splines, monotone data, shape preserving, C1 monotone interpolant
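The paper's scheme is a rational trigonometric spline with shape parameters, which is not reproduced here. As a compact, runnable illustration of the same shape-preserving goal, below is the classic Fritsch-Carlson monotone cubic Hermite interpolant: derivative estimates are limited so that the C1 curve never overshoots monotone data.

```python
import bisect

# Fritsch-Carlson monotone cubic Hermite interpolation: a polynomial
# (not trigonometric) stand-in illustrating shape-preserving C1 interpolation.

def monotone_cubic(xs, ys):
    n = len(xs)
    d = [(ys[i+1] - ys[i]) / (xs[i+1] - xs[i]) for i in range(n - 1)]  # secants
    m = [0.0] * n                              # node derivatives
    m[0], m[-1] = d[0], d[-1]
    for i in range(1, n - 1):
        m[i] = 0.0 if d[i-1] * d[i] <= 0 else (d[i-1] + d[i]) / 2
    for i in range(n - 1):                     # Fritsch-Carlson limiter
        if d[i] == 0:
            m[i] = m[i+1] = 0.0
        else:
            a, b = m[i] / d[i], m[i+1] / d[i]
            s = (a * a + b * b) ** 0.5
            if s > 3:                          # pull back into monotone region
                m[i], m[i+1] = 3 * a / s * d[i], 3 * b / s * d[i]

    def f(x):
        i = min(max(bisect.bisect_right(xs, x) - 1, 0), n - 2)
        h = xs[i+1] - xs[i]
        t = (x - xs[i]) / h
        h00 = 2*t**3 - 3*t**2 + 1; h10 = t**3 - 2*t**2 + t
        h01 = -2*t**3 + 3*t**2;    h11 = t**3 - t**2
        return h00*ys[i] + h10*h*m[i] + h01*ys[i+1] + h11*h*m[i+1]
    return f

xs, ys = [0.0, 1.0, 2.0, 3.0], [0.0, 0.1, 0.9, 1.0]    # monotone data
f = monotone_cubic(xs, ys)
vals = [f(x / 10) for x in range(31)]
print(all(v2 >= v1 for v1, v2 in zip(vals, vals[1:])))  # True: monotone output
```

An unconstrained cubic spline through the same data would overshoot near the steep middle segment; the limiter step is what the paper's parameter conditions play the role of in the trigonometric setting.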
Procedia PDF Downloads 271
26484 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels
Authors: Joshua Buli, David Pietrowski, Samuel Britton
Abstract:
Processing SAR data usually requires constraints on extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground-plane projection, with or without terrain as a component, all to better view SAR data in an image domain comparable to what a human would view, to ease interpretation. An alternate but computationally heavy method that makes use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.), the data are then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time-history data for the reflectivity values for each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing have finally allowed this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase history data size and 3D point cloud size. Backprojection processing algorithms are embarrassingly parallel since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for accurate reflectivity representation of a scene. 
Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes any interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow for SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization
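A CPU/NumPy toy version of the backprojection loop can make the per-point independence concrete: each grid point sums, over all pulses, the range-compressed sample at its computed range to that pulse's antenna position. The geometry, sampling, and unit-amplitude real-valued returns are simplifying assumptions; real SAR backprojection works on complex phase history at much finer sampling.

```python
import numpy as np

# Toy backprojection: CPU stand-in for the GPU kernel described above.
# Simulated range-compressed pulses contain a unit return in the range bin
# of a single point scatterer; backprojection recovers its location.

c_res = 1.0                                   # range-bin size (m)
n_bins = 400
target = np.array([10.0, 5.0, 0.0])           # true scatterer position

# Straight-line collection of antenna positions (one per pulse)
antennas = np.stack([np.linspace(-50, 50, 64),
                     np.full(64, -100.0),
                     np.full(64, 30.0)], axis=1)

profiles = np.zeros((len(antennas), n_bins))
rng_to_target = np.linalg.norm(antennas - target, axis=1)
profiles[np.arange(len(antennas)), (rng_to_target / c_res).astype(int)] = 1.0

# Backproject onto a voxel grid in the z = 0 plane around the scene
xs = np.arange(0.0, 21.0); ys = np.arange(0.0, 11.0)
image = np.zeros((len(xs), len(ys)))
for i, x in enumerate(xs):                    # embarrassingly parallel:
    for j, y in enumerate(ys):                # each voxel is independent
        r = np.linalg.norm(antennas - np.array([x, y, 0.0]), axis=1)
        image[i, j] = profiles[np.arange(len(antennas)),
                               (r / c_res).astype(int)].sum()

peak = np.unravel_index(image.argmax(), image.shape)
print(xs[peak[0]], ys[peak[1]])               # brightest voxel at the target
```

The two grid loops carry no cross-voxel dependencies, which is exactly why the per-voxel sum maps onto one GPU thread per 3D point; swapping the z = 0 plane for digital-elevation-model heights changes only the voxel coordinates fed into the range computation.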
Procedia PDF Downloads 85
26483 Integration of Knowledge and Metadata for Complex Data Warehouses and Big Data
Authors: Jean Christian Ralaivao, Fabrice Razafindraibe, Hasina Rakotonirainy
Abstract:
This document constitutes a resumption of work carried out in the field of complex data warehouses (DW) relating to the management and formalization of knowledge and metadata. It offers a methodological approach for integrating two concepts, knowledge and metadata, within the framework of a complex DW architecture. The work considers the use of knowledge representation by description logics and the extension of the Common Warehouse Metamodel (CWM) specifications, which will have repercussions on the performance of a complex DW. Three essential aspects of this work are expected: the representation of knowledge in description logics, the declination of this knowledge into consistent UML diagrams while respecting or extending the CWM specifications, and the use of XML as a pivot. The field of application is large but will be adapted to systems with heterogeneous, complex, and unstructured content, and moreover requiring a great (re)use of knowledge, such as medical data warehouses.
Keywords: data warehouse, description logics, integration, knowledge, metadata
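On the knowledge-representation side, the most basic description-logic service is concept subsumption over a TBox of axioms. The sketch below decides subsumption for atomic concept inclusions only, by reachability; the concept names are invented examples, and full description logics (and the paper's approach) support far richer constructors.

```python
# Minimal illustration of description-logic subsumption: a TBox of atomic
# concept inclusions (C subsumed-by D axioms), decided by graph reachability.
# Concept names are invented examples, not taken from the paper.

TBOX = {                      # concept -> its direct super-concepts
    "MRIImage": {"MedicalImage"},
    "MedicalImage": {"Document"},
    "LabReport": {"Document"},
    "Document": set(),
}

def subsumes(d, c, tbox=TBOX):
    """True if 'c subsumed-by d' follows from the axioms (reflexive-transitive)."""
    seen, stack = set(), [c]
    while stack:
        cur = stack.pop()
        if cur == d:
            return True
        if cur not in seen:
            seen.add(cur)
            stack.extend(tbox.get(cur, set()))
    return False

print(subsumes("Document", "MRIImage"))   # True
print(subsumes("LabReport", "MRIImage"))  # False
```

In the paper's setting, such a classified concept hierarchy would sit alongside CWM-style metadata so that queries over a medical DW can exploit what the data means, not just how it is stored.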
Procedia PDF Downloads 138
26482 Urban Open Source: Synthesis of a Citizen-Centric Framework to Design Densifying Cities
Authors: Shaurya Chauhan, Sagar Gupta
Abstract:
Prominent urbanizing centres across the globe like Delhi, Dhaka, or Manila have exhibited that development often faces a challenge in bridging the gap among the top-down collective requirements of the city and the bottom-up individual aspirations of the ever-diversifying population. When this exclusion is intertwined with rapid urbanization and diversifying urban demography: unplanned sprawl, poor planning, and low-density development emerge as automated responses. In parallel, new ideas and methods of densification and public participation are being widely adopted as sustainable alternatives for the future of urban development. This research advocates a collaborative design method for future development: one that allows rapid application with its prototypical nature and an inclusive approach with mediation between the 'user' and the 'urban', purely with the use of empirical tools. Building upon the concepts and principles of 'open-sourcing' in design, the research establishes a design framework that serves the current user requirements while allowing for future citizen-driven modifications. This is synthesized as a 3-tiered model: user needs – design ideology – adaptive details. The research culminates into a context-responsive 'open source project development framework' (hereinafter, referred to as OSPDF) that can be used for on-ground field applications. To bring forward specifics, the research looks at a 300-acre redevelopment in the core of a rapidly urbanizing city as a case encompassing extreme physical, demographic, and economic diversity. The suggestive measures also integrate the region’s cultural identity and social character with the diverse citizen aspirations, using architecture and urban design tools, and references from recognized literature. 
This framework, based on a vision – feedback – execution loop, is used for hypothetical development at the five prevalent scales in design: master planning, urban design, architecture, tectonics, and modularity, in a chronological manner. At each of these scales, the possible approaches and avenues for open-sourcing are identified, validated through trial and error, and subsequently recorded. The research attempts to re-calibrate the architectural design process and make it more responsive and people-centric. Analytical tools such as Space, Event, and Movement by Bernard Tschumi and the Five-Point Mental Map by Kevin Lynch, among others, are deeply rooted in the research process. Over the five-part OSPDF, a two-part subsidiary process is also suggested after each cycle of application, for continued appraisal and refinement of the framework and the urban fabric over time. The research is an exploration of the possibilities for an architect to adopt the new role of a 'mediator' in the development of contemporary urbanity.
Keywords: open source, public participation, urbanization, urban development
Procedia PDF Downloads 149
26481 Data Analytics in Energy Management
Authors: Sanjivrao Katakam, Thanumoorthi I., Antony Gerald, Ratan Kulkarni, Shaju Nair
Abstract:
With increasing energy costs and their impact on business, sustainability has evolved from a social expectation into an economic imperative, and finding methods to reduce cost has become a critical directive for industry leaders. Effective energy management is the only way to cut costs, yet it has been a challenge because it requires a change in old habits and legacy systems followed for decades. Today, exorbitant volumes of energy and operational data are being captured and stored by industries, but they are unable to convert these structured and unstructured data sets into meaningful business intelligence. For quick decisions, organizations must learn to cope with large volumes of operational data in different formats. Energy analytics not only helps in extracting inferences from these data sets but is also instrumental in the transformation from old approaches to energy management to new ones, which in turn assists in effective decision making for implementation. Organizations need an established corporate strategy for reducing operational costs through visibility and optimization of energy usage, and energy analytics plays a key role in the optimization of operations. The paper describes how energy data analytics is today extensively used in different scenarios such as reducing operational costs, predicting energy demand, optimizing network efficiency, asset maintenance, improving customer insights, and device data insights. It also highlights how analytics helps transform the insights obtained from energy data into sustainable solutions. The paper utilizes data from an array of segments such as the retail, transportation, and water sectors.
Keywords: energy analytics, energy management, operational data, business intelligence, optimization
Procedia PDF Downloads 364
26480 Efficient Frequent Itemset Mining Methods over Real-Time Spatial Big Data
Authors: Hamdi Sana, Emna Bouazizi, Sami Faiz
Abstract:
In recent years, there has been a huge increase in the use of spatio-temporal applications in which data and queries are continuously moving. As a result, the need to process real-time spatio-temporal data is clear, and real-time stream data management has become a hot topic. The sliding window model and frequent itemset mining over dynamic data are among the most important problems in data mining. The sliding window model is widely used for frequent itemset mining over data streams because of its emphasis on recent data and its bounded memory requirement. Existing methods use the traditional transaction-based sliding window model, in which the window size is a fixed number of transactions. This model assumes that all transactions arrive at a constant rate, which does not hold for real-time applications; using it in such applications degrades their performance. Based on these observations, this paper relaxes the notion of window size and proposes a timestamp-based sliding window model. In the proposed frequent itemset mining algorithm, support conditions are used to differentiate frequent and infrequent patterns, and a tree is developed to incrementally maintain the essential information. We evaluate our contribution; the preliminary results are quite promising.
Keywords: real-time spatial big data, frequent itemset, transaction-based sliding window model, timestamp-based sliding window model, weighted frequent patterns, tree, stream query
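The timestamp-based window can be illustrated with a minimal brute-force sketch (the names and the counting strategy are ours; the paper's algorithm instead maintains a tree incrementally): the window covers a fixed time span, so the number of transactions it holds varies with the arrival rate.

```python
# Hypothetical sketch of a timestamp-based sliding window for itemset mining.
# The window keeps transactions whose timestamp lies in (now - span, now],
# regardless of how many there are; counting is brute force for clarity.
from collections import Counter
from itertools import combinations

def frequent_itemsets(stream, window_span, now, min_support):
    """stream: list of (timestamp, set-of-items) pairs; min_support is a
    fraction of the transactions currently inside the window."""
    window = [items for ts, items in stream if now - window_span < ts <= now]
    counts = Counter()
    for items in window:
        for k in (1, 2):  # 1- and 2-itemsets only, for brevity
            for combo in combinations(sorted(items), k):
                counts[combo] += 1
    threshold = min_support * len(window)
    return {itemset for itemset, c in counts.items() if c >= threshold}

stream = [(1, {"a", "b"}), (2, {"a", "c"}), (5, {"a", "b"}), (9, {"b"})]
# A 5-unit window ending at t=9 keeps only the transactions at t=5 and t=9.
print(frequent_itemsets(stream, window_span=5, now=9, min_support=1.0))
# → {('b',)}
```

A transaction-based window of size 2 would happen to hold the same two transactions here, but under a bursty arrival rate the two models diverge, which is the motivation given above.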
Procedia PDF Downloads 161
26479 A Model of Teacher Leadership in History Instruction
Authors: Poramatdha Chutimant
Abstract:
The objective of the research was to propose a model of teacher leadership in history instruction for practical use. Everett M. Rogers’ Diffusion of Innovations Theory is applied as the theoretical framework. A qualitative method is used, with an interview protocol as the instrument for collecting primary data from best-practice teachers recognized by the Office of National Education Commission (ONEC). Open-ended questions are used in the interview protocol in order to gather varied data. Information on the international context of history instruction serves as secondary data to support the summarization process (content analysis). A dendrogram is the key to interpreting and synthesizing the primary data, with the secondary data supporting explanation and elaboration. In-depth interviews are used to collect information from seven experts in the educational field. Finally, the focal point is to validate the draft model with a view to its future utilization.
Keywords: history study, nationalism, patriotism, responsible citizenship, teacher leadership
Procedia PDF Downloads 279
26478 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation
Authors: Mohammad Anwar, Shah Waliullah
Abstract:
This study investigated panel data regression models, using both Bayesian and classical methods to study the impact of institutions on economic growth over the period 1990-2014, with a focus on developing countries. Under both the classical and the Bayesian methodology, two panel data models were estimated: common effects and fixed effects. For the Bayesian approach, prior information is used, with a normal-gamma prior for the panel data models. The analysis was done with the WinBUGS14 software. The estimated results showed that panel data models are valid models in the Bayesian methodology, and in the Bayesian approach all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all models, the fixed-effects model is the best model in the Bayesian estimation of panel data models; it was also shown to have the lowest standard error compared to the other models.
Keywords: Bayesian approach, common effect, fixed effect, random effect, dynamic random effect model
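The study's models are estimated in WinBUGS; as a purely illustrative classical counterpart, the fixed-effects ("within") estimator can be sketched by demeaning each entity's series before fitting the common slope. The data and names below are made up for the sketch.

```python
# Illustrative sketch (not the authors' Bayesian model): the classical
# fixed-effects "within" estimator removes entity-specific levels by
# demeaning each entity's x and y series, then runs pooled OLS on the slope.
def within_estimator(panel):
    """panel: {entity: [(x, y), ...]}; returns the common slope estimate."""
    num = den = 0.0
    for obs in panel.values():
        xbar = sum(x for x, _ in obs) / len(obs)
        ybar = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            num += (x - xbar) * (y - ybar)
            den += (x - xbar) ** 2
    return num / den

# Two entities whose levels differ by 10 but which share the same slope of 2:
panel = {"A": [(1, 2), (2, 4), (3, 6)], "B": [(1, 12), (2, 14), (3, 16)]}
print(within_estimator(panel))  # → 2.0
```

A pooled regression on the raw data would be biased by the level difference between the entities; the demeaning step is exactly what "fixed effects" removes.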
Procedia PDF Downloads 68
26477 Solar Energy for Decontamination of Ricinus communis
Authors: Elmo Thiago Lins Cöuras Ford, Valentina Alessandra Carvalho do Vale
Abstract:
Solar energy was used as a heat source for Ricinus communis (castor bean) cake with the objective of eliminating or minimizing the percentage of the toxin in it, so that it can be used as animal feed. A cylindrical solar concentrator and a flat-plate collector were used as the heating system. In the focal area of the solar concentrator, a trough support with a stove (greenhouse) effect was placed. Parameters that denote the efficiency of the systems for the proposed objective were analyzed.
Keywords: solar energy, concentrator, Ricinus communis, temperature
Procedia PDF Downloads 424
26476 The Effects of Parent Psycho-Education Program on Problem-Solving Skills of Parents
Authors: Tuba Bagatarhan, Digdem Muge Siyez
Abstract:
The aim of this research is to examine the effects of a psycho-education program on the problem-solving skills of parents of high school students in the risk group for Internet addiction. A quasi-experimental design with pre-test, post-test, and follow-up measurements, including experimental and control groups, was used. The independent variable of the study was the parent psycho-education program on problem-solving skills; the dependent variable was the parents' problem-solving skills. The research was conducted with the parents of 52 tenth-grade students in the risk group for Internet addiction from two high schools, who had volunteered to participate in research evaluating the effectiveness of an Internet addiction prevention psycho-education program within the scope of another study. Since 26 of these students attended the first high school, their parents were asked whether they would like to participate in the parent psycho-education program on parental problem-solving skills. The parents who volunteered were assigned to the experimental group (n=13), and the remaining parents were assigned to control group 1 (n=13) in the first high school. The parents of the other 26 students, in the second high school, were randomly assigned to control group 2 (n=13) and control group 3 (n=13). The data were obtained via the Problem Behavior Scale - Coping - Parents Form and a demographic questionnaire. A four-session parent psycho-education program on coping with Internet addiction and other problem behaviors in their children was applied to the experimental group; no program was applied to control groups 1, 2, or 3. In addition, an Internet addiction prevention psycho-education program was applied to the children of the parents in the experimental group and control group 1 within the scope of another study.
In the analysis of the obtained data, a two-factor analysis of variance with repeated measures on one factor was used, with the Bonferroni post-hoc test to locate the source of inter-group differences. According to the findings, the psycho-education program significantly increased the parents’ problem-solving skills, and the increase persisted through the follow-up test.
Keywords: internet addiction, parents, prevention, psycho-education
Procedia PDF Downloads 182
26475 Diagnosis of the Heart Rhythm Disorders by Using Hybrid Classifiers
Authors: Sule Yucelbas, Gulay Tezel, Cuneyt Yucelbas, Seral Ozsen
Abstract:
In this study, heart rhythm disorders were identified from electrocardiography (ECG) data taken from the MIT-BIH arrhythmia database by extracting the required features and presenting them to artificial neural network (ANN), artificial immune system (AIS), artificial-immune-system-based artificial neural network (AIS-ANN), and particle swarm optimization based artificial neural network (PSO-ANN) classifier systems. The main purpose of this study is to evaluate the performance of the hybrid AIS-ANN and PSO-ANN classifiers with respect to ANN and AIS alone. For this purpose, the normal sinus rhythm (NSR), atrial premature contraction (APC), sinus arrhythmia (SA), ventricular trigeminy (VTI), ventricular tachycardia (VTK), and atrial fibrillation (AF) data for each of the RR intervals were found. These data were then combined into pairs (NSR-APC, NSR-SA, NSR-VTI, NSR-VTK, and NSR-AF), a discrete wavelet transform was applied to each group, and after data reduction two data sets with 9 and 27 features were obtained. Afterwards, the data were first shuffled, and then 4-fold cross-validation was used to create the training and testing data. The training and testing accuracy rates and training times were compared with each other. As a result, the performances of the hybrid classification systems, AIS-ANN and PSO-ANN, were seen to be close to the performance of the ANN system, and the results of the hybrid systems were much better than those of AIS. However, ANN had a much shorter training time than the other systems; in terms of training times, ANN was followed by PSO-ANN, AIS-ANN, and AIS, respectively. The features extracted from the data also affected the classification results significantly.
Keywords: AIS, ANN, ECG, hybrid classifiers, PSO
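The 4-fold cross-validation protocol mentioned above can be sketched as follows; the classifier here is a trivial majority-vote stand-in, not the ANN, AIS, or hybrid systems of the study, and all names are ours.

```python
# Minimal sketch of k-fold cross-validation: shuffle, split into k folds,
# and let each fold serve once as the test set while the rest train.
import random

def kfold_accuracy(data, k, train_and_score, seed=0):
    rng = random.Random(seed)
    data = data[:]
    rng.shuffle(data)
    folds = [data[i::k] for i in range(k)]
    scores = []
    for i in range(k):
        test = folds[i]
        train = [d for j, f in enumerate(folds) if j != i for d in f]
        scores.append(train_and_score(train, test))
    return sum(scores) / k

# Stand-in classifier: always predict the majority label of the training set.
def majority_classifier(train, test):
    labels = [y for _, y in train]
    majority = max(set(labels), key=labels.count)
    return sum(1 for _, y in test if y == majority) / len(test)

# Toy data: 12 "NSR" beats and 4 "APC" beats (features elided).
data = [(i, "NSR") for i in range(12)] + [(i, "APC") for i in range(4)]
print(kfold_accuracy(data, 4, majority_classifier))
# → 0.75 (the overall NSR fraction, since every item is tested exactly once)
```

Swapping `majority_classifier` for a real classifier reproduces the protocol the study describes.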
Procedia PDF Downloads 442
26474 Filling the Gaps with Representation: Netflix’s Anne with an E as a Way to Reveal What the Text Hid
Authors: Arkadiusz Adam Gardaś
Abstract:
In his theory of gaps, Wolfgang Iser states that literary texts often lack direct messages. Instead of using straightforward descriptions, authors leave gaps or blanks, i.e., spaces within the text that come into existence only when readers fill them with their understanding and experiences. This paper’s aim is to present Iser’s literary theory in an intersectional way by comparing it to the idea of intersemiotic translation. To be more precise, the author uses the example of Netflix’s adaptation of Lucy Maud Montgomery’s Anne of Green Gables as a form of rendering a book into a film in such a way that certain textual gaps are filled with film images. Intersemiotic translation is a rendition in which signs of one kind of media are translated into the signs of another. Film adaptations are the most common, but not the only, type of intersemiotic translation, and in this case the role of the translator is taken by a screenwriter. A screenwriter’s role can reach beyond the direct meaning presented by the author and delve more deeply into the source material (here, a novel). When that happens, a screenwriter is able to spot the gaps in the text and fill them with images that can later be presented to the viewers. Anne with an E, the Netflix adaptation of Montgomery’s novel, is a highly meaningful example of such a rendition, because the 2017 series was broadcast more than a hundred years after the first edition of the novel was published. This means that what the author might not have been able to show in her text can now be presented in a more open way. The screenwriter decided to use this opportunity to represent certain groups in the film, i.e., racial and sexual minorities, and women. Nonetheless, the series does not alter the novel; in fact, it adds to it by filling the blanks with more direct images.
In the paper, fragments of the first season of Anne with an E are analysed in comparison to their source, the novel by Montgomery. The main purpose is to show how intersemiotic translation, connected with Iser’s literary theory, can enrich the understanding of works of art, culture, media, and literature.
Keywords: intersemiotic translation, film, literary gaps, representation
Procedia PDF Downloads 316
26473 Enhance the Power of Sentiment Analysis
Authors: Yu Zhang, Pedro Desouza
Abstract:
Since big data has become substantially more accessible and manageable due to the development of powerful tools for dealing with unstructured data, people are eager to mine information from social media resources that could not be handled in the past. Sentiment analysis, as a novel branch of text mining, has in the last decade become increasingly important in marketing analysis, customer risk prediction, and other fields. Scientists and researchers have undertaken significant work in creating and improving their sentiment models. In this paper, we present a concept of selecting appropriate classifiers based on the features and qualities of data sources by comparing the performances of five classifiers on three popular social media data sources: Twitter, Amazon Customer Reviews, and Movie Reviews. We introduce a couple of innovative models that outperform traditional sentiment classifiers for these data sources and provide insights on how to further improve the predictive power of sentiment analysis. The modelling and testing work was done using R and Greenplum in-database analytic tools.
Keywords: sentiment analysis, social media, Twitter, Amazon, data mining, machine learning, text mining
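The paper's models are built in R and Greenplum; as a language-agnostic illustration of the simplest kind of sentiment classifier such models are typically compared against, here is a minimal bag-of-words Naive Bayes in Python (Laplace smoothing; class priors omitted under an assumed balanced training set; the data and names are ours, not the authors').

```python
# Minimal bag-of-words Naive Bayes sentiment classifier with Laplace
# smoothing. Log-probabilities are summed to avoid floating-point underflow.
from collections import Counter
import math

def train(docs):
    counts = {"pos": Counter(), "neg": Counter()}
    for text, label in docs:
        counts[label].update(text.lower().split())
    vocab = set(w for c in counts.values() for w in c)
    return counts, vocab

def predict(counts, vocab, text):
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        # Laplace smoothing: unseen words get count 1 / (total + |vocab|).
        scores[label] = sum(
            math.log((c[w] + 1) / (total + len(vocab)))
            for w in text.lower().split()
        )
    return max(scores, key=scores.get)

docs = [("great film loved it", "pos"), ("wonderful great acting", "pos"),
        ("terrible plot hated it", "neg"), ("awful boring film", "neg")]
counts, vocab = train(docs)
print(predict(counts, vocab, "loved this great film"))  # → pos
```

The paper's point, that classifier choice should follow the qualities of the data source, can be explored by training this baseline on each of the three corpora and comparing its accuracy against stronger models.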
Procedia PDF Downloads 353
26472 Evaluation of Ficus racemosa (Moraceae) as a Potential Source for Drug Formulation Against Coccidiosis
Authors: Naveeda Akhtar Qureshi, Wajiha
Abstract:
Coccidiosis is a protozoan parasitic disease caused by species of the genus Eimeria. It is an avian infection causing global economic losses of about 3 billion USD per year. A number of anticoccidial drugs are in use; however, many of them have side effects and are costly. With increasing poultry demand throughout the world, there is a need for more drugs and vaccines against coccidiosis. The present study investigates F. racemosa, a medicinal plant, as a potential source of anticoccidial agents. The methanolic leaf extract was fractionated by column and thin-layer chromatography, yielding nineteen fractions. Different concentrations of each fraction were evaluated for anticoccidial properties in an in vitro experiment against E. tenella, E. necatrix, and E. mitis. The anticoccidial active fractions were further characterized by spectroscopy (UV-Vis, FTIR) and GC-MS analysis, and in silico molecular docking of the compounds identified in the active fractions was carried out. Among all fractions, the significantly highest sporulation inhibition efficacy was shown by F-19 (67.11±2.18) followed by F-15 (65.21±1.34) at a concentration of 30 mg/ml against E. tenella. The significantly highest sporozoite viability inhibition was shown by F-19 (69.23±2.11) followed by F-15 (67.14±1.52) against E. necatrix at a concentration of 30 mg/ml. The anticoccidial active fractions 15 and 19 showed peak spectra at 207 and 202 nm, respectively, in UV analysis, and their FTIR analysis confirmed the presence of carboxylic acids, amines, phenols, etc. Anticoccidial active compounds such as cyclododecane methanol, oleic acid, and octadecanoic acid were identified by GC-MS analysis. The in silico molecular docking study showed that cyclododecane methanol of F-19 and oleic acid of F-15 had the highest binding affinity to the target S-adenosylmethionine synthase.
Hence, in vivo anticoccidial studies are recommended for further validation.
Keywords: Ficus racemosa, cluster fig, column chromatography, anticoccidial fractions, GC-MS, molecular docking, S-adenosylmethionine synthase
Procedia PDF Downloads 85
26471 Real-Time Big-Data Warehouse a Next-Generation Enterprise Data Warehouse and Analysis Framework
Authors: Abbas Raza Ali
Abstract:
Big Data technology is gradually becoming a dire need of large enterprises, which generate massively large amounts of offline and streaming data, in both structured and unstructured formats, on a daily basis. Effectively extracting useful insights from such large-scale datasets is challenging; sometimes it is even a technology constraint to manage a transactional data history of more than a few months. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested on a communication service provider producing massively large, complex streaming data in binary format. The communication industry is bound by regulators to manage the history of their subscribers’ call records, where every action of a subscriber generates a record. Managing and analyzing transactional data also allows service providers to better understand their customers’ behavior; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behaviour of subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated per subscriber. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. The framework has been applied to offload the existing Intelligent Network Mediation and relational Data Warehouse of the service provider onto Big Data. The service provider has a subscriber base of more than 50 million, with yearly growth of 7-10%. The end-to-end process takes no more than 10 minutes and involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations belonging to a call (transformations), and aggregation of all the call records of a subscriber.
Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation
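The stitching and aggregation steps described above might look like the following sketch; the field names and record layout are our assumptions for illustration, since the actual call detail records are binary and proprietary.

```python
# Illustrative sketch of CDR "stitching": all interrogation records sharing
# a call id are merged into one call record, then calls are rolled up per
# subscriber. Field names (call_id, subscriber, duration) are assumptions.
from collections import defaultdict

def stitch(interrogations):
    calls = defaultdict(lambda: {"duration": 0, "subscriber": None})
    for rec in interrogations:
        call = calls[rec["call_id"]]
        call["subscriber"] = rec["subscriber"]
        call["duration"] += rec["duration"]
    return dict(calls)

def per_subscriber(calls):
    totals = defaultdict(lambda: {"calls": 0, "seconds": 0})
    for call in calls.values():
        t = totals[call["subscriber"]]
        t["calls"] += 1
        t["seconds"] += call["duration"]
    return dict(totals)

records = [
    {"call_id": 1, "subscriber": "A", "duration": 30},
    {"call_id": 1, "subscriber": "A", "duration": 45},  # same call, 2nd leg
    {"call_id": 2, "subscriber": "A", "duration": 10},
    {"call_id": 3, "subscriber": "B", "duration": 60},
]
print(per_subscriber(stitch(records)))
# → {'A': {'calls': 2, 'seconds': 85}, 'B': {'calls': 1, 'seconds': 60}}
```

In the framework itself these steps run distributed over the Big Data platform; the sketch only shows the shape of the transformation.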
Procedia PDF Downloads 175
26470 Programming with Grammars
Authors: Peter M. Maurer
Abstract:
DGL is a context-free-grammar-based tool for generating random data. Many types of simulator input data require some computation to be placed in the proper format. For example, it might be necessary to generate ordered triples in which the third element is the sum of the first two elements, or it might be necessary to generate random numbers in some sorted order. Although DGL is universal in computational power, generating these types of data is extremely difficult. To overcome this problem, we have enhanced DGL to include features that permit direct computation within the structure of a context-free grammar. The features have been implemented as special types of productions, preserving the context-free flavor of DGL specifications.
Keywords: DGL, enhanced context free grammars, programming constructs, random data generation
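DGL's own syntax is not shown in the abstract, so the following is a hypothetical Python analogue of the idea: a context-free grammar expanded at random, extended with one special "computed" production that emits the sum of the two numbers generated before it, reproducing the ordered-triple example above.

```python
# Toy analogue of a grammar with a computed production. Nonterminals are
# grammar keys; the special symbol <sum> computes from earlier output
# instead of being expanded, mimicking DGL's computation-in-grammar idea.
import random

def generate(grammar, symbol, rng, out):
    if symbol == "<sum>":                 # computed production
        out.append(str(int(out[-2]) + int(out[-1])))
    elif symbol in grammar:
        for s in rng.choice(grammar[symbol]):
            generate(grammar, s, rng, out)
    else:                                 # terminal symbol
        out.append(symbol)
    return out

grammar = {
    "<triple>": [["<num>", "<num>", "<sum>"]],
    "<num>": [["1"], ["2"], ["3"]],
}
rng = random.Random(4)
a, b, c = generate(grammar, "<triple>", rng, [])
print(a, b, c)  # the third element always equals the sum of the first two
```

Without the computed production, the grammar alone could only enumerate all valid triples explicitly, which is the difficulty the paper's enhancement addresses.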
Procedia PDF Downloads 147
26469 A Model Architecture Transformation with Approach by Modeling: From UML to Multidimensional Schemas of Data Warehouses
Authors: Ouzayr Rabhi, Ibtissam Arrassen
Abstract:
To provide a complete analysis of the organization and to help decision-making, leaders need relevant data; data warehouses (DW) are designed to meet such needs. However, designing a DW is not trivial, and there is no formal method to derive a multidimensional schema from heterogeneous databases. In this article, we present a model-driven approach to the design of data warehouses. We describe a multidimensional meta-model and specify a set of transformations starting from a Unified Modeling Language (UML) metamodel. In this approach, the UML metamodel and the multidimensional one are both considered platform-independent models (PIM). The first meta-model is mapped into the second one through transformation rules written in the Query/View/Transformation (QVT) language. The proposal is validated by applying our approach to generate a multidimensional schema for a Balanced Scorecard (BSC) DW. We are interested in the BSC perspectives, which are closely linked to the vision and strategies of an organization.
Keywords: data warehouse, meta-model, model-driven architecture, transformation, UML
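A toy illustration of the kind of mapping such transformation rules perform (this is our simplification in Python, not the authors' QVT rules or meta-models): one class is designated the fact, its numeric attributes become measures, and its associations become dimensions.

```python
# Toy UML-to-star-schema mapping: a class model is represented as plain
# dicts; the chosen fact class yields measures (numeric attributes) and
# dimensions (associated classes). All names here are illustrative.
def to_star_schema(fact_class, model):
    cls = model[fact_class]
    return {
        "fact": fact_class,
        "measures": [a for a, t in cls["attributes"].items()
                     if t in ("int", "float")],
        "dimensions": cls["associations"],
    }

model = {
    "Sale": {
        "attributes": {"amount": "float", "quantity": "int", "note": "str"},
        "associations": ["Product", "Store", "Date"],
    },
}
print(to_star_schema("Sale", model))
# → {'fact': 'Sale', 'measures': ['amount', 'quantity'],
#    'dimensions': ['Product', 'Store', 'Date']}
```

Real QVT rules operate on metamodel elements rather than dicts, but the fact/measure/dimension decisions they encode have this shape.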
Procedia PDF Downloads 160