Search results for: Data quality
28623 Forage Quality of Chickpea - Barley as Affected by Mixed Cropping System in Water Stress Condition
Authors: Masoud Rafiee
Abstract:
To study the forage quality response of chickpea-barley mixed cropping to drought stress and vermicompost application, an experiment was carried out under well-watered and 70% water requirement (stress) conditions, in an RCBD split-plot design with four replications, under the temperate conditions of Khorramabad in 2013. Five chickpea-barley mixtures (100% chickpea; 75:25, 50:50, and 25:75 chickpea:barley; and 100% barley) were studied. Results showed that wet and dry forage yields were significantly affected by environment and decreased under stress. Crude protein content also decreased from 26.2% in the well-watered condition to 17.3% under stress.
Keywords: crude protein, wet forage yield, dry forage yield, water stress condition, well watered
Procedia PDF Downloads 343
28622 Development of New Technology Evaluation Model by Using Patent Information and Customers' Review Data
Authors: Kisik Song, Kyuwoong Kim, Sungjoo Lee
Abstract:
Many global firms and corporations derive new technologies and opportunities by identifying vacant technology areas through patent analysis. However, previous studies failed to focus on technologies that promised continuous growth in their industrial fields, and most studies that derive new technology opportunities do not test practical effectiveness. Because previous studies depended on expert judgment, evaluating new technologies based on patent analysis was costly and time-consuming. This research therefore suggests a quantitative and systematic approach to technology evaluation indicators that uses both patent data and data from customer communities. The first step is to collect these two types of data, which are then used to construct the evaluation indicators and to apply them to the evaluation of new technologies. This type of data mining enables a new method of technology evaluation and better prediction of how new technologies will be adopted.
Keywords: data mining, evaluating new technology, technology opportunity, patent analysis
Procedia PDF Downloads 377
28621 The Energy Consumption by the Sector of Transport and Its Impact on Atmospheric Pollution
Authors: Mme Hamani Née Guessas Ghaniya
Abstract:
Transport underpins the development of trade and business, being a recognized determinant of economic and social development. The development of transport is at the center of the great development challenges facing countries, but it is also at the heart of great contradictions once we take into account the environmental issues linked to it, in particular questions of energy. Indeed, the energy consumed by the transport sector is one of the bigger concerns, because it is increasing and has a major impact on our environment. The main consequence is atmospheric pollution, which intensifies the greenhouse effect and causes global warming. This warming risks partially melting the polar ice caps, raising sea levels and flooding low-lying coastal zones, certain islands, and the deltas. The purpose of this communication is therefore to present the impact of the transport sector's energy consumption on air quality, showing its effects on health and on global warming.
Keywords: energy consumption, sector of transport, air quality, atmospheric pollution
Procedia PDF Downloads 331
28620 Anomaly Detection Based on System Log Data
Authors: M. Kamel, A. Hoayek, M. Batton-Hubert
Abstract:
With the increase of network virtualization and the disparity of vendors, continuous monitoring and detection of anomalies cannot rely on static rules; an advanced analytical methodology is needed to discriminate between ordinary events and unusual anomalies. In this paper, we focus on log data (textual data), which is a crucial source of information about network performance. We then introduce an algorithm used as a pipeline to help with the pretreatment of such data, group it into patterns, and dynamically label each pattern as an anomaly or not. Such tools provide users and experts with continuous real-time log monitoring capability to detect anomalies and failures in the underlying system that can affect performance. An application to real-world data illustrates the algorithm.
Keywords: logs, anomaly detection, ML, scoring, NLP
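The pretreatment-and-grouping step described above can be sketched as a frequency-based pattern labeller. This is a minimal illustration, not the authors' pipeline: the masking rules, the `log_pattern` helper, and the rarity threshold are all assumptions for the example.

```python
import re
from collections import Counter

def log_pattern(line):
    # Mask hex tokens and numbers so lines with the same "shape" share a pattern.
    line = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", line)
    return re.sub(r"\d+", "<NUM>", line)

def label_anomalies(lines, threshold=0.1):
    """Label a line as anomalous if its pattern's relative
    frequency in the log falls below `threshold`."""
    patterns = [log_pattern(l) for l in lines]
    counts = Counter(patterns)
    total = len(patterns)
    return [counts[p] / total < threshold for p in patterns]

logs = [
    "connection from 10.0.0.1 port 443",
    "connection from 10.0.0.2 port 443",
    "connection from 10.0.0.3 port 8080",
    "kernel panic at 0xdeadbeef",
]
flags = label_anomalies(logs, threshold=0.3)
print(flags)  # the kernel-panic line is the rare pattern
```

A production pipeline would replace the frequency test with a learned scoring model, but the masking-then-grouping structure is the same.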
Procedia PDF Downloads 94
28619 Smart Development Proposals for an Indian Village
Authors: J. E. M. Macwan, D. A. Pastagia, Reeta Meena
Abstract:
The Government of Gujarat (India) wishes to create smart villages to improve the quality of life, and the significance of these efforts will be higher rural productivity. The main aim of this research is to identify, design, and propose implementable planning solutions suited to an Indian village setting. The methodology adopted is to create a database through an on-site study and by gathering public opinion, helping the researchers satisfy rural needs. The outcome of this research is a planning design and the channeling of funds in the right direction for a better, results-oriented village. The proposals were accepted by the stakeholders after slight modifications. Planning solutions were designed through a public participatory approach, and villagers were offered a better quality of life in order to curb rural-urban migration.
Keywords: smart village, digitization, development plan, gram panchayats
Procedia PDF Downloads 131
28618 Study on the Stages of Knowledge Flow in Central Libraries of Tehran Universities by the Pattern of American Productivity & Quality Center
Authors: Amir Reza Asnafi, Ehsan Tajabadi, Mohsen Hajizeinolabedini
Abstract:
The purpose of this study is to examine knowledge flow in the central libraries of Tehran universities using the framework of the American Productivity & Quality Center (APQC). The study is applied in purpose and a descriptive survey in methodology, and the APQC framework was used for data collection. The study population consists of the managers and department supervisors of the central libraries of the public universities of Tehran belonging to the Ministry of Science, Research and Technology: the central libraries of Al-Zahra University, Amirkabir, Tarbiat Modares, Tehran, Khajeh Nasir Toosi University of Technology, Shahed, Sharif, Shahid Beheshti, Allameh Tabataba'i University, and the Iran University of Science and Technology. Owing to the limited size of this community, no sampling was performed; a census was conducted instead. The study showed that, on the seven dimensions of knowledge flow in the APQC model, these libraries are far from the desirable level, and many activities in the field of knowledge flow are needed to reach the ideal point; suggestions for reaching the desired level are therefore made in this study. A one-sample t-test showed that the libraries are at a poor level in the dimensions of knowledge acquisition, review, sharing, and access, and at a medium level in the dimensions of knowledge creation, identification, and use. A MANOVA (multivariate analysis of variance) showed no significant difference between the libraries in the dimensions of knowledge flow, i.e., the status of knowledge flow is at the same level across them, except for the knowledge-creation dimension, which differs slightly.
Keywords: knowledge flow, knowledge management, APQC, Tehran's academic university libraries
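The one-sample t-test used above can be illustrated with a short sketch; the Likert scores and the neutral midpoint of 3 below are hypothetical, not the study's data.

```python
import math

def one_sample_t(sample, mu0):
    """t statistic for H0: population mean == mu0 (e.g. a 'desirable level')."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)  # sample variance
    return (mean - mu0) / math.sqrt(var / n)

# Hypothetical 5-point Likert scores for one knowledge-flow dimension,
# tested against a neutral midpoint of 3.
scores = [2.1, 2.4, 2.0, 2.6, 2.3, 2.2]
t = one_sample_t(scores, mu0=3.0)
print(round(t, 2))  # strongly negative: the sample mean sits well below 3
```

A large negative t (compared against the t distribution with n-1 degrees of freedom) is what justifies the "poor level" conclusion for a dimension.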
Procedia PDF Downloads 164
28617 Evaluation of Zooplankton Community and Saprobi Index of Carp Culture Ponds: Case Study on East of Golestan Province-Gonbade Kavous City
Authors: Mehrdad Kamali-Sanzighi, Maziar Kamali-Sanzighi
Abstract:
The aim of this research was to study zooplankton community density and diversity and the saprobi index in carp ponds in Golestan province, Gonbade Kavous city, Iran. Zooplankton were sampled monthly in each pond during one carp culture season. Our analysis identified 27 genera from four groups (Protozoa 12, Rotatoria 8, Copepoda 4, and Cladocera 3). The frequencies of the zooplankton groups, from highest to lowest, were Rotatoria, Copepoda, Cladocera, and Protozoa at 46, 28, 23, and 3 percent, respectively. No significant differences were observed between the saprobi indices of the six carp ponds (P>0.05), and the saprobi index indicated the β-mesosaprobic class for all six ponds. There was a general and significant tendency for the saprobi index, with values ranging from 1.52 to 1.70, to decrease from the beginning to the end of the culture season (P<0.05). A gradual improvement of water quality was also observed toward the end of the culture period, partly a result of natural and management processes such as seasonal (climate) changes, water exchange (replacement of water), and the pause in introducing fertilizer materials to the ponds. Given the ability of the saprobi index to monitor the water quality and health of different water resources, similar focused research will be necessary in the future.
Keywords: zooplankton, saprobi pollution index, water quality, fish pond, east of Golestan Province
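A saprobic index of this kind is commonly computed with the Pantle-Buck formula, S = Σ(s·h)/Σh, weighting each taxon's saprobic value s by its abundance h. The sketch below assumes that formula and uses hypothetical taxa; the abstract does not state which variant the authors used.

```python
def saprobic_index(taxa):
    """Pantle-Buck saprobic index: S = sum(s_i * h_i) / sum(h_i),
    where s_i is a taxon's saprobic value and h_i its abundance."""
    num = sum(s * h for s, h in taxa)
    den = sum(h for _, h in taxa)
    return num / den

# Hypothetical taxa as (saprobic value, abundance) pairs.
taxa = [(1.5, 40), (2.0, 30), (1.3, 20), (2.5, 10)]
s = saprobic_index(taxa)
print(round(s, 2))  # values between 1.5 and 2.5 fall in the beta-mesosaprobic class
```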
Procedia PDF Downloads 99
28616 Effect of Different Spacings on Growth, Yield and Fruit Quality of Peach in the Sub-Tropics of India
Authors: Harminder Singh, Rupinder Kaur
Abstract:
Peach is primarily a temperate fruit, but its low-chilling cultivars are grown quite successfully in sub-tropical climates as well. The area under peach cultivation is expanding rapidly in the sub-tropics of northern India owing to the higher return per unit area and the availability of suitable peach cultivars and their production technology. Information on the use of different planting spacings for peach in the sub-tropics is inadequate. In this investigation, conducted at Punjab Agricultural University, Ludhiana (Punjab), India, trees of the Shan-i-Punjab peach were planted at four spacings, i.e., 6.0x3.0 m, 6.0x2.5 m, 4.5x3.0 m, and 4.5x2.5 m, and trained to the central leader system. Total radiation interception and penetration in the upper and lower canopy were higher in the trees planted at 6.0x3.0 m and 6.0x2.5 m than at the other spacings. Average radiation interception was greatest in the upper part of the tree canopy and decreased significantly with canopy depth at all spacings. Trees planted at the wider spacings produced more vegetative growth (tree height, girth, spread, and canopy volume) and reproductive growth (flower bud density, number of fruits, and fruit yield) per tree, but productivity was highest in the closely planted trees. Fruits harvested from the wider-spaced trees were superior in quality (size, weight, colour, TSS, and acidity) and matured earlier than those from the closer-spaced trees.
Keywords: quality, radiation, spacings, yield
Procedia PDF Downloads 188
28615 Embedded Digital Image System
Authors: Dawei Li, Cheng Liu, Yiteng Liu
Abstract:
This paper introduces an embedded digital image system for a Chinese space-environment vertical-exploration sounding rocket. To record the flight status of the sounding rocket and its payloads, an onboard embedded image processing system based on the ADV212, a JPEG2000 compression chip, is designed in this paper. Since the sounding rocket is not designed to be recovered, all image data must be transmitted to the ground station before re-entry, while the downlink band available for image transmission is only about 600 kbps. At the same compression ratio, the JPEG2000 standard achieves better image quality than other algorithms, so JPEG2000 image compression is applied under this condition of limited downlink bandwidth. The embedded image system supports lossless to 200:1 real-time compression, with two cameras to monitor nose ejection and motor separation and two cameras to monitor boom deployment. The encoder, an ADV7182, receives the PAL signal from a camera and outputs an ITU-R BT.656 signal to the ADV212; the ADV7182 switches between the four input video channels according to the program sequence. Two SRAMs are used for ping-pong operation, and one 512 Mb SDRAM buffers high-frame-rate images. The whole image system has low power dissipation, low cost, small size, and high reliability, which makes it well suited to this sounding-rocket application.
Keywords: ADV212, image system, JPEG2000, sounding rocket
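The downlink budget described above (600 kbps, up to 200:1 compression) can be sanity-checked with a little arithmetic. The frame geometry (PAL 720x576, 8-bit luma only) and the 300 s transmission window below are assumptions for illustration, not figures from the paper.

```python
def frames_transmittable(raw_frame_bits, compression_ratio, link_bps, window_s):
    """How many compressed frames fit through the downlink during the flight window."""
    compressed_bits = raw_frame_bits / compression_ratio
    return int(link_bps * window_s // compressed_bits)

# Assumed PAL frame at 720x576, 8 bits per pixel (luma only), 200:1 compression,
# 600 kbps downlink, hypothetical 300 s window before re-entry.
raw_bits = 720 * 576 * 8
print(frames_transmittable(raw_bits, 200, 600_000, 300))
```

Even a rough budget like this shows why aggressive compression is mandatory: at the full ~3.3 Mbit raw frame size, the 600 kbps link could carry well under one uncompressed frame per second.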
Procedia PDF Downloads 421
28614 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling
Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed
Abstract:
The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm for forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under two ANN structural configurations: (1) a single-hidden-layer and (2) a double-hidden-layer feedforward back-propagation network. The results revealed that, in general, the GDM algorithm, with its adaptive learning capability, used a relatively shorter time in both the training and validation phases than the LM and Br algorithms, though learning may not be consummated; this held in all instances, including the prediction of extreme flow conditions 1 day and 5 days ahead. In specific statistical terms, average model performance efficiency by the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96% for the training and validation phases, respectively. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on these findings, the adoption of ANNs for real-time forecasting should employ training algorithms without the computational overhead of LM, which requires computation of the Hessian matrix, takes protracted time, and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure, forecast quality, and the mitigation of network overfitting.
On the whole, it is recommended that evaluation also consider the implications of (i) data quality and quantity and (ii) transfer functions for overall network forecast performance.
Keywords: streamflow, neural network, optimisation, algorithm
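The CE statistic reported above is the Nash-Sutcliffe coefficient of efficiency, CE = 1 - Σ(O-S)²/Σ(O-Ō)². A minimal sketch with hypothetical streamflow values (not the study's data):

```python
def coefficient_of_efficiency(observed, simulated):
    """Nash-Sutcliffe coefficient of efficiency (CE):
    CE = 1 - sum((O - S)^2) / sum((O - mean(O))^2)."""
    mean_o = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_o) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

# Hypothetical daily streamflow (m^3/s): observed vs. an ANN forecast.
obs = [10.0, 12.0, 9.0, 14.0, 11.0]
sim = [10.5, 11.5, 9.5, 13.0, 11.0]
ce = coefficient_of_efficiency(obs, sim)
print(round(ce, 3))
```

CE = 1 means a perfect forecast and CE = 0 means the model is no better than predicting the observed mean, which is why the reported 94-98% values indicate strong fits.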
Procedia PDF Downloads 152
28613 Quantification of River Ravi Pollution and Oxidation Pond Treatment to Improve the Drain Water Quality
Authors: Yusra Mahfooz, Saleha Mehmood
Abstract:
With increasing industrialization and urbanization, effluents laden with diverse chemicals are contaminating rivers in developing countries. This study examined the wastewater quality of the four drains (Outfall, Gulshan-e-Ravi, Hudiara, and Babu Sabu) that enter the river Ravi in Lahore, Pakistan. Different pollution parameters were analyzed, including pH, DO, BOD, COD, turbidity, EC, TSS, nitrates, phosphates, sulfates, and fecal coliform; nearly all parameters of the drains exceeded the permissible wastewater standards. In the pollution-load calculation, the Hudiara drain showed the highest load in terms of COD, i.e., 429.86 tons/day, while the Babu Sabu drain showed the highest load in terms of BOD, i.e., 162.82 tons/day (due to the industrial and sewage discharge into it). A lab-scale treatment (oxidation ponds) was designed to treat the wastewater of the Babu Sabu drain using a combination of different algae species, i.e., Chaetomorpha sutoria, Sirogonium sticticum, and Zygnema sp. Two pond geometries (horizontal and vertical) and three concentrations of the algal sample (25 g/3 L, 50 g/3 L, and 75 g/3 L) were tested. After six days of treatment, 80 to 97% removal efficiency was found across the pollution parameters. In the vertical pond, the maximum reductions achieved were turbidity 62.12%, EC 79.3%, BOD 86.6%, COD 79.72%, FC 100%, nitrates 89.6%, sulphates 96.9%, and phosphates 85.3%; in the horizontal pond they were turbidity 69.79%, EC 83%, BOD 88.5%, COD 83.01%, FC 100%, nitrates 89.8%, sulphates 97%, and phosphates 86.3%. Overall, the maximum reduction after six days of treatment was achieved with the 50 g algae setup in the horizontal pond, owing to its larger surface area.
The results lead to the conclusion that algae-based treatments are highly energy efficient and can improve drain water quality cost-effectively.
Keywords: oxidation pond, ravi pollution, river water quality, wastewater treatment
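A pollution load in tons/day is simply flow times concentration with a unit conversion (1 mg/L = 1 g/m³; 86400 s/day; 10⁶ g/tonne). The flow and COD values below are hypothetical, chosen only so the result lands near the reported Hudiara COD load; the paper does not give the underlying flow figure.

```python
def pollution_load_tons_per_day(flow_m3_s, conc_mg_l):
    """Load (tonnes/day) = flow (m^3/s) * concentration (mg/L, i.e. g/m^3)
    * 86400 s/day / 1e6 g/tonne."""
    return flow_m3_s * conc_mg_l * 86400 / 1e6

# Hypothetical drain: 16 m^3/s carrying COD of 311 mg/L.
load = pollution_load_tons_per_day(16, 311)
print(round(load, 1))  # close to the ~429.86 tons/day reported for Hudiara COD
```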
Procedia PDF Downloads 298
28612 Food Safety Aspects of Pesticide Residues in Spice Paprika
Authors: Sz. Klátyik, B. Darvas, M. Mörtl, M. Ottucsák, E. Takács, H. Bánáti, L. Simon, G. Gyurcsó, A. Székács
Abstract:
The environmental and health safety of condiments used to spice food products, whether in food processing or by culinary means, receives relatively little attention, even though possible contamination of spices may affect food quality and safety. Contamination surveys mostly focus on microbial contaminants or their secondary metabolites, mycotoxins. Chemical contaminants, particularly pesticide residues, are nonetheless clearly substantial factors for certain condiments in the Capsicum family, including spice paprika and chilli. To assess food safety and support the quality of the Hungaricum product spice paprika, the pesticide residue status of spice paprika and chilli is assessed on the basis of the pesticide contamination cases and non-compliances reported in the Rapid Alert System for Food and Feed of the European Union since 1998.
Keywords: spice paprika, Capsicum, pesticide residues, RASFF
Procedia PDF Downloads 394
28611 The Impact of Financial Reporting on Sustainability
Authors: Lynn Ruggieri
Abstract:
The worldwide pandemic has only increased sustainability awareness, and the public is demanding that businesses be held accountable for their impact on the environment. While financial data enjoy uniform reporting requirements, there are no uniform reporting requirements for non-financial data. Europe is leading the way, with some standards being implemented for reporting non-financial sustainability data; however, there is no uniformity globally, and without uniformity there is no clear understanding of what information to include and how to disclose it. Sustainability reporting will provide important information to stakeholders and will enable businesses to understand their impact on the environment; there is therefore a crucial need for these data. This paper reviews the history of sustainability reporting in the countries of the European Union and throughout the world and makes a case for worldwide sustainability reporting requirements.
Keywords: financial reporting, non-financial data, sustainability, global financial reporting
Procedia PDF Downloads 178
28610 Postharvest Losses and Handling Improvement of Organic Pak-Choi and Choy Sum
Authors: Pichaya Poonlarp, Danai Boonyakiat, C. Chuamuangphan, M. Chanta
Abstract:
Consumer behavior has shifted toward greater health awareness, the well-being of society, and interest in nature and the environment. The Royal Project Foundation is therefore committed to organic agriculture: the project uses only natural products and exploits the biological merits of the highlands to increase the resistance of the produce grown to diseases and insects. The project has also drawn on basic knowledge from a variety of available research, including, but not limited to, improving soil fertility and controlling plant insects with biological methods, to lay a foundation for developing and promoting farmers to grow quality produce with high health safety. This should ultimately lead to a sustainable future for highland agriculture and a decrease in chemical use on the highlands, which are a source of natural watersheds. However, shortcomings remain in postharvest management in terms of quality and losses, such as bruising, rot, wilting, and yellowed leaves; these losses adversely affect the quality and shelf life of organic vegetables. It is therefore important to study appropriate and effective postharvest management for each organic vegetable in order to minimize product loss and find the root causes of postharvest losses, which would contribute to future postharvest-management best practices. This can be achieved through surveys and data collection across the postharvest processes, followed by analysis of the causes of postharvest losses of organic pak-choi, baby pak-choi, and choy sum, from which loss-reduction strategies can be devised.
In this study, postharvest losses of organic pak-choi, baby pak-choi, and choy sum were determined at each stage of the supply chain: in the field after harvesting, at the Development Center packinghouse, at the Chiang Mai packinghouse, at the Bangkok packinghouse, and at the Royal Project retail shop in Chiang Mai. The results showed that postharvest losses of organic pak-choi, baby pak-choi, and choy sum were 86.05, 89.05, and 59.03 percent, respectively. The main factors contributing to the losses were mechanical damage and parts that were underutilized or fell short of the minimum quality standard. Good practices were developed once the causes of the losses were identified: with appropriate postharvest handling and management, for example, temperature control, hygienic cleaning, and a shorter supply chain, postharvest losses of all the organic vegetables should be markedly reduced.
Keywords: postharvest losses, organic vegetables, handling improvement, shelf life, supply chain
Procedia PDF Downloads 475
28609 Religio-Cultural Ethos and Mental Health
Authors: Haveesha Buddhdev
Abstract:
The most important right of a human being in society is the freedom of expression, as stated in Articles 18 and 19 of the Universal Declaration of Human Rights pledged by the member states of the United Nations. Is it fair to expect a person to be of sound mental health if this right is taken away? Religion, as a primary social institution, controls many rights, freedoms, and duties of people in a society. It does so by imposing certain values and beliefs on people, which may either enhance their quality of life or curb their freedom, adversely affecting individual mental health. This paper aims to study the positive and negative roles that religion plays in influencing one's freedom of expression. It reviews existing studies on the positive and negative impacts of religion on mental health, together with data collected by the researcher on the impact of religion on freedom of expression, obtained by surveying a sample of 30 adolescents and young adults. The researcher will use a Likert scale for this purpose, with response options ranging from strongly disagree to strongly agree, quantified accordingly; descriptive statistics will be used to analyse the data. Such research helps identify the problems adolescents and young adults may face with respect to religio-cultural ethos and facilitates further research on the role religion plays in mental health.
Keywords: cultural ethos, freedom of expression, adolescent mental health, social science
Procedia PDF Downloads 449
28608 Applying Laser Scanning and Digital Photogrammetry for Developing an Archaeological Model Structure for an Old Castle in Germany
Authors: Bara' Al-Mistarehi
Abstract:
Documenting and assessing the conservation state of an archaeological structure is a significant procedure in any management plan, yet it has always been a challenge to do so with a low-cost, safe methodology, and it is time-demanding as well. A low-cost, efficient methodology for documenting the state of a structure is therefore needed. Within the scope of this research, this paper applies digital photogrammetry and laser scanning to one of the most significant structures in Germany, the Old Castle (German: Altes Schloss). The site is well known for its unique features; however, the castle suffers serious deterioration threats because of environmental conditions and the absence of continuous monitoring, maintenance, and repair plans. Digital photogrammetry is a generally accepted technique for collecting 3D representations of the environment, and this image-based technique has been used extensively to produce high-quality 3D models of heritage sites and historical buildings for documentation and presentation purposes. Additionally, terrestrial laser scanners are used, which directly measure 3D surface coordinates based on the run time of reflected light pulses; these systems feature high data-acquisition rates, good accuracy, and high spatial data density. Despite the potential of each single approach, maximum benefit is expected in this work from a combination of data from both digital cameras and terrestrial laser scanners. The paper investigates the usage, application, and advantages of the technique for building a highly realistic, 3D textured model of parts of the old castle. The model will serve as a tool for diagnosing the conservation state of the castle and as a means of monitoring future changes.
Keywords: digital photogrammetry, terrestrial laser scanners, 3D textured model, archaeological structure
Procedia PDF Downloads 178
28607 Methods and Algorithms of Ensuring Data Privacy in AI-Based Healthcare Systems and Technologies
Authors: Omar Farshad Jeelani, Makaire Njie, Viktoriia M. Korzhuk
Abstract:
The application of AI-powered algorithms in healthcare continues to flourish. In particular, access to healthcare information, including patient health history, diagnostic data, and PII (personally identifiable information), is paramount to the delivery of efficient patient outcomes. However, as the exchange of healthcare information between patients and healthcare providers through AI-powered solutions increases, protecting a person's information and privacy has become even more important. Arguably, the increased adoption of healthcare AI has concentrated attention on the security risks to healthcare data and the corresponding protection measures, leading to escalated analysis and enforcement. Since these challenges arise from the use of AI-based healthcare solutions to manage healthcare data, AI-based data protection measures are used to resolve the underlying problems. Consequently, this project proposes AI-powered safeguards and policies/laws to protect the privacy of healthcare data. The project presents best-in-class techniques for preserving the data privacy of AI-powered healthcare applications: popular privacy-protecting methods such as federated learning, cryptographic techniques, differential privacy, and hybrid methods are discussed together with potential cyber threats, data security concerns, and prospects. The project also discusses some of the relevant data security acts and laws that govern the collection, storage, and processing of healthcare data to guarantee that owners' privacy is preserved. Finally, the inquiry discusses various gaps and uncertainties associated with healthcare AI data collection procedures and identifies potential correction and mitigation measures.
Keywords: data privacy, artificial intelligence (AI), healthcare AI, data sharing, healthcare organizations (HCOs)
Procedia PDF Downloads 93
28606 Emotional Labour and Employee Performance Appraisal: The Missing Link in Some Hotels in South East Nigeria
Authors: Polycarp Igbojekwe
Abstract:
The main objective of this study was to determine whether emotional labour has become a criterion in performance appraisal, job descriptions, selection, and training schemes in the hotel industry in Nigeria. Our main assumption was that the majority of hotel organizations have not built emotional labour into their human resources management schemes. Data were gathered using structured questionnaires designed in Likert format, along with interviews; the focus group was the managers of the selected hotels. The analyses revealed that the majority of the hotels have not built emotional labour into their human resources schemes, particularly the 1-, 2-, and 3-star hotels, whose service employees have not been adequately trained to perform emotional labour, a critical factor in quality service delivery. Managers of 1-, 2-, and 3-star hotels have not given serious thought to emotional labour as a critical factor in quality service delivery. The study also revealed that the suitability of an individual's characteristics is not considered as a criterion for the selection or performance appraisal of service employees; the implication is that person-job fit is not seriously considered. A disconnect was observed between required emotional competency and its recognition, evaluation, and training. Based on these findings, it is concluded that the selection, training, job description, and performance appraisal instruments in use in hotels in Nigeria are inadequate, and the human resource implications of the findings are presented.
It is recommended that hotel organizations re-design and plan the emotional content and context of their human resources practices to reflect the emotional demands of front-line jobs in the hotel industry and the crucial role emotional labour plays during service encounters.
Keywords: emotional labour, employee selection, job description, performance appraisal, person-job-fit, employee compensation
Procedia PDF Downloads 19228605 Mapping Tunnelling Parameters for Global Optimization in Big Data via Dye Laser Simulation
Authors: Sahil Imtiyaz
Abstract:
One of the biggest challenges has emerged from the ever-expanding, dynamic, and instantaneously changing space of Big Data; finding a data point in this space, and extracting wisdom from it, is a hard task. In this paper, we reduce the space of big data to a Hamiltonian formalism that is in concordance with the Ising model. For this formulation, we simulate the system using a dye laser in FORTRAN and analyse the dynamics of the data point in the energy well of the rhodium atom. After mapping photon intensity and pulse width onto energy and potential, we conclude that as the energy increases, the probability of tunnelling also increases up to some point, after which it starts decreasing and then shows randomizing behaviour. This is due to decoherence with the environment and hence a loss of 'quantumness', which interprets the efficiency parameter and the extent of quantum evolution. The results strongly encourage the use of a 'topological property' as a source of information instead of the qubit.
Keywords: big data, optimization, quantum evolution, hamiltonian, dye laser, fermionic computations
Procedia PDF Downloads 19428604 Applying Different Stenography Techniques in Cloud Computing Technology to Improve Cloud Data Privacy and Security Issues
Authors: Muhammad Muhammad Suleiman
Abstract:
Cloud computing is a versatile concept that refers to a service allowing users to outsource their data without having to worry about local storage issues. However, the most pressing issue to be addressed is maintaining a secure and reliable data repository rather than relying on untrustworthy service providers. In this study, we look at how steganography approaches, in collaboration with digital watermarking, can greatly improve the system's effectiveness and data security when used for cloud computing. The main requirement of such frameworks, where data is transferred or exchanged between servers and users, is safe data management in cloud environments. Steganography in the cloud is among the most effective methods for safe communication. Steganography is a method of writing coded messages in such a way that only the sender and recipient can safely interpret and display the information hidden in the communication channel. This study presents a new text steganography method for hiding a loaded hidden English text file in a cover English text file to ensure data protection in cloud computing. Data protection, data-hiding capability, and time were all improved using the proposed technique.
Keywords: cloud computing, steganography, information hiding, cloud storage, security
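The abstract does not disclose the proposed embedding scheme. As a hedged illustration of text-in-text steganography in general (a well-known zero-width-character technique, not necessarily the authors' method), a message can be hidden invisibly inside a cover text like this:

```python
ZW0, ZW1 = "\u200b", "\u200c"  # zero-width space / zero-width non-joiner

def hide(cover: str, secret: str) -> str:
    """Embed `secret` in `cover` as invisible zero-width characters."""
    bits = "".join(f"{byte:08b}" for byte in secret.encode("utf-8"))
    payload = "".join(ZW1 if bit == "1" else ZW0 for bit in bits)
    head, sep, tail = cover.partition(" ")
    return head + payload + sep + tail  # payload sits after the first word

def reveal(stego: str) -> str:
    """Recover the hidden message from the zero-width characters."""
    bits = "".join("1" if c == ZW1 else "0"
                   for c in stego if c in (ZW0, ZW1))
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

stego = hide("Cloud storage cover text", "secret")
```

The stego text renders identically to the cover text, since zero-width characters have no visible glyph.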
Procedia PDF Downloads 192
28603 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics
Authors: Farhad Asadi, Mohammad Javad Mollakazemi
Abstract:
In this paper, Bayesian online inference in models of data series is constructed with a change-point algorithm, which separates the observed time series into independent segments and studies changes in the regime of the data through the related statistical characteristics. Variation in the statistical characteristics of time series data often represents separate phenomena in a dynamical system, such as a change in brain state reflected in EEG signal measurements or a change in an important regime of the data in many other dynamical systems. In this paper, a prediction algorithm for locating change points in time series data is simulated. It is verified that the shape of the proposed data distribution is an important factor for simpler and smoother fluctuation of the hazard-rate parameter and for better identification of change-point locations. Finally, the conditions under which the time series distribution affects the factors in this approach are explained and validated with different time series databases for several dynamical systems.
Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm
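The abstract does not specify its implementation; a minimal sketch of Bayesian online change-point detection in the Adams-MacKay style (constant hazard rate, Gaussian observations with known variance, conjugate Normal prior on the segment mean) looks like this, with all parameter values being illustrative assumptions:

```python
import math
import random

def bocpd_map_run_lengths(xs, hazard=0.01, mu0=0.0, kappa0=1.0, var=1.0):
    """Online Bayesian change-point detection for Gaussian data with known
    variance `var` and a Normal prior on the segment mean. Returns the
    most probable run length (time since last change) after each point."""
    R = [1.0]                      # posterior over run lengths
    mus, kappas = [mu0], [kappa0]  # per-run-length posterior statistics
    map_runs = []
    for x in xs:
        # predictive density of x under each run-length hypothesis
        preds = []
        for mu, kappa in zip(mus, kappas):
            pvar = var * (1.0 + 1.0 / kappa)  # predictive variance
            preds.append(math.exp(-(x - mu) ** 2 / (2.0 * pvar))
                         / math.sqrt(2.0 * math.pi * pvar))
        growth = [r * p * (1.0 - hazard) for r, p in zip(R, preds)]
        change = sum(r * p * hazard for r, p in zip(R, preds))
        R = [change] + growth
        z = sum(R)
        R = [w / z for w in R]
        # conjugate update of the Normal posterior, shifted by one run length
        mus = [mu0] + [(k * m + x) / (k + 1.0) for m, k in zip(mus, kappas)]
        kappas = [kappa0] + [k + 1.0 for k in kappas]
        map_runs.append(max(range(len(R)), key=R.__getitem__))
    return map_runs

# synthetic series: mean 0 for 50 points, then mean 5 (a simulated regime change)
random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(50)] + \
     [random.gauss(5.0, 1.0) for _ in range(50)]
map_runs = bocpd_map_run_lengths(xs)
```

The constant `hazard` here is the simplest choice; the abstract's point is precisely that the shape of the data distribution affects how smoothly such a hazard-rate parameter behaves.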
Procedia PDF Downloads 426
28602 Aspects Concerning the Use of Recycled Concrete Aggregates
Authors: Ion Robu, Claudiu Mazilu, Radu Deju
Abstract:
Natural aggregates (gravel and crushed stone) are essential non-renewable resources used for infrastructure works and civil engineering. In the European Union member states of Southeast Europe, it is estimated that the construction industry will grow by 4.2%, further complicating aggregate supply management. In addition, a significant problem associated with the aggregates industry is the wasting of potential resources through the dumping of inert waste, especially waste from construction and demolition activities. In 2012, in Romania, less than 10% of construction and demolition waste (including concrete) was valorized, while the European Union requires that by 2020 this proportion be at least 70% (Directive 2008/98/EC on waste, transposed into Romanian legislation by Law 211/2011). Depending on the efficiency of waste processing and the quality of the recycled concrete aggregate (RCA) obtained, poor-quality aggregate can be used as foundation material for roads, and high-quality aggregate in new concrete for construction. To obtain good-quality concrete using recycled aggregate, it is necessary to meet the minimum requirements defined by the rules for the manufacture of concrete with natural aggregate. The properties of the recycled aggregate (density, granulometry, granule shape, water absorption, weight loss in the Los Angeles test, attached mortar content, etc.) are the basis of concrete quality; establishing appropriate proportions between components and the concrete production methods are also extremely important for its quality. This paper presents a study on the use of recycled aggregates, from a concrete of specified class, to obtain new cement concrete with different percentages of recycled aggregate. To produce the recycled aggregates, several batches of concrete of classes C16/20, C25/30 and C35/45 were made, the composition calculations being carried out according to NE 012/2007 and CP 012/2007.
Tests for producing recycled aggregate were carried out using concrete samples of the three established classes after 28 days of storage under the above conditions. Cubes with 150 mm sides were crushed in a first stage with a Liebherr jaw crusher set at a nominal 50 mm. The resulting material was separated by sieving into granulometric sorts, and the 10-50 sort was used for preliminary crushing tests in the second stage with a Retsch BB 200 jaw crusher and a Buffalo Shuttle WA-12-H hammer crusher, respectively. The influence of the type of crusher used to obtain the recycled aggregates on granulometry and granule shape was highlighted, as was the influence of the attached mortar on density, water absorption, behaviour in the Los Angeles test, etc. The proportion of attached mortar was determined and correlated with the concrete class of provenance of the recycled aggregates and their granulometric sort. The aim of characterizing the recycled aggregates is their valorization in new concrete used in construction. In this regard, a series of concretes was made in which the recycled aggregate content varied from 0 to 100%. The new concretes were characterized in terms of the change in density and compressive strength with the proportion of recycled aggregates. It was shown that an increase in recycled aggregate content does not necessarily mean a reduction in compressive strength, the quality of the aggregate having a decisive role.
Keywords: recycled concrete aggregate, characteristics, recycled aggregate concrete, properties
Procedia PDF Downloads 215
28601 Compromising Quality of Life in Low-Income Settlements: The Case of Ashrayan Prakalpa, Khulna
Authors: Salma Akter, Md. Kamal Uddin
Abstract:
Ashrayan (shelter) Prakalpa is a fully subsidized ‘integrated poverty eradication program’ of the Government of Bangladesh (GoB), provisioning shelter for the internally displaced and homeless. In spite of the inclusiveness of the shelter policy (poverty alleviation, employment opportunity, tenureship and training), the dwellers' 'quality of life' remains in question. This study demonstrates how top-down policies and the ambiguous ownership status of land and dwelling environments lead to ‘everyday compromise’ by the grassroots in both subjective (satisfaction, comfort and safety) and objective (physical design elements and physical environmental elements) issues at three respective scales: macro (neighborhood), meso (shelter/built environment) and micro (family). It shows that by becoming subject to the Government’s resettlement policies and users of its shelter units (locally known as ‘barracks’ rather than shelter or housing), the once-displaced settlers assume a curious form of spatial practice in which both the social and the spatial often bear slippery meanings. Thus, policy-based shelter forces the dwellers to compromise frequently, both overtly and covertly, with the provided built environments and spaces within the settlements. Compromises are made during the production of space and form, whereby interesting new spaces and space-making practices emerge. The settlement under study is Dakshin Chandani Mahal Ashrayan Prakalpa, located at the eastern fringe of Khulna, Bangladesh. In terms of methodology, this research is primarily exploratory and assumes a qualitative approach. Key tools used to obtain information are policy analysis, literature review, key informant interviews, focus group discussions and participant observation at the level of dwelling and settlement. Necessary drawings and photographs have been taken to support the study objective.
Findings revealed that various shortages, inadequacies and the negligence of policymakers give a compromising character to the lives of the displaced in terms of 'quality of life', on both objective and subjective grounds. The study therefore ends with a recommendation that policymakers take the initiative to ensure the quality of life of the dwellers.
Keywords: Ashrayan, compromise, displaced people, quality of life
Procedia PDF Downloads 338
28600 Determination of the Risks of Heart Attack at the First Stage as Well as Their Control and Resource Planning with the Method of Data Mining
Authors: İbrahi̇m Kara, Seher Arslankaya
Abstract:
Frequently preferred in the field of engineering in particular, data mining has now begun to be used in the field of health as well, since the data in the health sector have reached great dimensions. Data mining aims to reveal models from large amounts of raw data in agreement with a given purpose, and to search for the rules and relationships that enable predictions about the future to be made from a large data set. It helps the decision-maker find the relationships among the data that matter at the stage of decision-making. In this study, the aim is to determine the risk of heart attack at the first stage, to control it, and to plan resources with the method of data mining. Through the early and correct diagnosis of heart attacks, the aims are to reveal the factors that affect the disease, to protect health and choose the right treatment methods, to reduce the costs of health expenditures, and to shorten the duration of patients’ stays in hospital. In this way, the diagnosis and treatment costs of a heart attack can be scrutinized, which is useful for determining the risk of the disease at the first stage, controlling it, and planning its resources.
Keywords: data mining, decision support systems, heart attack, health sector
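The abstract does not name a specific classifier; the simplest building block of the decision trees commonly used in medical data mining is a single-feature threshold rule (a decision stump). The sketch below learns one from a tiny table of invented patient records; the features, values and labels are illustrative only, not clinical data or guidance:

```python
# invented records: (age, systolic_bp, cholesterol, smoker) -> 1 = high risk
DATA = [
    ((44, 120, 180, 0), 0), ((51, 130, 190, 0), 0),
    ((38, 118, 170, 0), 0), ((47, 125, 200, 1), 0),
    ((63, 160, 260, 1), 1), ((58, 150, 240, 1), 1),
    ((66, 165, 250, 0), 1), ((71, 170, 280, 1), 1),
]

def best_stump(data):
    """Exhaustively search single-feature threshold rules and return the
    one with the fewest training errors: (feature_index, threshold, errors)."""
    best = None
    for f in range(len(data[0][0])):
        for thr in sorted({x[f] for x, _ in data}):
            errors = sum(int(x[f] >= thr) != y for x, y in data)
            if best is None or errors < best[2]:
                best = (f, thr, errors)
    return best

feature, threshold, errors = best_stump(DATA)
```

On this toy table the search recovers an age threshold that separates the two groups perfectly; on real clinical data such stumps are combined into full trees or ensembles.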
Procedia PDF Downloads 356
28599 Application of Metaverse Service to Construct Nursing Education Theory and Platform in the Post-Pandemic Era
Authors: Chen-Jung Chen, Yi-Chang Chen
Abstract:
While traditional virtual reality and augmented reality only allow small-movement learning and cannot provide a truly immersive teaching experience, giving only the illusion of movement, the new metaverse technology, combining content creation with immersive interactive simulation, can come infinitely close to a natural teaching situation. However, the theory of the metaverse's mixed-reality virtual classroom has not yet been explored, and it is rarely implemented in the situational simulation teaching of nursing education. Therefore, in the first year, the study intends to use grounded theory and case study methods, with in-depth interviews of nursing education and information experts, and to analyse the interview data to investigate the uniqueness of metaverse development. The proposed analysis will lead to alternative theories and methods for the development of nursing education. In the second year, it plans to integrate metaverse virtual situation simulation technology into an alternative teaching strategy in the pediatric nursing technology course and to explore nursing students' use of this teaching method as the construction of personal technique and experience, leveraging the unique features of distinct teaching platforms and development processes to deliver alternative teaching strategies in a nursing technology teaching environment. The aim is to increase learning achievements without compromising teaching quality and teacher-student relationships in the post-pandemic era. A descriptive and convergent mixed-methods design will be employed. Sixty third-year nursing students will be recruited to participate in the research and complete the pre-test.
The students in the experimental group (N=30) agreed to participate in 4 real-time mixed virtual situation simulation courses as self-practice after class, with qualitative interviews conducted after every 2 virtual situation courses; the control group (N=30) adopted traditional methods of self-directed practice after class. Both groups of students took a post-test after the course. Data analysis will adopt descriptive statistics, paired t-tests, one-way analysis of variance, and qualitative content analysis. This study addresses key issues in the virtual reality environment for teaching and learning within the metaverse, providing valuable lessons and insights for enhancing the quality of education. The findings are expected to contribute useful information for the future development of digital teaching and learning in nursing and other practice-based disciplines.
Keywords: metaverse, post-pandemic era, online virtual classroom, immersive teaching
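The pre-test/post-test comparison described above rests on the paired t statistic, computed from the per-student score differences. As a hedged sketch (the scores below are invented, not study data):

```python
import math

def paired_t(before, after):
    """Paired t statistic and degrees of freedom for matched pre/post scores."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))  # sample SD
    return mean / (sd / math.sqrt(n)), n - 1

# invented pre/post scores for 5 students
t_stat, df = paired_t([60, 65, 70, 55, 62], [72, 70, 78, 66, 69])
```

The resulting t is compared against the Student-t critical value for the given degrees of freedom, exactly as a statistics package would do.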
Procedia PDF Downloads 68
28598 Tunnel Convergence Monitoring by Distributed Fiber Optics Embedded into Concrete
Authors: R. Farhoud, G. Hermand, S. Delepine-lesoille
Abstract:
Cigeo, the future French underground facility for radioactive waste disposal, is designed to store intermediate-level and high-level long-lived French radioactive waste. Intermediate-level waste cells are tunnel-like, about 400 m long with a 65 m² cross-section, equipped with several concrete layers, which can be grouted in situ or composed of pre-grouted tunnel elements. The operating space in the cells, which allows waste containers to be placed or removed, should be monitored for several decades without any maintenance. To provide the required information, a design was developed and tested in situ in Andra’s underground research laboratory (URL), 500 m below the surface. Based on distributed optical fiber sensors (OFS), using Brillouin backscattering for strain and Raman backscattering for temperature interrogation, the design consists of two loops of OFS, at two different radii around the monitored section (orthoradial strains) and longitudinally. Strains measured by the distributed OFS cables were compared to classical vibrating wire extensometers (VWE) and platinum probes (Pt). The OFS installation comprised two cables sensitive to strain and temperature and one sensitive only to temperature. All cables were connected, between the sensitive part and the instruments, to hybrid cables to reduce cost. The connection was made according to two techniques: splicing the fibers in situ after installation, or preparing each fiber with a connector and simply plugging them together in situ. Another challenge was installing the OFS cables without interruption along a tunnel made of several parts. The first success was the survival rate of the sensors after installation and the quality of the measurements. Indeed, 100% of the OFS cables intended for long-term monitoring survived installation. A few new configurations were tested with relative success. The measurements obtained were very promising.
Indeed, after 3 years of data, no difference was observed between the OFS cables and connection methods, and the strains fit well with the VWE and Pt placed at the same locations. Data from the Brillouin instrument, which is sensitive to both strain and temperature, were compensated with data from the Raman instrument, which is sensitive only to temperature and interrogates a separate fiber. These results provide confidence in the next steps of the qualification process, which consists of testing several data treatment approaches for direct analysis.
Keywords: monitoring, fiber optic, sensor, data treatment
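The compensation mentioned above follows from the Brillouin frequency shift depending linearly on both strain and temperature, Δν_B = C_ε·ε + C_T·ΔT; with ΔT supplied by the Raman instrument, the strain term can be isolated. A hedged sketch, using typical single-mode-fiber coefficients rather than any project-specific calibration:

```python
def compensated_strain(dnu_brillouin_mhz, dT_raman_c, c_eps=0.05, c_T=1.0):
    """Strain (in microstrain) from a Brillouin frequency shift after removing
    the temperature contribution measured by the Raman instrument.
    c_eps (MHz per microstrain) and c_T (MHz per deg C) are typical textbook
    values for standard single-mode fiber, not Cigeo calibration data."""
    return (dnu_brillouin_mhz - c_T * dT_raman_c) / c_eps

# 30 MHz total shift, of which 10 deg C of warming explains 10 MHz
strain = compensated_strain(30.0, 10.0)
```

In practice the coefficients are calibrated per cable, which is one reason the co-located VWE and Pt references matter.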
Procedia PDF Downloads 129
28597 Bayesian Borrowing Methods for Count Data: Analysis of Incontinence Episodes in Patients with Overactive Bladder
Authors: Akalu Banbeta, Emmanuel Lesaffre, Reynaldo Martina, Joost Van Rosmalen
Abstract:
Including data from previous studies (historical data) in the analysis of a current study may reduce the sample size requirement and/or increase the power of the analysis. The most common example is incorporating historical control data in the analysis of a current clinical trial. However, this only applies when the historical control data are similar enough to the current control data. Recently, several Bayesian approaches for incorporating historical data have been proposed, such as the meta-analytic-predictive (MAP) prior and the modified power prior (MPP), both for a single control arm as well as for multiple historical control arms. Here, we examine the performance of the MAP and MPP approaches for the analysis of (over-dispersed) count data. To this end, we propose a computational method for the MPP approach for the Poisson and negative binomial models. We conducted an extensive simulation study to assess the performance of the Bayesian approaches. Additionally, we illustrate our approaches on an overactive bladder data set. For similar data across the control arms, the MPP approach outperformed the MAP approach with respect to statistical power. When the means across the control arms are different, the MPP yielded a slightly inflated type I error (TIE) rate, whereas the MAP did not. In contrast, when the dispersion parameters are different, the MAP gave an inflated TIE rate, whereas the MPP did not. We conclude that the MPP approach is more promising than the MAP approach for incorporating historical count data.
Keywords: count data, meta-analytic prior, negative binomial, Poisson
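For the Poisson case, the effect of a power prior is easy to see in the conjugate Gamma posterior: historical counts enter the sufficient statistics discounted by a weight delta. The sketch below uses a fixed delta for illustration (the modified power prior studied in the abstract treats delta as random), and the counts and hyperparameters are invented:

```python
def power_prior_posterior(y_hist, y_curr, delta, a0=0.5, b0=0.001):
    """Gamma(a, b) posterior for a Poisson rate when historical counts are
    discounted by a fixed power-prior weight delta in [0, 1]; delta = 0
    ignores the historical arm, delta = 1 pools it fully."""
    a = a0 + delta * sum(y_hist) + sum(y_curr)
    b = b0 + delta * len(y_hist) + len(y_curr)
    return a, b  # posterior mean is a / b

# invented weekly incontinence-episode counts (not trial data)
hist, curr = [3, 4, 5], [2, 3]
a_no, b_no = power_prior_posterior(hist, curr, delta=0.0)
a_full, b_full = power_prior_posterior(hist, curr, delta=1.0)
```

Because the historical mean (4.0) exceeds the current mean (2.5), full borrowing pulls the posterior rate upward, which is exactly the behavior that inflates the type I error rate when the arms are not actually similar.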
Procedia PDF Downloads 117
28596 Strategic Citizen Participation in Applied Planning Investigations: How Planners Use Etic and Emic Community Input Perspectives to Fill in the Gaps in Their Analysis
Authors: John Gaber
Abstract:
Planners regularly use citizen input as empirical data to help them better understand community issues they know very little about. This type of community data is based on the lived experiences of local residents and is known as "emic" data. What is becoming more common practice is planners' use of data from local experts and stakeholders (known as "etic" data, or the outsider perspective) to help them fill in the gaps in their analysis of applied planning research projects. Utilizing international Health Impact Assessment (HIA) data, I look at whom planners invite to their citizen input investigations. Research presented in this paper shows that planners access a wide range of emic and etic community perspectives in their search for the “community’s view.” The paper concludes with how planners can chart a new empirical path in their execution of emic/etic citizen participation strategies in applied planning research projects.
Keywords: citizen participation, emic data, etic data, Health Impact Assessment (HIA)
Procedia PDF Downloads 484
28595 Assessing the Social Comfort of the Russian Population with Big Data
Authors: Marina Shakleina, Konstantin Shaklein, Stanislav Yakiro
Abstract:
The digitalization of modern human life over the last decade has facilitated the acquisition, storage, and processing of data, which are used to detect changes in consumer preferences and to improve the internal efficiency of the production process. This emerging trend has attracted academic interest in the use of big data in research. The study focuses on modeling the social comfort of the Russian population for the period 2010-2021 using big data, which provides enormous opportunities for understanding human interactions at the scale of society, with plenty of spatial and temporal dynamics. One of the most popular big data sources is Google Trends. The methodology for assessing social comfort using big data involves several steps: 1. 574 words were selected based on the Harvard IV-4 Dictionary, adjusted to fit the reality of everyday Russian life. The set of keywords was further cleansed by excluding queries consisting of verbs and words with several lexical meanings. 2. Search queries were processed to ensure comparability of results: transformation of the data to a 10-point scale, elimination of popularity peaks, detrending, and deseasoning. The proposed methodology for keyword search and Google Trends processing was implemented as a script in the Python programming language. 3. Block and summary integral indicators of social comfort were constructed using the first modified principal component, which yields the weighting coefficients of the block components. According to the study, social comfort is described by 12 blocks: ‘health’, ‘education’, ‘social support’, ‘financial situation’, ‘employment’, ‘housing’, ‘ethical norms’, ‘security’, ‘political stability’, ‘leisure’, ‘environment’, ‘infrastructure’. According to the model, the summary integral indicator increased by 54%, to 4.631 points; the average annual growth rate was 3.6%, which is 2.7 p.p. higher than the rate of economic growth.
The value of the indicator describing social comfort in Russia is determined 26% by ‘social support’, 24% by ‘education’, 12% by ‘infrastructure’, 10% by ‘leisure’, and the remaining 28% by the other blocks. Among the 25% most popular searches, 85% are negative in nature and mainly related to the blocks ‘security’, ‘political stability’, and ‘health’, for example, ‘crime rate’ and ‘vulnerability’. Among the 25% least popular queries, 99% were positive and mostly related to the blocks ‘ethical norms’, ‘education’, and ‘employment’, for example, ‘social package’ and ‘recycling’. In conclusion, introducing the latent category ‘social comfort’ into the scientific vocabulary deepens the theory of the quality of life of the population by studying the involvement of the individual in society and expanding the subjective aspect of the measurement of various indicators. The integral assessment of social comfort demonstrates the overall development of the phenomenon over time and space and quantitatively evaluates ongoing socio-economic policy. The application of big data to the assessment of latent categories gives stable results, which opens up possibilities for their practical implementation.
Keywords: big data, Google Trends, integral indicator, social comfort
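Step 2 of the methodology above (rescaling the Google Trends series to a 10-point scale and deseasoning) can be sketched in a few lines of Python; this is a generic illustration, not the authors' script, and the sample values are invented:

```python
def to_ten_point(series):
    """Min-max rescale a query-popularity series to a 0-10 range."""
    lo, hi = min(series), max(series)
    return [10.0 * (v - lo) / (hi - lo) for v in series]

def deseason(series, period=12):
    """Subtract the mean of each position in the seasonal cycle
    (e.g. each calendar month for monthly Google Trends data)."""
    means = [sum(series[i::period]) / len(series[i::period])
             for i in range(period)]
    return [v - means[i % period] for i, v in enumerate(series)]

scaled = to_ten_point([20, 45, 70, 95])          # invented popularity values
flat = deseason([1, 2, 3, 1, 2, 3, 1, 2, 3], period=3)  # purely seasonal toy series
```

A purely seasonal series comes out flat, showing that only the non-seasonal signal survives into the block indicators.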
Procedia PDF Downloads 200
28594 Data Augmentation for Automatic Graphical User Interface Generation Based on Generative Adversarial Network
Authors: Xulu Yao, Moi Hoon Yap, Yanlong Zhang
Abstract:
As a branch of artificial neural networks, deep learning is widely used in the field of image recognition, but the lack of suitable datasets leads to imperfect model learning. By analysing the data-scale requirements of deep learning and aiming at the application to GUI generation, it is found that collecting a GUI dataset is a time-consuming and labor-intensive project, which makes it difficult to meet the needs of current deep learning networks. To solve this problem, this paper proposes a semi-supervised deep learning model that relies on an original small-scale dataset to produce a large number of reliable samples. By combining a recurrent neural network with a generative adversarial network, the recurrent neural network can learn the sequence relationships and characteristics of the data and make the generative adversarial network generate reasonable data, thereby expanding the Rico dataset. Relying on this network structure, the characteristics of the collected data can be well analysed, and a large amount of reasonable data can be generated according to these characteristics. After data processing, a reliable dataset for model training can be formed, which alleviates the problem of dataset shortage in deep learning.
Keywords: GUI, deep learning, GAN, data augmentation
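The RNN + GAN pipeline itself requires a deep learning framework and the real Rico dataset; as a drastically simplified, framework-free stand-in, a first-order transition model over GUI component sequences shows the core idea of generating new-but-plausible sequences from a small corpus (the component names and corpus below are invented for illustration):

```python
import random

# tiny stand-in corpus of GUI component sequences
SEQS = [
    ["toolbar", "image", "text", "button"],
    ["toolbar", "text", "text", "button"],
    ["toolbar", "image", "button"],
]

def learn_transitions(seqs):
    """Collect first-order transition options between GUI components,
    a much-simplified stand-in for the RNN's sequence model."""
    table = {}
    for seq in seqs:
        for a, b in zip(seq, seq[1:] + ["<end>"]):
            table.setdefault(a, []).append(b)
    return table

def generate(table, start="toolbar", max_len=10):
    """Sample a new component sequence; every step is a transition actually
    observed in the corpus, so generated layouts remain 'reasonable'."""
    seq, cur = [start], start
    while cur in table and len(seq) < max_len:
        cur = random.choice(table[cur])
        if cur == "<end>":
            break
        seq.append(cur)
    return seq

table = learn_transitions(SEQS)
random.seed(3)
sample = generate(table)
```

The constraint that every generated transition was seen in the training data mirrors, in miniature, the role of the discriminator: rejecting sequences that do not look like real GUI layouts.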
Procedia PDF Downloads 184