Search results for: real estate price prediction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8224

7264 Structure Conduct and Performance of Rice Milling Industry in Sri Lanka

Authors: W. A. Nalaka Wijesooriya

Abstract:

The increasing paddy production, the stabilization of domestic rice consumption, and the growing dynamism of rice processing and domestic markets call for a rethinking of the general direction of the rice milling industry in Sri Lanka. The main purpose of the study was to explore the level of concentration in the rice milling industry in Polonnaruwa and Hambanthota, which are the country's major rice milling hubs. Concentration indices reveal that the rice milling industry operates as a weak oligopsony in Polonnaruwa and is highly competitive in Hambanthota. By actual quantity of paddy milled per day, 47% of mills process less than 8 Mt/day, 34% process 8-20 Mt/day, and the rest (19%) process more than 20 Mt/day. In Hambanthota, nearly 50% of the mills fall in the 8-20 Mt/day range. Lack of experience in the milling industry, poor knowledge of milling technology, lack of capital, and difficulty finding an output market are the major entry barriers to the industry. The major problems faced by all rice millers are the lack of a uniform electricity supply and low-quality paddy. Many of the millers emphasized that the rice ceiling price is a constraint on producing quality rice. More than 80% of the millers in Polonnaruwa, the major parboiled rice producing area, have mechanical dryers. Nearly 22% of millers have modern machinery such as color sorters and water jet polishers. The major paddy purchasing channel of large-scale millers in Polonnaruwa is through brokers, whereas in Hambanthota the major channel is millers purchasing directly from paddy farmers. Millers in both districts sell rice mainly in Colombo and its suburbs. Large variation can be observed in the amount of pledge loans (for paddy storage). There is a strong relationship among storage capacity, credit affordability, and the scale of operation of rice millers. The inter-annual price fluctuation ranged from 30% to 35%. Analysis of market margins using a series of secondary data shows that the farmers' share of the rice consumer price is stable or slightly increasing in both districts, with a greater share going to the farmer in Hambanthota. Only four mills have obtained Good Manufacturing Practices (GMP) certification from the Sri Lanka Standards Institution, and all of them are small-quantity rice exporters. Priority should be given to small and medium scale millers in the distribution of PMB stored paddy during the off season. The industry needs a proper rice grading system, and it is recommended to introduce a ceiling price based on rice graded according to the standards. Both husk and rice bran are underutilized; encouraging investment in establishing a rice oil manufacturing plant in the Polonnaruwa area is highly recommended. The current taxation procedure needs to be restructured in order to ensure the sustainability of the industry.

Keywords: conduct, performance, structure (SCP), rice millers

Procedia PDF Downloads 328
7263 A Systematic Review on Orphan Drug Pricing and Pricing Challenges

Authors: Seyran Naghdi

Abstract:

Background: Orphan drug development is limited by the very high costs attributed to research and development and the small market size. How health policymakers address this challenge, considering both the supply and demand sides, needs to be explored in order to direct policies and plans in the right way. The price is an important signal for pharmaceutical companies' profitability and for patients' accessibility as well. Objective: This study aims to find out the orphan drug price-setting patterns and approaches in health systems through a systematic review of the available evidence. Methods: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) approach was used. MEDLINE, Embase, and Web of Science were searched via appropriate search strategies. Using Medical Subject Headings (MeSH), the appropriate terms were 'cost and cost analysis' for pricing, and 'orphan drug production' and 'orphan drug' for orphan drugs. The critical appraisal was performed with the Joanna Briggs tool. A Cochrane data extraction form was used to obtain the data about the studies' characteristics, results, and conclusions. Results: In total, 1,197 records were found: 640 hits from Embase, 327 from Web of Science, and 230 from MEDLINE. After removing duplicates, 1,056 studies remained. Of these, 924 studies were removed in the primary screening phase, and 26 studies were included for data extraction. The majority of the studies (>75%) are from developed countries, and among them, approximately 80% are from European countries. Approximately 85% of the evidence has been produced in the last decade. Conclusions: There is huge variation in price-setting among countries, and this is related to the specific pharmaceutical market structure and the thresholds at which governments are willing to intervene in the pricing process. On the other hand, there is some evidence that there is room to reduce the very high costs of orphan drug development through early agreements between pharmaceutical firms and governments. Further studies need to focus on how governments could incentivize companies to agree to provide the drugs at lower prices.

Keywords: orphan drugs, orphan drug production, pricing, costs, cost analysis

Procedia PDF Downloads 163
7262 Numerical Prediction of Entropy Generation in Heat Exchangers

Authors: Nadia Allouache

Abstract:

The second-law concept is important for optimizing energy losses in heat exchangers. The present study is devoted to the numerical prediction of entropy generation due to heat transfer and friction in a double-tube heat exchanger partly or fully filled with a porous medium. The goal of this work is to find the optimal conditions that allow minimizing entropy generation. For this purpose, numerical modeling based on the control volume method is used to describe the flow and heat transfer phenomena in the fluid and the porous medium. The effects of the porous layer thickness, its permeability, and the effective thermal conductivity have been investigated. Unexpectedly, the fully porous heat exchanger yields lower entropy generation than the partly porous case or the fluid-only case, even though friction increases the entropy generation.
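
For context, the local entropy generation rate that such studies typically minimize is the sum of heat-transfer, fluid-friction and, in the porous layer, Darcy contributions; a standard textbook form, not an equation quoted from the paper, is:

```latex
% Local rate of entropy generation per unit volume in a 2-D flow with a porous
% layer; k is thermal conductivity, \mu viscosity, K permeability, and \Phi
% the viscous dissipation function (standard form, for reference only).
S_{gen}''' \;=\;
  \frac{k}{T^{2}}\!\left[\left(\frac{\partial T}{\partial x}\right)^{2}
                        +\left(\frac{\partial T}{\partial y}\right)^{2}\right]
  \;+\; \frac{\mu}{T}\,\Phi
  \;+\; \frac{\mu}{K\,T}\,\lvert\mathbf{u}\rvert^{2}
```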

Keywords: heat exchangers, porous medium, second law approach, turbulent flow

Procedia PDF Downloads 300
7261 Virtualization of Production Using Digital Twin Technology

Authors: Bohuslava Juhasova, Igor Halenar, Martin Juhas

Abstract:

The contribution deals with the current situation in modern manufacturing enterprises, which is affected by the digital virtualization of different parts of the production process. The overview part of this article points to the fact that the broad informatization of all areas is, in practice, substituting real elements and the relationships between them with their digital, often virtual, images. Key characteristics of systems implemented using digital twin technology, along with essential conditions for the deployment of intelligent products, were identified across many published studies. The goal was to propose a template for realizing a production system using digital twin technology as a supplement to the standardized concepts of Industry 4.0. The main resulting idea is that the current trend of implementing new technologies and new ways of communication between industrial facilities erases the boundaries between the real environment and the virtual world.

Keywords: communication, digital twin, Industry 4.0, simulation, virtualization

Procedia PDF Downloads 248
7260 Determinants of International Volatility Passthroughs of Agricultural Commodities: A Panel Analysis of Developing Countries

Authors: Tetsuji Tanaka, Jin Guo

Abstract:

The extant literature has not succeeded in uncovering the common determinants of price volatility transmission of agricultural commodities from international to local markets, and it has rarely investigated the role of self-sufficiency measures in the context of national food security. We analyzed various factors to determine the degree of price volatility transmission of wheat, rice, and maize between world and domestic markets using GARCH models with dynamic conditional correlation (DCC) specifications and panel-feasible generalized least squares models. We found that a grain autarky system has the potential to diminish volatility pass-throughs for the three grain commodities. Furthermore, it was discovered that substitutive consumption behavior between maize and wheat buffers the volatility transmission of both, but rice does not function as a transmission-relieving element for the volatility of either wheat or maize. The effectiveness of grain consumption substitution in insulating domestic markets from global pass-throughs is greater than that of cereal self-sufficiency. These implications are extremely beneficial for developing-country governments seeking to protect their domestic food markets from uncertainty in foreign countries and, as such, to improve food security.
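
For readers unfamiliar with the DCC specification mentioned above, the standard GARCH(1,1)-DCC equations are, in the usual notation (not quoted from the paper):

```latex
% Univariate GARCH(1,1) conditional variance for each return series i
% (world and domestic prices), followed by the DCC recursion linking them.
% z_t are the standardized residuals and R_t the time-varying correlation matrix.
h_{i,t} = \omega_i + \alpha_i \varepsilon_{i,t-1}^{2} + \beta_i h_{i,t-1},
\qquad
Q_t = (1 - a - b)\,\bar{Q} + a\, z_{t-1} z_{t-1}^{\top} + b\, Q_{t-1},
\qquad
R_t = \operatorname{diag}(Q_t)^{-1/2}\, Q_t\, \operatorname{diag}(Q_t)^{-1/2}
```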

Keywords: food security, GARCH, grain self-sufficiency, volatility transmission

Procedia PDF Downloads 155
7259 The Role of Artificial Intelligence in Concrete Constructions

Authors: Ardalan Tofighi Soleimandarabi

Abstract:

Artificial intelligence has revolutionized the concrete construction industry and improved processes by increasing efficiency, accuracy, and sustainability. This article examines the applications of artificial intelligence in predicting the compressive strength of concrete, optimizing mixing plans, and improving structural health monitoring systems. Artificial intelligence-based models, such as artificial neural networks (ANN) and combined machine learning techniques, have shown better performance than traditional methods in predicting concrete properties. In addition, artificial intelligence systems have made it possible to improve quality control and real-time monitoring of structures, which helps in preventive maintenance and increases the life of infrastructure. Also, the use of artificial intelligence plays an effective role in sustainable construction by optimizing material consumption and reducing waste. Although the implementation of artificial intelligence is associated with challenges such as high initial costs and the need for specialized training, it will create a smarter, more sustainable, and more affordable future for concrete structures.
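
As an illustration of the kind of model the abstract refers to (not the authors' implementation), a small neural-network regressor for compressive strength could be set up as follows, with synthetic mix-design data standing in for laboratory measurements.

```python
# Illustrative sketch only: an ANN regressor for compressive strength from
# mix-design features. The data below is synthetic; in practice a laboratory
# mix-design dataset would be loaded instead.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500
cement = rng.uniform(150, 500, n)        # kg/m^3
water = rng.uniform(120, 230, n)         # kg/m^3
age = rng.uniform(3, 90, n)              # days
# crude synthetic stand-in for strength, driven by cement/water ratio and age
strength = 120 * (cement / water) ** 0.8 * np.log1p(age) / 10 + rng.normal(0, 3, n)

X = np.column_stack([cement, water, age])
X_train, X_test, y_train, y_test = train_test_split(X, strength, test_size=0.2, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out mixes:", round(r2_score(y_test, model.predict(X_test)), 2))
```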

Keywords: artificial intelligence, concrete construction, compressive strength prediction, structural health monitoring, stability

Procedia PDF Downloads 15
7258 Assessing the Influence of Station Density on Geostatistical Prediction of Groundwater Levels in a Semi-arid Watershed of Karnataka

Authors: Sakshi Dhumale, Madhushree C., Amba Shetty

Abstract:

The effect of station density on the geostatistical prediction of groundwater levels is of critical importance to ensure accurate and reliable predictions. Monitoring station density directly impacts the accuracy and reliability of geostatistical predictions by influencing the model's ability to capture localized variations and small-scale features in groundwater levels. This is particularly crucial in regions with complex hydrogeological conditions and significant spatial heterogeneity. Insufficient station density can result in larger prediction uncertainties, as the model may struggle to adequately represent the spatial variability and correlation patterns of the data. On the other hand, an optimal distribution of monitoring stations enables effective coverage of the study area and captures the spatial variability of groundwater levels more comprehensively. In this study, we investigate the effect of station density on the predictive performance of groundwater levels using the geostatistical technique of Ordinary Kriging. The research utilizes groundwater level data collected from 121 observation wells within the semi-arid Berambadi watershed, gathered over a six-year period (2010-2015) from the Indian Institute of Science (IISc), Bengaluru. The dataset is partitioned into seven subsets representing varying sampling densities, ranging from 15% (12 wells) to 100% (121 wells) of the total well network. The results obtained from different monitoring networks are compared against the existing groundwater monitoring network established by the Central Ground Water Board (CGWB). The findings of this study demonstrate that higher station densities significantly enhance the accuracy of geostatistical predictions for groundwater levels. The increased number of monitoring stations enables improved interpolation accuracy and captures finer-scale variations in groundwater levels. These results shed light on the relationship between station density and the geostatistical prediction of groundwater levels, emphasizing the importance of appropriate station densities to ensure accurate and reliable predictions. The insights gained from this study have practical implications for designing and optimizing monitoring networks, facilitating effective groundwater level assessments, and enabling sustainable management of groundwater resources.
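
A minimal sketch of the density experiment is given below. The paper does not name a software library; PyKrige is used here as one common choice for Ordinary Kriging, the well coordinates and levels are synthetic placeholders, and the subset fractions are illustrative rather than the seven subsets analysed in the study.

```python
# Sketch: krige groundwater levels from progressively denser well subsets and
# report the mean kriging variance as a rough proxy for prediction uncertainty.
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(42)
n_wells = 121
x = rng.uniform(0, 30, n_wells)                                  # easting (km), synthetic
y = rng.uniform(0, 30, n_wells)                                  # northing (km), synthetic
gwl = 5 + 0.2 * x + 0.1 * y + rng.normal(0, 0.5, n_wells)        # groundwater level, synthetic

grid_x = np.linspace(0, 30, 100)
grid_y = np.linspace(0, 30, 100)

for frac in (0.15, 0.25, 0.50, 0.75, 1.00):                      # illustrative subset sizes
    idx = rng.choice(n_wells, size=int(frac * n_wells), replace=False)
    ok = OrdinaryKriging(x[idx], y[idx], gwl[idx], variogram_model="spherical")
    z_pred, z_var = ok.execute("grid", grid_x, grid_y)
    print(f"{frac:>5.0%} of wells -> mean kriging variance {z_var.mean():.3f}")
```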

Keywords: station density, geostatistical prediction, groundwater levels, monitoring networks, interpolation accuracy, spatial variability

Procedia PDF Downloads 58
7257 The Real Consignee: An Exploratory Study of the True Party who is Entitled to Receive Cargo under Bill of Lading

Authors: Mojtaba Eshraghi Arani

Abstract:

According to the international conventions for the carriage of goods by sea, the consignee is the person who is entitled to take delivery of the cargo from the carrier. Such a person is usually named in the relevant box of the bill of lading unless the latter is issued “To Order” or “To Bearer”. However, there are some cases in which the apparent consignee, as above, was not intended to take delivery of the cargo, such as the L/C issuing bank or the freight forwarder, which are named as consignee only for the purpose of security or to accelerate the transit process. In such cases, as well as where the BL is issued “To Order”, the so-called “real consignee” can be found in the “Notify Party” box. The dispute revolves around the choice between the apparent consignee and the real consignee as the party entitled not only to take delivery of the cargo but also to sue the carrier for any damage or loss. While it is a generally accepted rule that only the apparent consignee shall be vested with such rights, some courts, like France's Cour de Cassation, have declared that the “Notify Party”, as the real consignee, was entitled to sue the carrier, and in some cases the same court went even further and permitted the real consignee to bring suit even where he was not mentioned on the BL as a “Notify Party”. The main argument behind such reasoning is that the real consignee is the person who suffered the loss and thus has a legitimate interest in bringing action; of course, the real consignee must prove that he incurred a loss. It is undeniable that the above-mentioned approach is contrary to the position of the international conventions on the express definition of the consignee. However, international practice has permitted the use of the BL in different ways to meet the business requirements of banks, freight forwarders, etc. Thus, the issue is one of striking a balance between the international conventions on the one hand and existing practices on the other. While the latest convention applicable to sea transportation, i.e., the Rotterdam Rules, dealt with the comparable issue of “shipper” and “documentary shipper”, it failed to cope with the matter being discussed. So a new study is required to propose the best solution for amending the current conventions for the carriage of goods by sea. A qualitative method based on the interpretation of collected data has been used in this article; the sources of the data are domestic and international regulations and case law. It is argued in this manuscript that the judge is not allowed to recognize anyone as the real consignee other than the person mentioned in the “Consignee” box, unless the BL is issued “To Order” or “To Bearer”. Moreover, the contract of carriage is independent of the sale contract, and thus the consignee must be determined solely on the basis of the BL itself, such as the “Notify Party” box, and not on the basis of any other contract or document.

Keywords: real consignee, cargo, delivery, to order, notify party

Procedia PDF Downloads 79
7256 Three Visions of a Conflict: The Case of La Araucania, Chile

Authors: Maria Barriga

Abstract:

The article focuses on the analysis of three images from the last five years that represent different visions of social groups in the context of the so-called “Conflicto Mapuche” in La Araucanía, Chile. Using a multimodal social semiotic approach, we analyze the meaning making of these images and the strategies social groups use to achieve visibility and recognition in political contexts. We explore the making and appropriation of symbols and concepts and analyze the different strategies that groups use to build hegemonic views. Among these strategies, we compare the use of digital technologies in designing these images and the influence of the Chilean State's vision of the Mapuche political conflict. Finally, we propose visual strategies to improve basic conditions for dialogue and recognition among these groups.

Keywords: visual culture, power, conflict, indigenous people

Procedia PDF Downloads 285
7255 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement

Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti

Abstract:

Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has usually resorted to over-provisioning. Data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
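
The title mentions quantile regression; one hedged way to picture its role is sketched below, where an upper-quantile forecast of demand sets the provisioning level. The load data is synthetic, and gradient-boosted quantile regression is used here as one of many possible model choices, not necessarily the models evaluated in the paper.

```python
# Sketch: provision to a high quantile of predicted demand instead of a static
# worst-case allocation, leaving SLA headroom while reducing wasted capacity.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
hours = np.arange(24 * 60) % 24                       # hour of day over 60 days
load = 40 + 25 * np.sin(hours / 24 * 2 * np.pi) + rng.normal(0, 6, hours.size)  # CPU %

X, y = hours.reshape(-1, 1), load
q95 = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, y)
q50 = GradientBoostingRegressor(loss="quantile", alpha=0.50).fit(X, y)

next_day = np.arange(24).reshape(-1, 1)
print("median forecast:       ", q50.predict(next_day).round(1))
print("SLA provisioning level:", q95.predict(next_day).round(1))   # 95th-percentile allocation
```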

Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing

Procedia PDF Downloads 108
7254 User Intention Generation with Large Language Models Using Chain-of-Thought Prompting

Authors: Gangmin Li, Fan Yang

Abstract:

Personalized recommendation is crucial for any recommendation system. One technique for personalization is to identify the user's intention. Traditional user intention identification relies on the user's selections when facing multiple items. This modeling relies primarily on historical behaviour data, resulting in challenges such as cold start, unintended choices, and failure to capture intention when items are new. Motivated by recent advancements in Large Language Models (LLMs) like ChatGPT, we present an approach to user intention identification that embraces LLMs with Chain-of-Thought (CoT) prompting. We use the initial user profile as input to the LLM and design a collection of prompts to align the LLM's responses across various recommendation tasks, encompassing rating prediction, search and browse history, user clarification, etc. Our tests on real-world datasets demonstrate improvements in recommendation when explicit user intention is identified and merged into the user model.
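
A chain-of-thought prompt for intention generation might look like the sketch below; the profile fields, the wording, and the call_llm client are hypothetical placeholders rather than the authors' prompt set.

```python
# Illustrative CoT prompt for user-intention generation. `call_llm` is a
# hypothetical stand-in for whatever LLM client would actually be used.
def build_intention_prompt(profile: dict, recent_events: list[str]) -> str:
    return (
        "You are a recommendation assistant.\n"
        f"User profile: {profile}\n"
        f"Recent activity: {'; '.join(recent_events)}\n"
        "Let's think step by step:\n"
        "1. Summarise what the user has been looking for recently.\n"
        "2. Infer the underlying need behind those actions.\n"
        "3. State one short sentence describing the user's current intention.\n"
        "Answer with only the final intention sentence."
    )

prompt = build_intention_prompt(
    {"age_group": "25-34", "favourite_genres": ["sci-fi", "documentary"]},
    ["searched 'space documentaries'", "rated 'Interstellar' 5 stars"],
)
# intention = call_llm(prompt)   # hypothetical client call
print(prompt)
```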

Keywords: personalized recommendation, generative user modelling, user intention identification, large language models, chain-of-thought prompting

Procedia PDF Downloads 53
7253 Electrokinetic Regulation of Flow in Microcrack Reservoirs

Authors: Aslanova Aida Ramiz

Abstract:

One of the important aspects of rheophysical problems in oil and gas extraction is the regulation of the thermohydrodynamic properties of liquid systems using physical and physicochemical methods. It is known that the constituent parts of real fluid systems in oil and gas production are practically non-conducting, non-magnetically active components. From a structural point of view, real heterogeneous hydrocarbon systems consist of an infinite number of microscopic local ion-electrostatic cores distributed in the volume of the dispersion medium. According to Cohen's rule, double electric layers are formed at the contact boundaries of the components in contact (oil-gas, oil-water, water-condensate, etc.) in a heterogeneous system, and as a result, each real fluid system can be represented as a complex composition of a set of local electrostatic fields. The electrokinetic properties of this structure are characterized by a certain electrode potential. Prof. F. H. Valiyev called this potential the α-factor and proposed that many natural and technological rheophysical processes (effects) are essentially electrokinetic in nature and that, by changing the α-factor, it is possible to adjust the physical properties of real hydraulic systems, including their thermohydrodynamic parameters. Based on this idea, extensive research was conducted, and the possibility of reducing hydraulic resistance and improving rheological properties in real liquid systems by reducing the electrical potential with various physical and chemical methods was demonstrated experimentally.

Keywords: microcracked, electrode potential, hydraulic resistance, Newtonian fluid, rheophysical properties

Procedia PDF Downloads 77
7252 Big Data: Appearance and Disappearance

Authors: James Moir

Abstract:

The mainstay of Big Data is prediction: it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data, ranging from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and to explain their appearance through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanations. Twenty-first-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back in the direction of a more rule- or law-governed model of reality and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.

Keywords: big data, appearance, disappearance, surface, epistemology

Procedia PDF Downloads 420
7251 Transformation of Traditional Marketplaces in an Urban Context: Case of Chalai Market, Thiruvananthapuram

Authors: Aswathy Vijayan, Sharath Sunder Rajeev

Abstract:

Trade has been fundamental to the footprint of human civilization since ancient times. In most historic cities, city development took place along trading routes, where marketplaces served as the major entrance to the city and hence as a major element of the urban fabric. Marketplaces are where commercial activities flourish and where people, having a sense of belonging to the place, easily fit in. Acknowledging that the built environment in and around the market creates a sense of place is an important factor in the success of public spaces. Local markets develop in an organic manner, which adds to people's experience and perception of urban space. With city development, the commercial needs within the city increase, and hence marketplaces flourish irrespective of the functional segregation within them. The work-live culture in marketplaces diminishes as commercial expansion washes away the residential patches within them. Real estate flourishes as new infills are added without considering the carrying capacity of the place. Chalai market is a prominent business center serving the regional level of Thiruvananthapuram city. The transformation trend of marketplaces in city cores is understood from a case study on the Fatimid Cairo marketplace, and the parameters that led to the transformation of marketplaces in a global context are considered in the analysis of the Chalai market. The structure of the marketplace over the years is analyzed in terms of transformation in location, transformation in land use, change in commodity, and transformation in movement and activity. The aim of the research is to emphasize the need to understand this transformation trend in order to create a suitable development pattern for the city. Unregulated transformation within the city core has led to tremendous changes in the user group and user pattern and eventually in the commercial trend. Changes in lifestyle and the need for new amenities have led to the addition of new infills, leading to the degradation of the native commerce. Hence, addressing the transformation of marketplaces is crucial for maintaining the locational significance, cultural importance, and heritage of the place.

Keywords: bazaar, market centers, marketplaces, traditional city, traditional market, urban fabric

Procedia PDF Downloads 152
7250 Multi-Channel Charge-Coupled Device Sensors Real-Time Cell Growth Monitor System

Authors: Han-Wei Shih, Yao-Nan Wang, Ko-Tung Chang, Lung-Ming Fu

Abstract:

A multi-channel real-time cell growth monitoring and evaluation system using charge-coupled device (CCD) sensors with 40X lenses, integrated with an NI LabVIEW image processing program, is proposed and demonstrated. The LED light source of the monitoring system is controlled by an 8051 microprocessor integrated with the NI LabVIEW software. In this study, the growth rate and morphology of RAW264.7 cells at the same concentration were demonstrated under four different culture conditions (DMEM, LPS, G1, G2). The real-time cell growth images were captured and analyzed by NI Vision Assistant every 10 minutes in the incubator. An image binarization technique was applied to calculate the cell doubling time and cell division index. The doubling times and division indices of the four groups (DMEM, LPS, LPS+G1, LPS+G2) are 12.3 h, 10.8 h, 14.0 h, 15.2 h and 74.20%, 78.63%, 69.53%, 66.49%, respectively. The image magnification of the multi-channel CCD real-time monitoring system is about 100X-200X, which is comparable with a traditional microscope.
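
The reported doubling times follow directly from the binarized images; as a rough sketch (assuming exponential growth and a fixed intensity threshold, neither of which is specified in the abstract), doubling time can be computed from two confluence measurements as follows.

```python
# Sketch of the two image-derived steps: binarize a grayscale frame into a
# cell-coverage fraction, then estimate doubling time from two measurements.
import numpy as np

def confluence(frame: np.ndarray, threshold: int = 120) -> float:
    """Fraction of pixels classified as cell after binarization (darker = cell, by assumption)."""
    return float((frame < threshold).mean())

def doubling_time(n0: float, nt: float, elapsed_hours: float) -> float:
    """Exponential-growth doubling time from two confluence (or count) values."""
    return elapsed_hours * np.log(2) / np.log(nt / n0)

# e.g. confluence grew from 12% to 24% over 12.3 h -> doubling time of 12.3 h
print(round(doubling_time(0.12, 0.24, 12.3), 1))
```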

Keywords: charge-coupled device (CCD), RAW264.7, doubling time, division index

Procedia PDF Downloads 358
7249 The Convergence of IoT and Machine Learning: A Survey of Real-time Stress Detection System

Authors: Shreyas Gambhirrao, Aditya Vichare, Aniket Tembhurne, Shahuraj Bhosale

Abstract:

In today's rapidly evolving environment, stress has emerged as a significant health concern across different age groups. Stress that isn't controlled, whether it comes from job responsibilities, health issues, or the never-ending news cycle, can have a negative effect on our well-being. The problem is further aggravated by our constant connection to technology. In this high-tech age, identifying and controlling stress is vital. To address this health issue, the study focuses on three key metrics for stress detection: body temperature, heart rate, and galvanic skin response (GSR). These parameters, along with a Support Vector Machine classifier, allow the system to categorize stress into three groups: 1) stressed, 2) not stressed, and 3) moderate stress. In the proposed system, a NodeMCU combined with the corresponding sensors collects data in real time and rapidly categorizes individuals based on their stress levels. Real-time stress detection is made possible by this creative combination of hardware and software.
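
A minimal sketch of the classification stage is given below; the sensor readings, the labelling rule, and the SVM settings are placeholders, not the study's data or tuned model.

```python
# Sketch: three-class SVM on (body temperature, heart rate, GSR) readings.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
temp = rng.normal(36.8, 0.4, 300)   # body temperature (°C), synthetic
hr = rng.normal(80, 15, 300)        # heart rate (bpm), synthetic
gsr = rng.normal(5, 2, 300)         # galvanic skin response (µS), synthetic
X = np.column_stack([temp, hr, gsr])

# toy labelling rule just to obtain three classes:
# 0 = not stressed, 1 = moderate stress, 2 = stressed
score = (hr - 80) / 15 + (gsr - 5) / 2 + (temp - 36.8) / 0.4
y = np.digitize(score, [-0.5, 1.0])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)).fit(X, y)
print(clf.predict([[37.2, 110, 9.5]]))   # classify one new reading streamed from the NodeMCU
```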

Keywords: real time stress detection, NodeMCU, sensors, heart-rate, body temperature, galvanic skin response (GSR), support vector machine

Procedia PDF Downloads 72
7248 Efficient Frequent Itemset Mining Methods over Real-Time Spatial Big Data

Authors: Hamdi Sana, Emna Bouazizi, Sami Faiz

Abstract:

In recent years, there has been a huge increase in the use of spatio-temporal applications where data and queries are continuously moving. As a result, the need to process real-time spatio-temporal data is clear, and real-time stream data management has become a hot topic. The sliding window model and frequent itemset mining over dynamic data are among the most important problems in the context of data mining. The sliding window model is widely used for frequent itemset mining over data streams due to its emphasis on recent data and its bounded memory requirement. Existing methods use the traditional transaction-based sliding window model, where the window size is based on a fixed number of transactions. This model supposes that all transactions arrive at a constant rate, which is not suited to real-time applications, and its use in such applications endangers their performance. Based on these observations, this paper relaxes the notion of window size and proposes the use of a timestamp-based sliding window model. In our proposed frequent itemset mining algorithm, support conditions are used to differentiate frequent and infrequent patterns. A tree is then developed to incrementally maintain the essential information. We evaluate our contribution, and the preliminary results are quite promising.
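
The contrast between the two window models can be made concrete with a small sketch: in a timestamp-based window, transactions expire when they fall outside a fixed time horizon, regardless of how many have arrived. The simplified structure below uses a plain counter over 1- and 2-itemsets instead of the paper's incremental tree.

```python
# Sketch of a timestamp-based sliding window for frequent-itemset counting.
from collections import Counter, deque
from itertools import combinations

class TimestampWindowMiner:
    def __init__(self, horizon_seconds: float, min_support: int):
        self.horizon = horizon_seconds
        self.min_support = min_support
        self.window = deque()          # (timestamp, transaction) pairs inside the horizon
        self.counts = Counter()

    def _itemsets(self, transaction):
        for k in (1, 2):               # 1- and 2-itemsets only, for brevity
            yield from combinations(sorted(transaction), k)

    def add(self, timestamp: float, transaction: set):
        self.window.append((timestamp, transaction))
        for itemset in self._itemsets(transaction):
            self.counts[itemset] += 1
        # expire transactions older than the time horizon, however many there are
        while self.window and timestamp - self.window[0][0] > self.horizon:
            _, old = self.window.popleft()
            for itemset in self._itemsets(old):
                self.counts[itemset] -= 1

    def frequent(self):
        return {i: c for i, c in self.counts.items() if c >= self.min_support}

miner = TimestampWindowMiner(horizon_seconds=60, min_support=2)
miner.add(0, {"cafe", "park"})
miner.add(20, {"cafe", "museum"})
miner.add(50, {"cafe", "park"})
print(miner.frequent())   # itemsets seen at least twice within the last 60 s
```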

Keywords: real-time spatial big data, frequent itemset, transaction-based sliding window model, timestamp-based sliding window model, weighted frequent patterns, tree, stream query

Procedia PDF Downloads 161
7247 Cultural Impact on Fairness Perception of Inequality: A Study on People With Chinese Roots Living in Germany

Authors: Yanping He-Ulbricht, Marc Oliver Rieger

Abstract:

Based on survey data collected from people with Chinese roots living in Germany, this paper examines the impact of degree of assimilation and language priming (Chinese or German) on individuals' perceived fairness of economic and social differences and their attitudes towards these. The results show that both the language used and the length of time spent in the foreign culture have a significant impact. Subjects who had spent less than 10 years in Germany demonstrated a higher readiness to accept government intervention in markets with price limits than those who had lived there longer. Subjects who were asked and answered in German perceived the current economic situation as less fair and were also less inclined to accept inequality, even when it leads to a Pareto improvement. While the difference in fairness perception of inequality was a cultural effect, the difference in attitudes towards government intervention was rather the result of a learning process. The findings imply that both individual learning processes and culture play an important role in perceptions and preferences regarding social and economic differences.

Keywords: assimilation, bilingualism, cross-cultural comparison, income inequality, language priming, price fairness

Procedia PDF Downloads 86
7246 Prediction of Childbearing Orientations According to Couples' Sexual Review Component

Authors: Razieh Rezaeekalantari

Abstract:

Objective: The purpose of this study was to investigate the prediction of childbearing orientations in terms of the components of couples' sexual review. Methods: This was a descriptive correlational study. The population consisted of 500 couples referred to the Sari Health Center, of whom two hundred and fifteen (215) people were selected randomly using the Krejcie and Morgan sample size table. For data collection, the childbearing orientations scale and the Multidimensional Sexual Self-Concept Questionnaire were used. Results: The mean and standard deviation were used for data analysis, and regression and correlation were used as inferential statistics to test the research hypothesis. Conclusion: The findings indicate that there is no significant relationship between the tendency toward childbearing and the predictive value of sexual review (r = 0.84, sig = 219.19) (P < 0.05). So, with 95% confidence, we conclude that there is no meaningful relationship between sexual review and the tendency toward childbearing.

Keywords: couples referring, health center, sexual review component, parenting orientations

Procedia PDF Downloads 219
7245 Evaluating the Effect of Modern Technologies and Technics to Supply Energy of Buildings Using New Energies

Authors: Ali Reza Ghaffari, Hassan Saghi

Abstract:

Given the limited fossil resources available to supply energy to buildings, recent years have seen a revival of interest, in many developed countries, in technologies that produce energy using new forms of energy. In this research, the potentials of new energies in Iran are first discussed, and then, based on a case study of a building in Tehran, the effects of utilizing new solar energy technology for supplying the energy of buildings are investigated. By analyzing the data recorded over a four-year period, the technical performance of this system is evaluated. According to the experimental operation plan, this system requires an auxiliary heating circuit for continuous operation over a year. In the economic analysis, real conditions are considered and the results are based on long-term data. Considering the purchase and commissioning costs, supplementary energy consumption, etc., a comparison is drawn between the costs of using a solar water heater in a residential unit and the energy costs of a similar unit equipped with a conventional gas water heater. At the current domestic price of energy, using a solar water heater in the country is not economical, but considering global energy prices, the system has a return on investment after 4.5 years. It also produces 81% less pollution and saves about $21.5 on environmental pollution cleanup.

Keywords: energy supply, new energies, new technologies, buildings

Procedia PDF Downloads 162
7244 Sorghum Grains Grading for Food, Feed, and Fuel Using NIR Spectroscopy

Authors: Irsa Ejaz, Siyang He, Wei Li, Naiyue Hu, Chaochen Tang, Songbo Li, Meng Li, Boubacar Diallo, Guanghui Xie, Kang Yu

Abstract:

Background: Near-infrared spectroscopy (NIR) is a non-destructive, fast, and low-cost method to measure the grain quality of different cereals. Previously reported NIR model calibrations using whole-grain spectra had moderate accuracy. Improved predictions are achievable by using the spectra of whole grains when compared with the use of spectra collected from flour samples. However, the feasibility of determining the critical biochemicals related to classification for food, feed, and fuel products has not been adequately investigated. Objectives: To evaluate the feasibility of using NIRS and the influence of four sample types (whole grains, flours, hulled grain flours, and hull-less grain flours) on the prediction of chemical components, in order to improve grain sorting efficiency for human food, animal feed, and biofuel. Methods: NIR was applied in this study to determine eight biochemicals in four types of sorghum samples: hulled grain flours, hull-less grain flours, whole grains, and grain flours. A total of 20 sorghum hybrids were selected from two locations in China. Using the NIR spectra and the wet-chemically measured biochemical data, partial least squares regression (PLSR) was used to construct the prediction models. Results: The results showed that sorghum grain morphology and sample format affected the prediction of biochemicals. Using NIR data of grain flours generally improved the prediction compared with the use of NIR data of whole grains. In addition, using the spectra of whole grains enabled comparable predictions, which are recommended when a non-destructive and rapid analysis is required. Compared with the hulled grain flours, hull-less grain flours allowed improved predictions for tannin, cellulose, and hemicellulose using NIR data. Conclusion: The established PLSR models could enable food, feed, and fuel producers to efficiently evaluate a large number of samples by predicting the required biochemical components in sorghum grains without destruction.
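
A minimal sketch of the PLSR calibration step is shown below; the spectra and reference chemistry are synthetic placeholders, and the number of latent components is arbitrary rather than the value tuned in the study.

```python
# Sketch: cross-validated PLSR calibration of one biochemical against NIR spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(7)
n_samples, n_wavelengths = 80, 700                 # e.g. 80 samples, 700 NIR bands
spectra = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)   # fake smooth spectra
tannin = spectra[:, 150] * 0.02 + rng.normal(0, 0.1, n_samples)        # fake wet-chemistry reference

pls = PLSRegression(n_components=10)
predicted = cross_val_predict(pls, spectra, tannin, cv=5)
print("cross-validated R^2:", round(r2_score(tannin, predicted.ravel()), 2))
```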

Keywords: FT-NIR, sorghum grains, biochemical composition, food, feed, fuel, PLSR

Procedia PDF Downloads 69
7243 Analytical Study of Data Mining Techniques for Software Quality Assurance

Authors: Mariam Bibi, Rubab Mehboob, Mehreen Sirshar

Abstract:

Satisfying customer requirements is the ultimate goal of producing or developing any product, and the quality of the product is judged by the level of customer satisfaction. This survey reports different techniques which enhance the quality of the product through software defect prediction and by locating missing software requirements. Some mining techniques have been proposed to assess individual performance indicators in collaborative environments in order to reduce errors at the individual level. The basic intention is to produce a product with zero or few defects, thereby producing the best possible product quality. The analysis in this survey covers techniques such as genetic algorithms, artificial neural networks, classification and clustering techniques, and decision trees. The analysis shows that these techniques have contributed much to the improvement and enhancement of product quality.

Keywords: data mining, defect prediction, missing requirements, software quality

Procedia PDF Downloads 467
7242 Cardiovascular Disease Prediction Using Machine Learning Approaches

Authors: P. Halder, A. Zaman

Abstract:

It is estimated that heart disease accounts for one in ten deaths worldwide. According to the World Health Organization, heart disease is among the leading causes of death in the United States, and cardiovascular diseases (CVDs) account for one in four U.S. deaths according to the Centers for Disease Control and Prevention (CDC). According to statistics, women are more likely than men to die from heart disease as a result of strokes, while a 50% increase in men's mortality was reported by the World Health Organization in 2009. The consequences of cardiovascular disease are severe. The causes of heart disease include diabetes, high blood pressure, high cholesterol, abnormal pulse rate, etc. Machine learning (ML) can be used to make predictions and decisions in the healthcare industry, and scientists have thus turned to modern technologies like machine learning and data mining to predict diseases. In this work, disease prediction is based on four algorithms; compared with the other algorithms, AdaBoost is much more accurate.
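
A comparison of this kind can be sketched in a few lines; the snippet below is illustrative only and uses synthetic data in place of a heart-disease dataset (in practice, columns such as age, resting blood pressure, cholesterol, and maximum heart rate would be used). It simply shows how cross-validated accuracies of decision tree, random forest, SVM, and AdaBoost classifiers would be compared.

```python
# Sketch: compare several classifiers on a binary disease/no-disease task.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=8, n_informative=5, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(),
    "AdaBoost": AdaBoostClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    print(f"{name:>13}: {cross_val_score(model, X, y, cv=5).mean():.3f}")
```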

Keywords: heart disease, cardiovascular disease, coronary artery disease, feature selection, random forest, AdaBoost, SVM, decision tree

Procedia PDF Downloads 153
7241 Prediction of Sepsis Illness from Patients Vital Signs Using Long Short-Term Memory Network and Dynamic Analysis

Authors: Marcio Freire Cruz, Naoaki Ono, Shigehiko Kanaya, Carlos Arthur Mattos Teixeira Cavalcante

Abstract:

Systems that record patient care information, known as Electronic Medical Records (EMR), and those that monitor patients' vital signs, such as heart rate, body temperature, and blood pressure, have been extremely valuable for the effectiveness of patient treatment. Several studies have used data from EMRs and patients' vital signs to predict illnesses. Among them, we highlight those that intend to predict, classify, or at least identify patterns of sepsis in patients under vital sign monitoring. Sepsis is an organ dysfunction caused by a dysregulated patient response to an infection, and it affects millions of people worldwide. Early detection of sepsis is expected to provide a significant improvement in its treatment. Previous works usually combined medical, statistical, mathematical, and computational models to develop detection methods for early prediction, seeking higher accuracies with the smallest number of variables. Among other techniques, we can find research using survival analysis, expert systems, machine learning, and deep learning that reached great results. In our research, patients are modeled as points moving each hour in an n-dimensional space, where n is the number of vital signs (variables). These points can reach a sepsis target point after some time. For now, the sepsis target point is calculated using the median of all patients' variables at sepsis onset. From these points, we calculate for each hour the position vector, the first derivative (velocity vector), and the second derivative (acceleration vector) of the variables to evaluate their behavior, and we construct a prediction model based on a Long Short-Term Memory (LSTM) network that includes these derivatives as explanatory variables. The accuracy of the prediction 6 hours before the onset of sepsis reached 83.24% when considering only the vital signs and 94.96% when including the position, velocity, and acceleration vectors. The data are collected from the Medical Information Mart for Intensive Care (MIMIC) database, a public database that contains vital signs, laboratory test results, observations, notes, and so on, from more than 60,000 patients.
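
The trajectory-and-derivatives feature construction can be illustrated with finite differences; the sketch below (not the authors' code) builds the position/velocity/acceleration block that would be fed to the LSTM, using random numbers in place of MIMIC vitals.

```python
# Sketch: each patient is an hourly trajectory of vital signs; first and second
# finite differences approximate the velocity and acceleration vectors that are
# appended to the LSTM input.
import numpy as np

def trajectory_features(vitals: np.ndarray) -> np.ndarray:
    """vitals: (hours, n_signs) array -> (hours, 3 * n_signs) feature array."""
    velocity = np.gradient(vitals, axis=0)        # first derivative per sign
    acceleration = np.gradient(velocity, axis=0)  # second derivative per sign
    return np.concatenate([vitals, velocity, acceleration], axis=1)

hours, n_signs = 24, 4                            # e.g. HR, temperature, systolic BP, respiration
vitals = np.random.default_rng(3).normal(size=(hours, n_signs))   # placeholder data
X = trajectory_features(vitals)                   # one input sequence for the LSTM
print(X.shape)                                    # (24, 12)
```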

Keywords: dynamic analysis, long short-term memory, prediction, sepsis

Procedia PDF Downloads 125
7240 Modeling Residential Electricity Consumption Function in Malaysia: Time Series Approach

Authors: L. L. Ivy-Yap, H. A. Bekhet

Abstract:

As Malaysian residential electricity consumption continues to increase rapidly, effective energy policies that address the factors affecting residential electricity consumption are urgently needed. This study investigates the relationship between residential electricity consumption (EC), real disposable income (Y), the price of electricity (Pe), and population (Po) in Malaysia for the period 1978-2011. Unlike previous studies on Malaysia, the current study focuses on the residential sector, a sector that is important for the contemplation of energy policy. The Phillips-Perron (P-P) unit root test is employed to infer the stationarity of each variable, while the bounds test is executed to determine the existence of a co-integration relationship among the variables, modeled in an Autoregressive Distributed Lag (ARDL) framework. The CUSUM and CUSUM of squares tests are applied to ensure the stability of the model. The results suggest the existence of a long-run equilibrium relationship and bidirectional Granger causality between EC and the macroeconomic variables. The empirical findings will help policymakers in Malaysia develop new standards for monitoring energy consumption. As electricity consumption is a major contributing factor in economic growth and CO2 emissions, more careful planning is needed in Malaysia to attain future targets and cut emissions.
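
For reference, a standard unrestricted error-correction form of the ARDL bounds-testing equation for the four variables named above looks as follows; the lag orders are illustrative, not those selected in the study.

```latex
% ARDL bounds-testing (unrestricted error-correction) specification.
\Delta \ln EC_t = \beta_0
 + \sum_{i=1}^{p} \beta_{1i}\,\Delta \ln EC_{t-i}
 + \sum_{i=0}^{q} \beta_{2i}\,\Delta \ln Y_{t-i}
 + \sum_{i=0}^{q} \beta_{3i}\,\Delta \ln Pe_{t-i}
 + \sum_{i=0}^{q} \beta_{4i}\,\Delta \ln Po_{t-i}
 + \lambda_1 \ln EC_{t-1} + \lambda_2 \ln Y_{t-1}
 + \lambda_3 \ln Pe_{t-1} + \lambda_4 \ln Po_{t-1} + \varepsilon_t
% Bounds test: H_0 : \lambda_1 = \lambda_2 = \lambda_3 = \lambda_4 = 0
% (no long-run co-integrating relationship).
```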

Keywords: co-integration, elasticity, granger causality, Malaysia, residential electricity consumption

Procedia PDF Downloads 264
7239 Heterogeneous-Resolution and Multi-Source Terrain Builder for CesiumJS WebGL Virtual Globe

Authors: Umberto Di Staso, Marco Soave, Alessio Giori, Federico Prandi, Raffaele De Amicis

Abstract:

The increasing availability of information about earth surface elevation (Digital Elevation Models, DEM) generated from different sources (remote sensing, aerial images, Lidar) poses the question of how to integrate this huge amount of data and make it available to the widest possible audience. In order to exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the earth. Here comes the need to merge large-scale height maps, which are typically made available for free at the worldwide level, with very specific, highly resolved datasets. On the other hand, the third dimension improves the user experience and the quality of data representation, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environment monitoring, etc. Open-source 3D virtual globes, which are a trending topic in Geovisual Analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. Typically, however, 3D virtual globes do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called “Terrain Builder”. This tool is able to merge heterogeneous-resolution datasets and to provide multi-resolution worldwide terrain services fully compatible with CesiumJS and therefore accessible via the web using a traditional browser without any additional plug-in.
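
The abstract describes merging heterogeneous DEMs into a tile pyramid served to CesiumJS. As background only (the paper's own tiling code is not shown), here is a small sketch of the standard Web-Mercator tile indexing used by slippy-map and TMS-style services; the coordinates are arbitrary examples, and Cesium's geographic tiling scheme is not covered.

```python
# Sketch: assign a latitude/longitude sample to a tile at a given zoom level.
# TMS differs from XYZ indexing only in counting rows from the south.
import math

def tile_indices(lat_deg: float, lon_deg: float, zoom: int) -> tuple[int, int, int]:
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y_xyz = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi) / 2.0 * n)
    y_tms = n - 1 - y_xyz
    return x, y_xyz, y_tms

print(tile_indices(46.07, 11.12, 12))   # example point (Trento, Italy) at zoom 12
```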

Keywords: Terrain Builder, WebGL, Virtual Globe, CesiumJS, Tiled Map Service, TMS, Height-Map, Regular Grid, Geovisual Analytics, DTM

Procedia PDF Downloads 425
7238 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading

Authors: Robert Caulk

Abstract:

A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g. TA-lib, pandas-ta). The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI.

Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration

Procedia PDF Downloads 88
7237 Property and Inheritance Rights for Women Whose Husbands Disappeared during the Last War in Kosovo: Case Studies: Krusha e Vogël and Krusha e Madhe, Region of Prizren, Kosovo

Authors: Venera Goxha

Abstract:

Property and inheritance rights of women whose husbands were killed or disappeared during the last war in Kosovo are the subject of this study, specifically these women's access to family real estate. The case study concerns women whose husbands were killed or disappeared during the last war in Kosovo and who, on this occasion, earned the title of 'widow'. The research was conducted in the villages of Krusha e Vogël, Municipality of Prizren, and Krusha e Madhe, Municipality of Rahovec, which are among the villages that suffered most in the recent war in Kosovo. As a result of the war, Krusha e Vogël has 113 male victims, or 70% of all men between the ages of 13 and 77, leaving widows and orphans. In the village of Krusha e Madhe, 243 Albanians were massacred by Serbs living in the same village, leaving behind widows and orphaned children. According to these data, most Krushian families have surviving wives, now widows, as heads of household. Therefore, with these women heading their families and facing a mountain of economic, social, and cultural challenges, the question is how they have approached family property and inheritance. The equal right to property and inheritance is guaranteed to women by all legislation in force, starting from the Constitution of the Republic of Kosovo onwards. Article 7 of the Constitution of Kosovo and the subsequent legal framework recognize the equality of women and the equal division of property between heirs, daughters and sons. However, some of the legislation does not successfully reflect the current reality in Kosovo. All these ambiguities derive from the 'patriarchal law' of the Albanians of the early Middle Ages, later known as the 'Kanun of Lekë Dukagjini'. At the time it was written and applied, it carried the weight of the law in force, but over time it passed into tradition, culture, and mentality. The Kanun of Lekë Dukagjini in no context treated women as equal to men. The woman, according to the Kanun, was a working tool, a creature meant to give birth, to work, to carry, to raise children, and to remain faithful to her husband even when the husband is not faithful.

Keywords: property rights, heritage, widows, code

Procedia PDF Downloads 61
7236 A Real Time Monitoring System of the Supply Chain Conditions, Products and Means of Transport

Authors: Dimitris E. Kontaxis, George Litainas, Dimitris P. Ptochos

Abstract:

Real-time monitoring of the supply chain conditions and procedures is a critical element for the optimal coordination and safety of the deliveries, as well as for the minimization of the delivery time and cost. Real-time monitoring requires IoT data streams, which are related to the conditions of the products and the means of transport (e.g., location, temperature/humidity conditions, kinematic state, ambient light conditions, etc.). These streams are generated by battery-based IoT tracking devices, equipped with appropriate sensors, and are transmitted to a cloud-based back-end system. Proper handling and processing of the IoT data streams, using predictive and artificial intelligence algorithms, can provide significant and useful results, which can be exploited by the supply chain stakeholders in order to enhance their financial benefits, as well as the efficiency, security, transparency, coordination, and sustainability of the supply chain procedures. The technology, the features, and the characteristics of a complete, proprietary system, including hardware, firmware, and software tools -developed in the context of a co-funded R&D programme- are addressed and presented in this paper.

Keywords: IoT embedded electronics, real-time monitoring, tracking device, sensor platform

Procedia PDF Downloads 177
7235 Personalized Infectious Disease Risk Prediction System: A Knowledge Model

Authors: Retno A. Vinarti, Lucy M. Hederman

Abstract:

This research describes a knowledge model for a system which gives personalized alerts to users about infectious disease risks in the context of weather, location, and time. The knowledge model is based on established epidemiological concepts augmented by information gleaned from infection-related data repositories. Existing disease risk prediction research has focused more on utilizing raw historical data and yields seasonal patterns of infectious disease risk emergence. This research incorporates both data and epidemiological concepts gathered from the Atlas of Human Infectious Disease (AHID) and the Centers for Disease Control and Prevention (CDC) as the basis of reasoning for infectious disease risk prediction. Using the CommonKADS methodology, the disease risk prediction task is modelled as an assignment (synthetic) task, progressing from knowledge identification through specification and refinement to implementation. First, knowledge is gathered from AHID, primarily from the epidemiology and risk group chapters for each infectious disease. The result of this stage is five major elements (Person, Infectious Disease, Weather, Location and Time) and their properties. At the knowledge specification stage, the initial tree model of each element and the detailed relationships are produced. This research also includes a validation step as part of knowledge refinement: on the basis that the best model is formed using the most common features, Frequency-based Selection (FBS) is applied. The portion of the infectious disease risk model relating to Person comes out strongest, with Location next and Weather weaker. Within the Person element, Age is the strongest property, Activity and Habits are moderate, and Blood type is the weakest. For the Location element, the General category (e.g. continents, region, country, and island) comes out much stronger than the Specific category (i.e. terrain feature). For the Weather element, the Less Precise category (i.e. season) comes out stronger than the Precise category (i.e. exact temperature or humidity interval). However, given that some infectious diseases are significantly more serious than others, a frequency-based metric may not be appropriate. Future work will incorporate epidemiological measures of disease seriousness (e.g. odds ratio, hazard ratio and fatality rate) into the validation metrics. This research is limited to modelling existing knowledge about epidemiology and chain-of-infection concepts. A further step, verification in the knowledge refinement stage, might cause some minor changes to the shape of the tree.
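
One way to read the Frequency-based Selection (FBS) step is as a simple count of how often each candidate attribute recurs across the per-disease knowledge fragments; the sketch below is an interpretation under that assumption, with placeholder diseases and attributes rather than the study's actual AHID/CDC content.

```python
# Sketch: keep the attributes shared by at least `threshold` disease entries.
from collections import Counter

disease_attributes = {
    "dengue": {"Location.general", "Weather.season", "Person.age"},
    "influenza": {"Weather.season", "Person.age", "Person.activity"},
    "cholera": {"Location.general", "Person.habits", "Weather.season"},
}

counts = Counter(attr for attrs in disease_attributes.values() for attr in attrs)
threshold = 2
selected = [a for a, c in counts.items() if c >= threshold]
print(sorted(selected))   # ['Location.general', 'Person.age', 'Weather.season']
```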

Keywords: epidemiology, knowledge modelling, infectious disease, prediction, risk

Procedia PDF Downloads 242