Search results for: maximum entropy principle
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5430


4950 Quality Management in Construction Project

Authors: Harsh Panchal, Saurabh Amrutkar

Abstract:

Quality management is an essential part of any project and is directly related to project performance. Quality management depends on multiple factors at different stages of a project, from time management to construction logistics. A project is a mixture of various components, including itinerary management, health and safety, crew productivity, and many more. From the survey conducted, we concluded that advancement in technology and an indigenous approach to any project will result in maximum quality standards and better project performance. In this paper, we discuss the various components of the factors above that compromise the quality of a project and how they can be controlled in order to achieve maximum quality assurance using quality planning and total quality management. The paper also focuses on the limitations and problems faced in each factor responsible for quality management, and on tackling them using techniques and processes based on activities: identifying the sequence, approaching the critical path, and estimating durations. The project management concept that deals with the sequence of scope, cost, and time gives us an overview of ongoing quality management, in a nutshell giving us hints to regulate the current procedure for the maximum achievable quality. It also deals with the problems faced by engineers that slow the mundane work process, reducing the quality outcome drastically.

Keywords: management, performance, project, quality

Procedia PDF Downloads 161
4949 Field-Programmable Gate Array-Based Baseband Signals Generator of X-Band Transmitter for Micro Satellite/CubeSat

Authors: Shih-Ming Wang, Chun-Kai Yeh, Ming-Hwang Shie, Tai-Wei Lin, Chieh-Fu Chang

Abstract:

This paper introduces an FPGA-based baseband signal generator (BSG) for the X-band transmitter developed by the National Space Organization (NSPO), Taiwan, for earth observation. In order to gain more flexibility for various applications, a number of modulation schemes are included: QPSK, DeQPSK, and 8PSK 4D-TCM. For the micro satellite scenario, the maximum symbol rate is up to 150 Msps, and the EVM is as low as 1.9%. For the CubeSat scenario, the maximum symbol rate is up to 60 Msps, and the EVM is less than 1.7%. The maximum data rates are 412.5 Mbps and 165 Mbps, respectively. In addition, a triple modular redundancy (TMR) scheme is implemented in order to mitigate single event effects (SEE) induced by radiation. Finally, the theoretical error performance is provided based on comprehensive analysis, especially when the BER is below, and far below, 10⁻⁶, owing to the low error-bit requirement of modern high-resolution earth remote-sensing instruments.
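The quoted data rates are consistent with simple symbol-rate arithmetic. A short check, under the assumption (inferred from the abstract's own figures, not stated in it) that the 8PSK 4D-TCM mode carries 11 information bits per block of 4 symbols, i.e. 2.75 bits per symbol:

```python
# Data rate = symbol rate x bits per symbol.
# Assumption: 8PSK 4D-TCM carries 11 bits per 4 symbols (2.75 b/sym),
# which is what the abstract's rate figures imply.
bits_per_symbol = 11 / 4

micro_sat_rate = 150e6 * bits_per_symbol   # 150 Msps, micro satellite scenario
cubesat_rate = 60e6 * bits_per_symbol      # 60 Msps, CubeSat scenario
print(micro_sat_rate / 1e6)                # 412.5 (Mbps)
print(cubesat_rate / 1e6)                  # 165.0 (Mbps)
```

Both values reproduce the maximum data rates quoted above.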

Keywords: X-band transmitter, FPGA (Field-Programmable Gate Array), CubeSat, micro satellite

Procedia PDF Downloads 291
4948 Sustainability Management Control Adoption and Sustainable Performance of Healthcare Supply Chains in Times of Crisis

Authors: Edward Nartey

Abstract:

Although sustainability management control (SMC) systems provide information that enhances corporate sustainability decisions, reviews of the SMC implications for sustainable supply chains (SCs) reveal a wide research gap, particularly regarding the sustainability performance of healthcare SCs in unusual times. This study provides preliminary empirical evidence on the level of SMC adoption and the decision-making implications for the Triple Bottom Line (TBL) principles of SC sustainability in Ghanaian public healthcare institutions (PHIs). Using a sample of 226 public health managers, the results show that sustainable formal control has a positive and significant impact on economic sustainability but an insignificant effect on social and environmental sustainability. In addition, a positive relationship was established between informal controls and economic and environmental sustainability, but an insignificant relationship with social sustainability. Although the findings highlight that the SMC system tends to be prioritized over regular MCSs in crisis situations, the MCSs are inadequate for promoting PHIs' sustainable behaviours in SCs. The study also provides little empirical evidence of effective enhancement of the TBL principles of SC sustainability, perhaps because the SMC is misaligned with the TBL principles in crisis situations. Thus, in crisis situations, PHIs need to redesign their MCSs to support the integration of sustainability issues in SCs.

Keywords: sustainability management control, informal control, formal control, sustainable supply chain performance

Procedia PDF Downloads 55
4947 Millimeter-Wave Silicon Power Amplifiers for 5G Wireless Communications

Authors: Kyoungwoon Kim, Cuong Huynh, Cam Nguyen

Abstract:

Exploding demands for more data, faster data transmission speeds, less interference, more users, more wireless devices, and better, more reliable service-far exceeding what the current mobile communications networks in the RF spectrum below 6 GHz can provide-have led the wireless communication industry to focus on higher, previously unallocated spectrum. High frequencies near (around 28 GHz) or within the millimeter-wave regime are the logical solution to meet these demands. This high-frequency RF spectrum is increasingly important for wireless communications due to its large available bandwidths, which facilitate applications requiring high-speed transmission of vast amounts of data, reaching multi-gigabit-per-second rates. It also relieves the traffic congestion of signals from the many wireless devices operating in the current RF spectrum (below 6 GHz), hence handling more traffic. Consequently, the wireless communication industry is moving towards 5G (fifth generation) for next-generation communications such as mobile phones, autonomous vehicles, virtual reality, and the Internet of Things (IoT). The U.S. Federal Communications Commission (FCC) approved on 14 July 2016 three frequency bands for 5G around 28, 37, and 39 GHz. We present silicon-based RFIC power amplifiers (PAs) for possible implementation in 5G wireless communications around 28, 37, and 39 GHz. The 16.5-28 GHz PA exhibits a measured gain of more than 34.5 dB and a very flat output power of 19.4±1.2 dBm across 16.5-28 GHz. The 25.5/37-GHz PA exhibits gains of 21.4 and 17 dB, and maximum output powers of 16 and 13 dBm at 25.5 and 37 GHz, respectively, in the single-band mode. In the dual-band mode, the maximum output power is 13 and 9.5 dBm at 25.5 and 37 GHz, respectively. The 10-19/23-29/33-40 GHz PA has maximum output powers of 15, 13.3, and 13.8 dBm at 15, 25, and 35 GHz, respectively, in the single-band mode.
When this PA is operated in the dual-band mode, it has maximum output powers of 11.4/8.2 dBm at 15/25 GHz, 13.3/3 dBm at 15/35 GHz, and 8.7/6.7 dBm at 25/35 GHz. In the tri-band mode, it exhibits maximum output powers of 8.8/5.4/3.8 dBm at 15/25/35 GHz. Acknowledgement: This paper was made possible by NPRP grant # 6-241-2-102 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the authors.

Keywords: microwaves, millimeter waves, power amplifier, wireless communications

Procedia PDF Downloads 180
4946 Effects of Subsidy Reform on Consumption and Income Inequalities in Iran

Authors: Pouneh Soleimaninejadian, Chengyu Yang

Abstract:

In this paper, we use data from the Household Income and Expenditure Survey of the Statistics Centre of Iran, conducted from 2005 to 2014, to calculate several inequality measures and to estimate the effects of Iran's targeted subsidy reform act on consumption and income inequality. We first calculate Gini coefficients for income and consumption in order to study the relation between the two and the effects of the subsidy reform. Results show that consumption inequality has not always mirrored changes in income inequality. However, both Gini coefficients indicate that the subsidy reform improved inequality. We then calculate the Generalized Entropy Index based on consumption and income for the years before and after the Subsidy Reform Act of 2010 in order to take a closer look at the changes in the internal structure of inequality after the subsidy reforms. We find that the improvement in income inequality is mostly caused by the decrease in inequality among lower-income individuals. At the same time, consumption inequality decreased as a result of more equal consumption in both lower- and higher-income groups. Moreover, the increase in the Engel coefficient after the subsidy reform shows that a bigger portion of income is allocated to food consumption, which is a sign of a lower living standard in general. This increase in the Engel coefficient is due to the rise in the inflation rate and the relative increase in food prices, which is partly another consequence of the subsidy reform. We conducted experiments on the effect of subsidy payments and on the possible effects of changing the distribution pattern and amount of cash subsidy payments on income inequality. The results show that cash payments lead to a definite decrease in income inequality and contributed more to the improvement in rural areas than among urban households. We also examine the possible effect of constant payments on the increasing income inequality in the years after 2011.
We conclude that the reduction in the real value of payments as a result of inflation plays an important role, although there may be other reasons. We finally experiment with alternative allocations of transfers, keeping the total amount of cash transfers constant or reducing it by eliminating the three highest deciles from the cash payment program; the results show that income equality would improve significantly.
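For readers unfamiliar with the measures used above, both the Gini coefficient and the Generalized Entropy index can be computed from a vector of incomes or expenditures. The sketch below uses the standard textbook formulas on made-up income figures; it is illustrative only, not the authors' implementation:

```python
import numpy as np

def gini(x):
    """Gini coefficient from sorted values: sum_i (2i - n - 1) x_i / (n sum x)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    return np.sum((2 * i - n - 1) * x) / (n * np.sum(x))

def general_entropy(x, alpha=1.0):
    """Generalized Entropy index GE(alpha); alpha=1 is the Theil index."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    if alpha == 0:                       # mean log deviation
        return float(np.mean(np.log(m / x)))
    if alpha == 1:                       # Theil index
        return float(np.mean((x / m) * np.log(x / m)))
    return float(np.mean((x / m) ** alpha - 1) / (alpha * (alpha - 1)))

incomes = np.array([10.0, 20.0, 30.0, 40.0, 100.0])  # hypothetical incomes
print(gini(incomes))                   # 0 = perfect equality, 1 = maximal inequality
print(general_entropy(incomes, alpha=1))
```

Both measures are zero for a perfectly equal distribution and grow with inequality; GE's sensitivity to the lower or upper tail is tuned by alpha, which is why the authors can attribute the improvement to lower-income individuals.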

Keywords: consumption inequality, generalized entropy index, income inequality, Iran's subsidy reform

Procedia PDF Downloads 230
4945 Methane versus Carbon Dioxide Mitigation Prospects

Authors: Alexander J. Severinsky, Allen L. Sessoms

Abstract:

Atmospheric carbon dioxide (CO₂) has dominated the discussion about the causes of climate change. This reflects the time horizon that has become the norm adopted by the IPCC as the planning horizon. Recently, it has become clear that a 100-year time horizon is much too long, and yet almost all mitigation efforts, including those in the near-term horizon of 30 years, are geared toward it. In this paper, we show that, for a 30-year time horizon, methane (CH₄) is the greenhouse gas whose radiative forcing exceeds that of CO₂. In our analysis, we used the radiative forcing (RF) of greenhouse gases in the atmosphere, since it directly affects the temperature rise on Earth. In 2019, the radiative forcing of methane was ~2.5 W/m² and that of carbon dioxide ~2.1 W/m². Under a business-as-usual (BAU) scenario until 2050, these forcings would be ~2.8 W/m² and ~3.1 W/m², respectively. There is a substantial spread in the data for anthropogenic and natural methane emissions, as well as for CH₄ leakages from production to consumption. We estimated the minimum and maximum effects of reducing these leakages. Such action may reduce the annual radiative forcing of all CH₄ emissions by between ~15% and ~30%, which translates into a reduction of the RF by 2050 from ~2.8 W/m² to ~2.5 W/m² in the minimum case and to ~2.15 W/m² in the maximum case. Under BAU, the RF of CO₂ would increase from ~2.1 W/m² today to ~3.1 W/m² by 2050. We assumed a 50% reduction of anthropogenic emissions, applied linearly over the next 30 years; that would reduce the radiative forcing from ~3.1 W/m² to ~2.9 W/m². In the case of 'net zero', the other 50% reduction of anthropogenic emissions would be achieved by removal either at the sources of emissions or directly from the atmosphere. The total reduction would be from ~3.1 to ~2.7 W/m², or ~0.4 W/m².
To achieve the same radiative forcing as in the scenario of maximum reduction of methane leakages (~2.15 W/m²), an additional reduction of the CO₂ radiative forcing of approximately 2.7 - 2.15 = 0.55 W/m² would be needed. This is a much larger value than expected from 'net zero'. In total, one would need to remove ~660 Gt of CO₂ from the atmosphere to match the maximum reduction of current methane leakages, and ~270 Gt to achieve 'net zero', for over 900 Gt in total.
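The closing arithmetic can be checked directly, using the rounded W/m² figures quoted in the abstract:

```python
# Radiative-forcing bookkeeping (W/m^2), using the abstract's rounded values.
rf_co2_bau_2050 = 3.1     # CO2 forcing by 2050, business as usual
rf_co2_net_zero = 2.7     # CO2 forcing under the 'net zero' scenario
rf_ch4_max_fix = 2.15     # CH4 forcing if leakage reduction is maximal

# 'Net zero' buys ~0.4 W/m^2 of CO2 forcing reduction...
net_zero_gain = rf_co2_bau_2050 - rf_co2_net_zero
# ...but matching the best CH4-leakage scenario needs ~0.55 W/m^2 more.
extra_needed = rf_co2_net_zero - rf_ch4_max_fix
print(round(net_zero_gain, 2), round(extra_needed, 2))  # 0.4 0.55
```

The extra 0.55 W/m² is the gap the abstract translates into the additional ~660 Gt of CO₂ removal.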

Keywords: methane leakages, methane radiative forcing, methane mitigation, methane net zero

Procedia PDF Downloads 140
4944 Solar-Powered Adsorption Cooling System: A Case Study on the Climatic Conditions of Al Minya

Authors: El-Sadek H. Nour El-deen, K. Harby

Abstract:

Energy saving and environmentally friendly applications are becoming among the most important topics nowadays. In this work, a simulation analysis using TRNSYS software has been carried out to study the benefit of employing a solar adsorption cooling system under the climatic conditions of Al-Minya city, Egypt. A theoretical model was developed for a two-bed adsorption cooling system employing granular activated carbon/HFC-404A as the working pair. The temporal and averaged histories of the solar collector, adsorbent beds, evaporator, and condenser are shown. System performance, in terms of daily average cooling capacity and average coefficient of performance throughout the year, has been investigated. The results showed that the maximum yearly average coefficient of performance (COP) and cooling capacity are about 0.26 and 8 kW, respectively. The maxima of both the average cooling capacity and the cyclic COP are directly proportional to the maximum solar radiation. The system performance was found to increase with the average ambient temperature. Finally, the proposed solar-powered adsorption cooling system can be used effectively under Al-Minya climatic conditions.

Keywords: adsorption, cooling, Egypt, environment, solar energy

Procedia PDF Downloads 155
4943 The Connection between the Schwartz Theory of Basic Values and Ethical Principles in Clinical Psychology

Authors: Matej Stritesky

Abstract:

The research deals with the connection between the Schwartz Theory of Basic Values and the ethical principles in psychology on which the meta-code of ethics of the European Federation of Psychological Associations is based. The research focuses on ethically problematic situations in clinical psychology in the Czech Republic. Based on the analysis of papers that identified ethically problematic situations faced by clinical psychologists, a questionnaire of ethically problematic situations in clinical psychology (EPSCP) was created for the purposes of the research. The questionnaire was designed to represent situations that correspond to the four principles on which the meta-code of ethics of the European Federation of Psychological Associations is based. The EPSCP questionnaire consists of descriptions of 32 situations that respondents evaluate on a scale from 1 (the psychologist's behaviour is ethically perfectly fine) to 10 (the psychologist's behaviour is ethically completely unacceptable). The EPSCP questionnaire, together with Schwartz's PVQ questionnaire, will be presented to 60 psychology students. The relationship between the principles in clinical psychology and the values on Schwartz's value continuum will be described using multidimensional scaling. A positive correlation is assumed between the higher-order value of openness to change and problematic ethical situations related to the principle of integrity; a positive correlation between the higher-order value of self-transcendence and the principles of respect and responsibility; a positive correlation between the higher-order value of conservation and the principle of competence; and a negative correlation between the higher-order value of self-enhancement and sensitivity to ethically problematic situations. The research also includes an experimental part.
The first half of the students are presented with the code of ethics of the Czech Association of Clinical Psychologists before completing the questionnaires, and the second half after completing them. In addition to reading the code of ethics, students describe the three rules of the code that they consider most important and state why they chose them. The output of the experimental part will be to determine whether presentation of the code of ethics leads to greater sensitivity to ethically problematic situations.

Keywords: clinical psychology, ethically problematic situations in clinical psychology, ethical principles in psychology, Schwartz theory of basic values

Procedia PDF Downloads 111
4942 An Estimating Parameter of the Mean in Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods

Authors: Autcha Araveeporn

Abstract:

This paper compares parameter estimation of the mean of a normal distribution by the Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data; the Bayes estimator is derived from the prior distribution; and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. Hypothesis testing is then used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variances of 4, 9, and 16, with sample sizes of 10, 20, 30, and 50. The results show that the ML and MCMC estimates differ perceivably from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes estimator, computed from a prior distribution with mean 1 and variance 12, showed a significant difference in the mean with variance 9 at sample sizes 10 and 20.
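The three estimators compared above can be sketched in a few lines. The sketch below assumes the variance is known and uses the conjugate normal prior mentioned in the abstract (mean 1, variance 12); plain Monte Carlo sampling of the closed-form posterior stands in for a full Gibbs sampler:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=3.0, size=30)   # true mean 2, variance 9
sigma2 = 9.0                                     # treated as known here
n = data.size

# ML estimator of the mean: the sample average.
mu_ml = data.mean()

# Conjugate Bayes: N(mu0, tau2) prior gives a normal posterior for the mean.
mu0, tau2 = 1.0, 12.0                            # prior from the abstract
post_var = 1.0 / (n / sigma2 + 1.0 / tau2)
post_mean = post_var * (data.sum() / sigma2 + mu0 / tau2)

# MCMC-style estimate: Monte Carlo draws from that same posterior.
draws = rng.normal(post_mean, np.sqrt(post_var), size=5000)
mu_mcmc = draws.mean()

print(mu_ml, post_mean, mu_mcmc)
```

With a diffuse prior (variance 12) the Bayes and ML estimates stay close; shrinking tau2 pulls the posterior mean toward the prior mean of 1, which is the mechanism behind the differences the abstract reports at small sample sizes.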

Keywords: Bayes method, Markov chain Monte Carlo method, maximum likelihood method, normal distribution

Procedia PDF Downloads 352
4941 Bayesian Using Markov Chain Monte Carlo and Lindley's Approximation Based on Type-I Censored Data

Authors: Al Omari Moahmmed Ahmed

Abstract:

This paper describes Bayesian estimation using Markov Chain Monte Carlo and Lindley's approximation, and maximum likelihood estimation, of the Weibull distribution with Type-I censored data. The maximum likelihood method cannot estimate the shape parameter in closed form, although it can be solved by numerical methods. Moreover, the Bayesian estimates of the parameters and of the survival and hazard functions cannot be obtained analytically. Hence the Markov Chain Monte Carlo method and Lindley's approximation are used: the full conditional distributions of the parameters of the Weibull distribution are obtained via Gibbs sampling and the Metropolis-Hastings (MH) algorithm, followed by estimation of the survival and hazard functions. The methods are compared to their maximum likelihood counterparts with respect to the mean squared error (MSE) and absolute bias to determine the better method for the scale and shape parameters and the survival and hazard functions.
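As a concrete illustration of why the Weibull shape parameter needs a numerical solution, the likelihood can be maximized directly. The sketch below simplifies to uncensored data (the paper handles Type-I censoring) and uses SciPy's generic optimizer; the parameter values are made up:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
shape_true, scale_true = 1.5, 2.0
t = stats.weibull_min.rvs(shape_true, scale=scale_true,
                          size=500, random_state=rng)

def neg_log_lik(params):
    """Negative Weibull log-likelihood; invalid parameters get +inf."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf
    return -np.sum(stats.weibull_min.logpdf(t, k, scale=lam))

res = optimize.minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
k_hat, lam_hat = res.x

# Survival and hazard at a time point follow from the fitted parameters.
t0 = 1.0
survival = np.exp(-(t0 / lam_hat) ** k_hat)
hazard = (k_hat / lam_hat) * (t0 / lam_hat) ** (k_hat - 1)
print(k_hat, lam_hat, survival, hazard)
```

Under Type-I censoring the log-likelihood gains a log-survival term for each censored observation, which is precisely what removes any hope of a closed-form shape estimate.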

Keywords: Weibull distribution, Bayes method, Markov chain Monte Carlo, survival and hazard functions

Procedia PDF Downloads 473
4940 Principle of Progressive Implementation and Education Policy for Former Combatants in Colombia

Authors: Ximena Rincon Castellanos

Abstract:

The research analyzed the public education policy of Colombia against the content of the right to education. One problematic element of that content is the principle of progressive implementation of economic, social and cultural rights. The research included a complete study of public documents and other papers, as well as one focus group with former combatants in a city that hosts one of the 'hogares de paz' (peace homes), which receive these people after they leave the illegal group. This paper presents a critical approach to the public policy strategies to guarantee education to former combatants and its tension with the right to progressive implementation. Firstly, education is understood at the technology level, without considering higher education. Former combatants attend SENA and private institutions offering technology education, which the Colombian Government counts as higher education. Therefore, statistics report a high level of attendance of ex-combatants at that education level, but in fact they do not expect to study a university career. Secondly, the approved budget has been invested in private institutions, even though public institutions are able to include this population and need more money to strengthen the public offer, which the Special Rapporteur on the right to education has considered the better strategy to ensure education as a human right rather than a good. As a consequence, progressive implementation should guide changes and improvements to current strategies, investing the available budget in the public education system in order to give former combatants the chance to access universities.

Keywords: higher education, progressive implementation, public service, private offering and technology education

Procedia PDF Downloads 166
4939 Geometric Intuition and Formalism in Passing from Indivisibles to Infinitesimals: Pascal and Leibniz

Authors: Remus Titiriga

Abstract:

The paper focuses on Pascal's indivisibles evolving into Leibniz's infinitesimals. It starts with parallel developments by the two savants in combinatorics (triangular numbers for Pascal and harmonic triangles for Leibniz) and their implications for determining the sums of mathematical series. It continues with a focus on the geometrical contributions of Pascal. He considered the cycloid and other mechanical curves the epitome of geometric comprehensibility in a series of challenging problems he posed to the mathematical world. Pascal provided the solutions in 1658, in a volume published under the pseudonym of Dettonville, using indivisibles and ratios between curved and straight lines. In the third part, the research follows the impact of this volume on Leibniz as the initial impetus for the elaboration of modern calculus as an algorithmic method detached from geometrical intuition. The paper then analyses the further steps and shows that Leibniz's developments relate to his philosophical frame (the search for a characteristica universalis, the principle of continuity, and the rule of sufficient reason), different from Pascal's, and impacting mathematical problems and their solutions. At this stage in Leibniz's evolution, the infinitesimals replaced the indivisibles proper. The last part of the paper starts with a speculation: had Pascal lived longer, could he have accomplished the same feat? The document uses Pascal's reconstructed philosophical frame to formulate a positive answer. It also proposes to teach calculus with indivisibles and infinitesimals, mimicking Pascal's and Leibniz's achievements.

Keywords: indivisibles, infinitesimals, characteristic triangle, the principle of continuity

Procedia PDF Downloads 127
4938 Exergy Model for a Solar Water Heater with Flat Plate Collector

Authors: P. Sathyakala, G. Sai Sundara Krishnan

Abstract:

The objective of this paper is to derive an exergy model for a solar water heater with a honeycomb structure in order to identify the element with the largest irreversibility in the system. This will help in reducing the wasted work potential so that the overall efficiency of the system can be improved.

Keywords: exergy, energy balance, entropy balance, work potential, degradation, honeycomb, flat plate collector

Procedia PDF Downloads 472
4937 Application of GIS Techniques for Analysing Urban Built-Up Growth of Class-I Indian Cities: A Case Study of Surat

Authors: Purba Biswas, Priyanka Dey

Abstract:

Worldwide, rapid urbanisation has accelerated city expansion in both developed and developing nations. This unprecedented urbanisation trend, driven by increasing population and economic growth, has created challenges for decision-makers in city planning and urban management. Metropolitan cities, class-I towns, and major urban centres undergo a continuous process of evolution due to the interaction between socio-cultural and economic attributes. This constant evolution leads to urban expansion in all directions. Understanding the patterns and dynamics of urban built-up growth is crucial for policymakers, urban planners, and researchers, as it aids in resource management, decision-making, and the development of sustainable strategies to address the complexities associated with rapid urbanisation. Identifying spatio-temporal patterns of urban growth has emerged as a crucial challenge in monitoring and assessing present and future trends in urban development. Analysing urban growth patterns and tracking changes in land use is an important aspect of urban studies. This study analyses spatio-temporal urban transformations and land-use and land-cover changes using remote sensing and GIS techniques. A built-up growth analysis has been done for the city of Surat as a case example, using the NDBI remote-sensing index together with GIS models of the Built-up Urban Density Index and the Shannon Entropy Index to identify trends and the geographical direction of transformation from 2005 to 2020. Surat is one of the fastest-growing urban centres in both the state and the nation, ranking as the 4th fastest-growing city globally. This study analyses the dynamics of urban built-up area transformations both zone-wise and direction-wise; their trend, rate, and magnitude were calculated over the 15-year period.
This study also highlights the need for analysing and monitoring the urban growth pattern of class-I cities in India using spatio-temporal and quantitative techniques like GIS for improved urban management.
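The Shannon Entropy Index used in such studies measures how evenly built-up growth is spread over spatial zones: values near 0 indicate compact, concentrated growth, values near 1 indicate dispersed growth (sprawl). A minimal sketch, with hypothetical zone areas rather than Surat's actual data:

```python
import numpy as np

def shannon_entropy(built_up_by_zone):
    """Relative Shannon entropy of built-up shares across n zones.
    0 = growth concentrated in one zone; 1 = evenly dispersed (sprawl)."""
    p = np.asarray(built_up_by_zone, dtype=float)
    p = p / p.sum()
    p = p[p > 0]                        # 0 * log 0 is taken as 0
    h = -np.sum(p * np.log(p))
    return h / np.log(len(built_up_by_zone))   # normalize by log(n)

# Hypothetical built-up areas (km^2) in 8 directional zones of a city.
zones_2005 = [4, 3, 6, 2, 5, 3, 4, 3]
zones_2020 = [9, 8, 10, 7, 9, 8, 9, 8]
print(shannon_entropy(zones_2005), shannon_entropy(zones_2020))
```

An entropy value drifting toward 1 between the two dates, as in this toy example, is how the index flags increasingly dispersed growth.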

Keywords: urban expansion, built-up, geographic information system, remote sensing, Shannon’s entropy

Procedia PDF Downloads 65
4936 Dark and Bright Envelopes for Dehazing Images

Authors: Zihan Yu, Kohei Inoue, Kiichi Urahama

Abstract:

We present a method for dehazing images. A dark envelope image is derived with the bilateral minimum filter, and a bright envelope with the bilateral maximum filter. The ambient light and transmission of the scene are estimated from these two envelope images. A haze-free image is then reconstructed from the estimated ambient light and transmission.
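A toy version of this pipeline can be sketched with plain (non-bilateral) local minimum and maximum filters standing in for the paper's bilateral envelope filters; the haze model I = J·t + A·(1 − t) and the parameter choices are standard assumptions, not the authors' exact method:

```python
import numpy as np
from scipy.ndimage import minimum_filter, maximum_filter

def dehaze(hazy, size=7, t_min=0.1):
    """Toy single-channel dehazing via dark/bright envelopes.
    Plain local min/max filters stand in for the bilateral ones."""
    dark = minimum_filter(hazy, size=size)      # dark envelope
    bright = maximum_filter(hazy, size=size)    # bright envelope
    A = bright.max()                            # crude ambient-light estimate
    t = np.clip(1.0 - dark / A, t_min, 1.0)     # transmission from I = J t + A (1 - t)
    return (hazy - A) / t + A                   # recovered scene radiance J

# Synthetic bright, low-contrast "hazy" image in [0, 1].
rng = np.random.default_rng(0)
hazy = np.clip(0.6 + 0.3 * rng.random((32, 32)), 0.0, 1.0)
clear = dehaze(hazy)
print(clear.shape)
```

Inverting the haze model stretches the contrast that the ambient-light veil compressed, which is the effect the envelope estimates are there to enable.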

Keywords: image dehazing, bilateral minimum filter, bilateral maximum filter, local contrast

Procedia PDF Downloads 258
4935 Catalytic Pyrolysis of Barley Straw for the Production of Fuels and Chemicals

Authors: Funda Ates

Abstract:

Primary energy sources such as petroleum, coal, and natural gas are principally responsible for the world's energy consumption. However, these sources are being depleted rapidly worldwide, and they also have damaging environmental effects. Renewable energy sources are capable of providing a considerable fraction of world energy demand in this century. Biomass is one of the most abundant and utilized sources of renewable energy in the world. It can be converted into commercial fuels suitable as substitutes for fossil fuels. A large number of biomass types can be converted through thermochemical processes into solid, liquid, or gaseous fuels. Pyrolysis is the thermal decomposition of biomass in the absence of air or oxygen. In this study, barley straw has been investigated as an alternative feedstock to obtain fuels and chemicals via pyrolysis in a fixed-bed reactor. The influence of pyrolysis temperature in the range 450-750 °C, as well as the effect of a catalyst on the products, was investigated and the results compared. The results indicated that a maximum oil yield of 20.4% was obtained at a moderate temperature of 550 °C. The oil yield decreased when a catalyst was used. The pyrolysis oils were examined using instrumental analysis and GC/MS. The analyses revealed that the pyrolysis oils were chemically very heterogeneous at all temperatures. The most abundant compounds in the bio-oil were determined to be phenolics. The catalyst decreased the reaction temperature: most of the components obtained using a catalyst at moderate temperatures were close to those obtained at high temperatures without a catalyst. Moreover, the use of a catalyst also decreased the amount of oxygenated compounds produced.

Keywords: Barley straw, pyrolysis, catalyst, phenolics

Procedia PDF Downloads 221
4934 Maximum Distance Separable b-Symbol Repeated-Root γ-Constacyclic Codes over a Finite Chain Ring of Length 2

Authors: Jamal Laaouine, Mohammed Elhassani Charkani

Abstract:

Let p be a prime and let b be an integer. MDS b-symbol codes are a direct generalization of MDS codes. The γ-constacyclic codes of length pˢ over the finite commutative chain ring Fₚm[u]/<u²>, where γ is a nonzero element of the field Fₚm, have been classified into four distinct types. Let C₃ be a code of Type 3. In this paper, we obtain the b-symbol distance db(C₃) of the code C₃. Using this result, necessary and sufficient conditions under which C₃ is an MDS b-symbol code are given.
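The b-symbol distance generalizes the Hamming distance by reading codeword coordinates in overlapping windows of b consecutive (cyclic) positions. A brute-force sketch for tiny codes; the binary repetition code below is purely illustrative, not one of the paper's Type 3 constacyclic codes:

```python
def b_symbol_weight(word, b):
    """Number of cyclic length-b windows of `word` containing a nonzero entry."""
    n = len(word)
    return sum(
        any(word[(i + j) % n] for j in range(b))
        for i in range(n)
    )

def min_b_distance(code, b):
    """Minimum b-symbol distance of a linear code, i.e. the minimum
    b-symbol weight over its nonzero codewords."""
    return min(b_symbol_weight(c, b) for c in code if any(c))

# Toy example: the binary repetition code of length 4.
rep_code = [(0, 0, 0, 0), (1, 1, 1, 1)]
print(b_symbol_weight((1, 0, 0, 0), b=2))  # 2: windows at positions 0 and 3
print(min_b_distance(rep_code, b=2))       # 4: every window of 1111 is nonzero
```

For b = 1 this reduces to the ordinary Hamming weight and distance, which is the sense in which MDS b-symbol codes directly generalize MDS codes.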

Keywords: constacyclic code, repeated-root code, maximum distance separable, MDS codes, b-symbol distance, finite chain rings

Procedia PDF Downloads 136
4933 Lean Commercialization: A New Dawn for Commercializing High Technologies

Authors: Saheed A. Gbadegeshin

Abstract:

Lean Commercialization (LC) is the transformation of new technologies and knowledge into products and services through the application of lean/agile principles. This principle focuses on minimizing the resources spent on developing, manufacturing, and marketing new products/services that customers will accept. To understand how LC has been employed by technology-based companies, a case study approach was employed: interviewing the founders, observing their high technologies, and interviewing commercialization experts. Two serial entrepreneurs were interviewed in 2012, and their commercialized technologies were monitored from 2012 until 2016. To validate the commercialization strategies of these entrepreneurs, four commercialization experts were interviewed in 2017. The initial results, observation notes, and experts' opinions were analyzed qualitatively. The final findings showed that the entrepreneurs applied LC unknowingly, while the experts were aware of it. The entrepreneurs used LC because of financial constraints and their need for success. Additionally, their commercialization practices revealed that LC appeared to be one of their commercialization strategies. Thus, their practices were analyzed, and a framework was developed. Furthermore, the experts noted that LC is a new dawn that technologists and scientists need to consider for their high-technology commercialization. This article contributes to the theory and practice of commercialization. Theoretically, the framework adds value to the commercialization discussion. Practically, the framework can be used by technology entrepreneurs (technologists and scientists), technology-based enterprises, and technology entrepreneurship educators as a guide in their commercialization adventures.

Keywords: lean commercialization, high technologies, lean start-up, technology-based companies

Procedia PDF Downloads 161
4932 Issues in Travel Demand Forecasting

Authors: Huey-Kuo Chen

Abstract:

Travel demand forecasting, comprising four travel choices (trip generation, trip distribution, modal split, and traffic assignment), constitutes the core of transportation planning. In current practice, travel demand forecasting is associated with three important issues: interface inconsistencies among the four travel choices, the inefficiency of commonly used solution algorithms, and undesirable multiple-path solutions. In this paper, each of the three issues is elaborated in detail. An ideal unified framework for a combined model consisting of the four travel choices and variable demand functions is also suggested, and a few remarks are provided at the end of the paper.
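The trip distribution step is commonly solved as an entropy-maximizing gravity model balanced by iterative proportional fitting. The following is a minimal sketch of that idea, not the paper's combined model; the zone counts, costs, and deterrence parameter are hypothetical:

```python
import numpy as np

def gravity_ipf(productions, attractions, cost, beta=0.1, iters=50):
    """Entropy-maximizing trip distribution via iterative proportional fitting."""
    # Seed matrix from a negative-exponential deterrence function.
    T = np.exp(-beta * cost)
    for _ in range(iters):
        T *= (productions / T.sum(axis=1))[:, None]   # balance row sums
        T *= (attractions / T.sum(axis=0))[None, :]   # balance column sums
    return T

P = np.array([400.0, 600.0])          # trips produced by each origin zone
A = np.array([500.0, 500.0])          # trips attracted by each destination zone
C = np.array([[5.0, 10.0],
              [10.0, 5.0]])           # travel cost between zone pairs
T = gravity_ipf(P, A, C)
print(T.round(1))
```

Each sweep rescales rows to match productions and columns to match attractions; at convergence, the matrix is the maximum-entropy trip table consistent with both margins.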

Keywords: travel choices, B algorithm, entropy maximization, dynamic traffic assignment

Procedia PDF Downloads 454
4931 Agro Morphological Characterization of Vicia Faba L. Accessions in the Kingdom of Saudi Arabia

Authors: Zia Amjad, Salem S. Alghamdi

Abstract:

This experiment was carried out at the student educational farm of the College of Food and Agriculture, KSU, Kingdom of Saudi Arabia, in order to characterize 154 V. faba accessions based on UPOV and IBPGR descriptors. Twenty-four agro-morphological characters, 11 quantitative and 13 qualitative, were observed for genetic variation. The results were analyzed using multivariate analysis, i.e., principal component analysis (PCA). The first six principal components (PCs) had eigenvalues greater than one and accounted for 72% of the available V. faba genetic diversity; the first three components each explained more than 10% of the genetic diversity (22.36%, 15.86%, and 10.89%, respectively). PCA distributed the V. faba accessions into different groups based on their performance for the characters under observation. PC-1, which represented 22.36% of the genetic diversity, was positively associated with stipule spot pigmentation, intensity of streaks, pod degree of curvature, and, to some extent, 100-seed weight. PC-2 covered 15.86% of the genetic diversity and showed positive associations with average seed weight per plant, pod length, number of seeds per plant, 100-seed weight, stipule spot pigmentation, and intensity of streaks (as in PC-1), and, to some extent, with pod degree of curvature and number of pods per plant. PC-3 revealed 10.89% of the genetic diversity and expressed positive associations with number of pods per plant and number of leaflets per plant.
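The eigenvalue-greater-than-one (Kaiser) criterion used to retain components can be sketched as follows; the 154 x 24 matrix here is synthetic stand-in data, not the actual accession measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for the 154-accession x 24-trait data matrix.
X = rng.normal(size=(154, 24))
X[:, :3] += X[:, 3:6] * 2.0                        # induce some correlated traits

Z = (X - X.mean(axis=0)) / X.std(axis=0)           # standardize each trait
eigvals = np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False))[::-1]  # descending
explained = eigvals / eigvals.sum() * 100          # percent variance per PC

kept = eigvals > 1.0                               # Kaiser criterion
print(f"{kept.sum()} PCs retained, covering {explained[kept].sum():.1f}% of variance")
```

Eigenvalues are taken from the trait correlation matrix, so a value above one means the component explains more variance than a single standardized trait.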

Keywords: agro-morphological characterization, diversity, Vicia faba, PCA

Procedia PDF Downloads 107
4930 Principal Component Analysis on Colon Cancer Detection

Authors: N. K. Caecar Pratiwi, Yunendah Nur Fuadah, Rita Magdalena, R. D. Atmaja, Sofia Saidah, Ocky Tiaramukti

Abstract:

Colon cancer, or colorectal cancer, is a type of cancer that attacks the last part of the human digestive system. Lymphoma and carcinoma are the types of cancer that attack the human colon. Colon cancer causes about half a million deaths every year. In Indonesia, colon cancer is the third most common cancer in women and the second in men. Unhealthy lifestyles, such as minimal fiber consumption, lack of exercise, and low awareness of early detection, are factors behind the high incidence of colon cancer. The aim of this project is to produce a system that can detect and classify images as lymphoma, carcinoma, or normal. The designed system uses 198 colon tissue pathology images, consisting of 66 images of lymphoma, 66 images of carcinoma, and 66 images of normal/healthy colon tissue. The system classifies colon cancer starting from image preprocessing, followed by feature extraction using Principal Component Analysis (PCA) and classification using the K-Nearest Neighbor (K-NN) method. Preprocessing comprises resizing, RGB-to-grayscale conversion, edge detection, and, finally, histogram equalization. Tests are performed with several K-NN input parameter settings. The result of this project is an image processing system that can detect and classify the type of colon cancer with high accuracy and low computation time.
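A PCA-plus-K-NN pipeline of the kind described can be sketched with scikit-learn. The image data below is a synthetic stand-in (the preprocessing stages are omitted), and the component and neighbor counts are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# Synthetic stand-in: 198 grayscale "images" flattened to vectors,
# 66 per class; the real inputs would be the preprocessed pathology images.
X = np.vstack([rng.normal(loc=c, size=(66, 64 * 64)) for c in (0.0, 0.5, 1.0)])
y = np.repeat(["lymphoma", "carcinoma", "normal"], 66)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=0)
# Reduce each image to 20 principal components, then classify with 5-NN.
model = make_pipeline(PCA(n_components=20), KNeighborsClassifier(n_neighbors=5))
model.fit(X_tr, y_tr)
print(f"test accuracy: {model.score(X_te, y_te):.2f}")
```

Fitting PCA inside the pipeline ensures the components are learned only on the training split, avoiding leakage into the test accuracy.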

Keywords: carcinoma, colorectal cancer, k-nearest neighbor, lymphoma, principal component analysis

Procedia PDF Downloads 203
4929 Recursion, Merge and Event Sequence: A Bio-Mathematical Perspective

Authors: Noury Bakrim

Abstract:

Formalization is indeed foundational to mathematical linguistics, as demonstrated by the pioneering works. While dialoguing with this frame, we nonetheless propose, in our approach to language as a real object, a mathematical linguistics/biosemiotics defined as a dialectical synthesis between induction and computational deduction. Relying on the parametric interaction of cycles, rules, and features, which gives way to a sub-hypothetic biological point of view, we first hypothesize a factorial equation as an explanatory principle within the Category Mathematics of the Ergobrain: our computational proposal of Universal Grammar rules per cycle, or a scalar determination (multiplying right/left columns of the determinant matrix and right/left columns of the logarithmic matrix) of the transformable matrix for rule addition/deletion and cycles within representational mapping/cycle heredity, based on the factorial example, being the logarithmic exponent or power of rule deletion/addition. This enables us to propose an extension of the minimalist merge/label notions to a Language Merge (as a computing principle) within cycle recursion, relying on a combinatorial mapping of rule hierarchies onto the external Entax of the Event Sequence.
To define combinatorial maps as the language merge of features and combinatorial hierarchical restrictions (governing, commanding, and other rules), we secondly hypothesize from our results a feature/hierarchy exponentiation on a graph representation deriving from Gromov's Symbolic Dynamics, where combinatorial vertices from Fe are set to combinatorial vertices of Hie, with edges from Fe to Hie, such that for every combinatorial group there are restriction maps representing different derivational levels that are subgraphs: the intersection on I defines pullbacks and deletion rules (under restriction maps); then, under disjunction edges H, for the combinatorial map P belonging to the Hie exponentiation by intersection, there are pullbacks and projections equal to the restriction maps RM₁ and RM₂. The model draws on experimental biomathematics as well as structural frames, with a focus on Amazigh and English (cases from phonology/micro-semantics and syntax) and the shift from structure to event (especially the Amazigh formant principle, which resolves its morphological heterogeneity).

Keywords: rule/cycle addition/deletion, bio-mathematical methodology, general merge calculation, feature exponentiation, combinatorial maps, event sequence

Procedia PDF Downloads 122
4928 Total Chromatic Number of Δ-Claw-Free 3-Degenerated Graphs

Authors: Wongsakorn Charoenpanitseri

Abstract:

The total chromatic number χ"(G) of a graph G is the minimum number of colors needed to color the elements (vertices and edges) of G such that no incident or adjacent pair of elements receives the same color. Let G be a graph with maximum degree Δ(G), and consider a total coloring of G, focusing on a vertex of maximum degree: this vertex needs one color, and the Δ(G) edges incident to it need Δ(G) further distinct colors. Coloring all vertices and edges of G therefore requires at least Δ(G) + 1 colors; that is, χ"(G) ≥ Δ(G) + 1. However, no graph G is known whose total chromatic number exceeds Δ(G) + 2. The Total Coloring Conjecture states that for every graph G, χ"(G) is at most Δ(G) + 2. In this paper, we prove the Total Coloring Conjecture for Δ-claw-free 3-degenerated graphs; that is, we prove that the total chromatic number of every Δ-claw-free 3-degenerated graph is at most Δ(G) + 2.
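The definition above can be made concrete with a brute-force computation of χ"(G) for small graphs; this is only an illustrative checker, not the proof technique of the paper:

```python
from itertools import product

def total_chromatic_number(vertices, edges):
    """Smallest k that totally colors the graph: no two adjacent or
    incident elements (vertices or edges) share a color."""
    elements = list(vertices) + [frozenset(e) for e in edges]

    def conflict(a, b):
        if a in vertices and b in vertices:
            return frozenset((a, b)) in elements   # adjacent vertices
        if a in vertices:
            return a in b                          # vertex incident to edge
        if b in vertices:
            return b in a
        return bool(a & b)                         # edges sharing an endpoint

    pairs = [(i, j) for i in range(len(elements))
             for j in range(i + 1, len(elements))
             if conflict(elements[i], elements[j])]
    k = 1
    while True:  # try k colors; every conflicting pair must differ
        if any(all(col[i] != col[j] for i, j in pairs)
               for col in product(range(k), repeat=len(elements))):
            return k
        k += 1

print(total_chromatic_number({0, 1, 2}, [(0, 1), (1, 2), (0, 2)]))
```

For the triangle K₃, Δ = 2 and the routine returns 3 = Δ + 1, the lower bound; for K₂ it returns 3 = Δ + 2, so both ends of the conjectured range occur already on tiny graphs.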

Keywords: total colorings, the total chromatic number, 3-degenerated, claw-free

Procedia PDF Downloads 171
4927 Adaptive Target Detection of High-Range-Resolution Radar in Non-Gaussian Clutter

Authors: Lina Pan

Abstract:

Adaptive target detection for high-range-resolution radar is addressed in non-Gaussian clutter modeled as a spherically invariant random vector, for cases in which the estimated covariance matrix may become singular. First, the restricted maximum likelihood (RML) estimates of the unknown covariance matrix and the scatterer amplitudes are derived for non-Gaussian clutter. The RML estimate of the texture is then obtained. Finally, a novel detector is devised. It is shown that, without secondary data, the proposed detector outperforms the existing Kelly binary integrator.

Keywords: non-Gaussian clutter, covariance matrix estimation, target detection, maximum likelihood

Procedia PDF Downloads 462
4926 A Novel Machining Method and Tool-Path Generation for Bent Mandrel

Authors: Hong Lu, Yongquan Zhang, Wei Fan, Xiangang Su

Abstract:

Bent mandrels are widely used as precision moulds in the automobile, shipping, and aviation industries. To improve the versatility and efficiency of turning bent mandrels about a fixed rotational center, an instantaneous machining model based on cutting parameters and machine dimensions is proposed in this paper. A spiral-like tool-path generation approach for the non-axisymmetric turning of bent mandrels is also developed, to deal with the part-to-part repeatability error of the existing turning model. The actual cutter-location points are calculated from cutter-contact points, which are obtained from a spiral sweep using the equal-arc-length segment principle in a polar coordinate system. A tool offset to avoid interference between the tool and the workpiece is also considered in the machining model. Depending on the spindle rotational angle, synchronized control of the X-axis, Z-axis, and C-axis is adopted to generate the tool path of the turning process. A simulation method is developed to generate the NC program according to the presented model, including the calculation of cutter-location points and the generation of the cutting tool path. Taking as an example a bent mandrel whose center axis has a maximum offset of 4 mm in 3D space, experimental results verify that the machining model and turning method suit the characteristics of bent mandrels.
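The equal-arc-length segment principle in a polar coordinate system can be illustrated by stepping along a spiral and emitting a cutter-contact point each time a fixed arc length has accumulated. The Archimedean spiral and all parameters below are hypothetical simplifications of the paper's spiral sweep:

```python
import math

def equal_arc_points(a, b, theta_max, seg_len, dtheta=1e-4):
    """Cutter-contact points on the spiral r = a + b*theta, spaced at
    (approximately) equal arc-length intervals of seg_len."""
    points, s, theta = [(a, 0.0)], 0.0, 0.0
    while theta < theta_max:
        r = a + b * theta
        s += math.hypot(r, b) * dtheta   # ds = sqrt(r^2 + (dr/dtheta)^2) dtheta
        theta += dtheta
        if s >= seg_len:                 # one segment's worth of arc covered
            points.append((a + b * theta, theta))
            s -= seg_len
    return points

pts = equal_arc_points(a=10.0, b=2.0, theta_max=4 * math.pi, seg_len=5.0)
print(len(pts), "points; last (r, theta) =",
      tuple(round(v, 2) for v in pts[-1]))
```

Because the angular spacing shrinks as the radius grows, equal arc-length spacing keeps the chip load roughly uniform along the sweep.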

Keywords: bent mandrel, instantaneous machining model, simulation method, tool-path generation

Procedia PDF Downloads 333
4925 The Effects of Three Months of HIIT on Plasma Adiponectin in Overweight College Men

Authors: M. J. Pourvaghar, M. E. Bahram, M. Sayyah, Sh. Khoshemehry

Abstract:

Adiponectin is a cytokine secreted by adipose tissue that functions as an anti-inflammatory, anti-atherogenic, and anti-diabetic substance. Its concentration is inversely correlated with body mass index. The purpose of this research was to examine the effect of 12 weeks of high intensity interval training (HIIT) on the level of serum adiponectin and selected adiposity markers in overweight college students. This was a clinical study of 24 students with BMI between 25 kg/m² and 30 kg/m². The sample was purposively selected and then randomly assigned to an experimental group (age = 22.7 ± 1.5 yr; weight = 85.8 ± 3.18 kg; height = 178.7 ± 3.29 cm) and a control group (age = 23.1 ± 1.1 yr; weight = 79.1 ± 2.4 kg; height = 181.3 ± 4.6 cm). The experimental group participated in an exercise program for 12 weeks, three sessions per week, at a high intensity of 85% to 95% of maximum heart rate (observing the overload principle). Before and after the exercise protocol, the serum adiponectin level, BMI, waist-to-hip ratio, and body fat percentage were measured. The data were analyzed using SPSS 16.0 with statistical procedures such as ANCOVA. The results indicated that 12 weeks of high intensity interval training led to an increase in the serum adiponectin level and decreases in body weight, body fat percentage, body mass index, and waist-to-hip ratio (P < 0.05). Based on these results, it may be concluded that participating in high intensity interval training for 12 weeks is a non-invasive treatment that increases the adiponectin level while decreasing some of the anthropometric indices associated with obesity or being overweight.

Keywords: adiponectin, cardiovascular, interval, overweight, training

Procedia PDF Downloads 314
4924 Experimental Measurements of Mean and Turbulence Quantities behind the Circular Cylinder by Attaching Different Number of Tripping Wires

Authors: Amir Bak Khoshnevis, Mahdieh Khodadadi, Aghil Lotfi

Abstract:

For a bluff body, roughness elements trip the boundary layer into turbulence, leading to delayed flow separation, a smaller wake, and lower form drag. In the present work, flow past a circular cylinder fitted with tripping wires is studied experimentally. The wind tunnel used to model the free stream is an open blower circuit (maximum speed 30 m/s, maximum free-stream turbulence 0.1%). The Reynolds number was held constant for all tests (Re = 25000). The circular cylinder used in this experiment is 20 mm in diameter and 400 mm in length. The aim of this research is to find the optimal operating configuration. Tripping wires 1 mm in diameter were installed on the circular cylinder in varying numbers (6, 8, and 10), and the wake characteristics of the cylinder were studied. Results showed that the optimal arrangement for 1 mm tripping wires is six wires at an angular spacing of 60°. The Strouhal number for the cylinder with 1 mm tripping wires at the 60° angular position showed the maximum value.
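For reference, the shedding frequency implied by these test conditions follows from the Strouhal relation f = St·U/D; the Strouhal number and air viscosity below are typical textbook values, not measurements from this experiment:

```python
# Vortex-shedding frequency from the Strouhal relation f = St * U / D,
# using the cylinder diameter and Reynolds number of the experiment.
nu = 1.5e-5          # kinematic viscosity of air, m^2/s (approximate)
D = 0.020            # cylinder diameter, m
Re = 25000           # Reynolds number held constant in all tests
U = Re * nu / D      # implied free-stream velocity, m/s
St = 0.2             # assumed smooth-cylinder Strouhal number
f = St * U / D       # shedding frequency, Hz
print(f"U = {U:.2f} m/s, f = {f:.1f} Hz")
```

The implied velocity of about 19 m/s sits comfortably below the tunnel's 30 m/s maximum; a hot-wire spectrum peak near this frequency would confirm the shedding.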

Keywords: wake of circular cylinder, trip wire, velocity defect, strouhal number

Procedia PDF Downloads 396
4923 Experimental and Computational Fluid Dynamics Analysis of Horizontal Axis Wind Turbine

Authors: Saim Iftikhar Awan, Farhan Ali

Abstract:

Wind power has now become one of the most important resources of renewable energy. The machine that extracts kinetic energy from the wind is the wind turbine. This work analyzes the electrical power output of a horizontal axis wind turbine in order to identify the configuration that yields maximum output, and compares experimental results with Computational Fluid Dynamics (CFD) results. Experiments were performed in which parameters such as the number of blades, blade shape, and wind speed were varied to find the configuration giving the maximum electrical power output; the same configurations were then designed in 3D CAD software. After a series of experiments, it was found that the turbine with four blades at an angle of 75° gives the maximum power output, and that the power output increases with wind speed. The CAD models were imported into ANSYS FLUENT to predict the mechanical power. This mechanical power was then converted into electrical power, and the results were approximately the same in both cases. Finally, the experimental and ANSYS FLUENT results were compared.
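The mechanical-to-electrical conversion chain rests on the kinetic power available in the wind, P = ½ρAv³. A small sketch with hypothetical turbine numbers (the rotor power coefficient and drivetrain efficiency are assumptions, not values from the paper):

```python
import math

def wind_power(rho, diameter, v):
    """Kinetic power through the rotor disc: P = 1/2 * rho * A * v^3."""
    area = math.pi * (diameter / 2) ** 2
    return 0.5 * rho * area * v ** 3

# Hypothetical small-turbine numbers, only to illustrate the conversion chain.
rho, D, v = 1.225, 0.5, 8.0     # air density kg/m^3, rotor diameter m, wind speed m/s
p_avail = wind_power(rho, D, v)
p_mech = 0.35 * p_avail         # assumed rotor power coefficient Cp
p_elec = 0.90 * p_mech          # assumed generator/drivetrain efficiency
print(f"available {p_avail:.1f} W, mechanical {p_mech:.1f} W, electrical {p_elec:.1f} W")
```

The cubic dependence on wind speed is why the abstract's observation that output grows with wind speed is expected; no rotor can exceed the Betz limit of 16/27 of the available power.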

Keywords: computational analysis, power efficiency, wind energy, wind turbine

Procedia PDF Downloads 155
4922 Parameters Estimation of Power Function Distribution Based on Selective Order Statistics

Authors: Moh'd Alodat

Abstract:

In this paper, we discuss the power function distribution and derive the maximum likelihood estimator of its parameter as well as of the reliability parameter. We derive the large-sample properties of the estimators based on the selective order statistic scheme. We conduct simulation studies to investigate the significance of the selective order statistic scheme in our setup and to compare the efficiency of the newly proposed estimators.
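For the power function distribution f(x; α) = αx^(α−1) on (0, 1), the log-likelihood ℓ(α) = n ln α + (α − 1)Σ ln xᵢ has derivative n/α + Σ ln xᵢ, giving the closed-form MLE α̂ = −n / Σ ln xᵢ. A sketch of this simple-random-sample estimator (the paper's selective order statistic scheme is not reproduced here):

```python
import math
import random

def power_mle(sample):
    """Closed-form MLE for the shape parameter of the power function
    distribution f(x; a) = a * x**(a - 1) on (0, 1)."""
    return -len(sample) / sum(math.log(x) for x in sample)

random.seed(0)
a_true = 3.0
# Inverse-CDF sampling: F(x) = x**a, so X = U**(1/a) for U ~ Uniform(0, 1).
sample = [random.random() ** (1 / a_true) for _ in range(10000)]
print(f"MLE estimate: {power_mle(sample):.3f} (true value {a_true})")
```

With 10,000 draws, the estimate lands close to the true shape parameter, consistent with the estimator's large-sample properties.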

Keywords: fisher information, maximum likelihood estimator, power function distribution, ranked set sampling, selective order statistics sampling

Procedia PDF Downloads 458
4921 Frequency Reconfigurable Multiband Patch Antenna Using PIN-Diode for ITS Applications

Authors: Gaurav Upadhyay, Nand Kishore, Prashant Ranjan, V. S. Tripathi, Shivesh Tripathi

Abstract:

A frequency reconfigurable multiband antenna for intelligent transportation system (ITS) applications is proposed in this paper. A PIN diode is used for reconfigurability. The center frequencies are 1.38, 1.98, 2.89, 3.86, and 4.34 GHz in the ON state of the diode, and 1.56, 2.16, 2.88, 3.91, and 4.45 GHz in the OFF state. The maximum achieved bandwidth is 18%. The maximum gain of the proposed antenna is 2.7 dBi in the ON state and 3.95 dBi in the OFF state of the diode. The antenna was simulated, fabricated, and tested in the lab, and the measured and simulated results are in good agreement.

Keywords: ITS, multiband antenna, PIN-diode, reconfigurable

Procedia PDF Downloads 342