Showing posts with label Forecasting Forum.
Thursday, February 3, 2022
New NBER Working Paper by Nicholas Bloom, Takafumi Kawakubo, Charlotte Meng, Paul Mizen, Rebecca Riley, Tatsuro Senga & John Van Reenen.
“We link a new UK management survey covering 8,000 firms to panel data on productivity in manufacturing and services. There is a large variation in management practices, which are highly correlated with productivity, profitability and size. Uniquely, the survey collects firms’ micro forecasts of their own sales and also macro forecasts of GDP. We find that better managed firms make more accurate micro and macro forecasts, even after controlling for their size, age, industry and many other factors. We also show better managed firms appear aware that their forecasts are more accurate, with lower subjective uncertainty around central values. These stylized facts suggest that one reason for the superior performance of better managed firms is that they knowingly make more accurate forecasts, enabling them to make superior operational and strategic choices.”
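As a rough illustration of the test the abstract describes, the sketch below regresses a firm's absolute sales-forecast error on a management score while controlling for size, age and industry. It is not the authors' code: the data are synthetic and the variable names are invented for the example.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic firm-level data in which better managed firms happen to make
# smaller forecast errors; we then recover that relationship by OLS.
rng = np.random.default_rng(0)
n = 2000
firms = pd.DataFrame({
    "management_score": rng.normal(size=n),       # standardised management practice score
    "log_employment": rng.normal(4, 1, size=n),   # size control
    "firm_age": rng.integers(1, 60, size=n),      # age control
    "industry": rng.integers(0, 10, size=n),      # industry fixed effects
})
firms["abs_forecast_error"] = (
    0.20 - 0.05 * firms["management_score"] + rng.normal(0, 0.05, size=n)
).clip(lower=0)

# Regress forecast accuracy on the management score with the controls
# mentioned in the abstract (size, age, industry).
model = smf.ols(
    "abs_forecast_error ~ management_score + log_employment + firm_age + C(industry)",
    data=firms,
).fit(cov_type="HC1")                             # robust standard errors
print(model.params["management_score"])           # negative: better managed, more accurate
```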
Read more here.
Posted at 2:22 PM
Labels: Forecasting Forum
New paper by Tianyi Wang, Hong Yan, Zhuo Huang & Fang Liang in Economic Modelling.
“In this paper, we develop a new model, the Realized GARCH-RSRK, to determine the time-varying distribution of financial returns with realized higher moments. Based on Gram-Charlier expansion (GCE) density, we first explicitly link the expansion parameters with moments that are calculated based on intraday returns using our new model. Then, the Cornish-Fisher expansion is applied to forecast Value-at-Risk (VaR) with estimated moments to demonstrate the economic significance of this new model. Compared with the daily-return-based dynamic higher moments models, the inclusion of realized higher moments significantly improves this model’s ability to forecast extreme tails. The empirical results indicate that this new model outperforms the benchmark models when forecasting extreme VaR. In addition, we provide a formula to correct the moments associated with the commonly used squared transformation of GCE. Our empirical evidence highlights the importance of using corrected moments in VaR forecasting.”
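For readers unfamiliar with the Cornish-Fisher step, the minimal sketch below shows how a VaR forecast can be assembled once the mean, volatility, skewness and excess kurtosis have been forecast. It illustrates only the expansion itself, not the authors' Realized GARCH-RSRK model, and the example inputs are hypothetical.

```python
from scipy.stats import norm

def cornish_fisher_var(mu, sigma, skew, exkurt, alpha=0.01):
    """Approximate VaR of next-period returns via the Cornish-Fisher expansion.

    mu, sigma : forecast mean and volatility of the return
    skew, exkurt : forecast skewness and excess kurtosis
    alpha : tail probability (0.01 gives 99% VaR)
    """
    z = norm.ppf(alpha)                        # Gaussian quantile
    # adjust the quantile for skewness and excess kurtosis
    z_cf = (z
            + (z**2 - 1) * skew / 6
            + (z**3 - 3 * z) * exkurt / 24
            - (2 * z**3 - 5 * z) * skew**2 / 36)
    return -(mu + sigma * z_cf)                # reported as a positive loss

# hypothetical one-day-ahead moment forecasts
print(cornish_fisher_var(mu=0.0005, sigma=0.012, skew=-0.4, exkurt=2.5))
```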
Read more by clicking here.
Posted at 1:16 PM
Labels: Forecasting Forum
Sunday, January 30, 2022
New post by John Cochrane on his blog.
“Torsten Slok, chief economist at Apollo Global Management, passes along the above gorgeous graph. Fed forecasts of interest rates behave similarly. So does the “market forecast” embedded in the yield curve, which usually slopes upward.
Torsten’s conclusion:
The forecasting track record of the economics profession when it comes to 10-year interest rates is not particularly impressive, see chart [above]. Since the Philadelphia Fed started their Survey of Professional Forecasters twenty years ago, the economists and strategists participating have been systematically wrong, predicting that long rates would move higher. Their latest release has the same prediction.
Well. Like the famous broken clock that is right twice a day, note the forecasts are “right” in times of higher rates. So don’t necessarily run out and buy bonds today.
Can it possibly be true that professional forecasters are simply behaviorally dumb, refuse to learn, and the institutions that hire them refuse to hire more rational ones?”
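The bias check behind Torsten Slok's chart fits in a few lines. The sketch below is illustrative only: the file name and column names are assumptions, standing in for a table of median SPF forecasts of the 10-year yield paired with the yields actually realised at the forecast horizon.

```python
import pandas as pd

# Hypothetical file with one row per survey date and two columns:
#   forecast_10y  median SPF forecast of the 10-year yield, four quarters ahead
#   realized_10y  the 10-year yield observed four quarters later
df = pd.read_csv("spf_10y_forecasts.csv")
errors = df["forecast_10y"] - df["realized_10y"]

print("mean forecast error (pp):", round(errors.mean(), 2))
print("share of surveys predicting too-high rates:", round((errors > 0).mean(), 2))
# If that share sits well above one half for decades, the forecasts are not just
# noisy but systematically biased upward, which is the pattern described above.
```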
To read more, click here.
Posted at 10:56 AM
Labels: Forecasting Forum
Monday, January 24, 2022
New post from Livewire by Joey Mui of Merlon Capital.
“The problem with precision
Most forecasts begin with a starting point which is often anchored to current data. Forecasters tend to modestly extrapolate up or down from this level. This tendency to stick close to current conditions, or consensus views, limits a forecaster’s ability to comprehend the full range of possibilities or the impacts of more extreme circumstances.
Research by the International Monetary Fund explored the ability of economists to predict recessions between 1992 and 2014. It was a disaster. Economists consistently failed to predict a recession in GDP by a significant margin. Even as conditions deteriorated, economists stubbornly anchored their forecasts to the preceding non-recessionary period and adjusted their predictions downwards too little, too late.”
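A toy simulation, not taken from the post or the IMF study, makes the anchoring mechanism concrete: a forecaster who only partially adjusts toward the latest data point recognises a sharp downturn too little, too late.

```python
import numpy as np

# stylised GDP growth path (%) with a recession in the middle
growth = np.array([2.5, 2.4, 2.3, 2.2, -1.0, -3.0, -2.0, 0.5])

forecast = [2.5]                 # initial anchor
for g in growth[:-1]:
    # next forecast moves only 30% of the way toward the latest observation
    forecast.append(forecast[-1] + 0.3 * (g - forecast[-1]))

for actual, f in zip(growth, forecast):
    print(f"actual {actual:5.1f}   forecast {f:5.1f}")
# The forecast never gets close to the trough until the recession is already over.
```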
Click here to read more.
Posted at 7:57 AM
Labels: Forecasting Forum
Wednesday, January 19, 2022
New paper by Olivier Sprangers, Sebastian Schelter & Maarten de Rijke.
Abstract:
Probabilistic time series forecasting is crucial in many application domains, such as retail, ecommerce, finance, and biology. With the increasing availability of large volumes of data, a number of neural architectures have been proposed for this problem. In particular, Transformer-based methods achieve state-of-the-art performance on real-world benchmarks. However, these methods require a large number of parameters to be learned, which imposes high memory requirements on the computational resources for training such models. To address this problem, we introduce a novel bidirectional temporal convolutional network that requires an order of magnitude fewer parameters than a common Transformer-based approach. Our model combines two temporal convolutional networks: the first network encodes future covariates of the time series, whereas the second network encodes past observations and covariates. We jointly estimate the parameters of an output distribution via these two networks. Experiments on four real-world datasets show that our method performs on par with four state-of-the-art probabilistic forecasting methods, including a Transformer-based approach and WaveNet, on two point metrics (sMAPE and NRMSE) as well as on a set of range metrics (quantile loss percentiles) in the majority of cases. We also demonstrate that our method requires significantly fewer parameters than Transformer-based methods, which means that the model can be trained faster with significantly lower memory requirements, which as a consequence reduces the infrastructure cost for deploying these models.
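The architecture described in the abstract can be sketched in a few dozen lines of PyTorch. The code below is an illustration of that description rather than the authors' implementation: two stacks of dilated causal convolutions, one over known future covariates and one over past observations and covariates, whose outputs are mapped to the mean and scale of a Gaussian predictive distribution. The layer sizes, the Gaussian output head and the class names are assumptions.

```python
import torch
import torch.nn as nn

class DilatedConvBlock(nn.Module):
    """One causal, dilated 1-D convolution with a residual connection."""
    def __init__(self, channels, dilation):
        super().__init__()
        self.pad = 2 * dilation                             # left-pad -> causal
        self.conv = nn.Conv1d(channels, channels, kernel_size=3, dilation=dilation)
        self.act = nn.GELU()

    def forward(self, x):                                   # x: (batch, channels, time)
        h = nn.functional.pad(x, (self.pad, 0))             # pad only on the left
        return x + self.act(self.conv(h))

class TCN(nn.Module):
    """Stack of dilated convolutions over a (batch, time, features) input."""
    def __init__(self, in_dim, channels=32, n_layers=4):
        super().__init__()
        self.proj = nn.Conv1d(in_dim, channels, kernel_size=1)
        self.blocks = nn.Sequential(*[DilatedConvBlock(channels, 2 ** i)
                                      for i in range(n_layers)])

    def forward(self, x):
        return self.blocks(self.proj(x.transpose(1, 2)))    # (batch, channels, time)

class BidirectionalTCNForecaster(nn.Module):
    """One TCN encodes past observations and covariates, another encodes known
    future covariates; together they parameterise a Gaussian per forecast step."""
    def __init__(self, past_dim, future_dim, channels=32):
        super().__init__()
        self.past_tcn = TCN(past_dim, channels)
        self.future_tcn = TCN(future_dim, channels)
        self.head = nn.Linear(2 * channels, 2)              # -> (mean, log-scale)

    def forward(self, past, future):
        # past:   (batch, context_len, past_dim), future: (batch, horizon, future_dim)
        h_past = self.past_tcn(past)[:, :, -1]              # summary of the history
        h_future = self.future_tcn(future)                  # (batch, channels, horizon)
        h = torch.cat([h_past.unsqueeze(-1).expand_as(h_future), h_future], dim=1)
        out = self.head(h.transpose(1, 2))                  # (batch, horizon, 2)
        mean, log_scale = out[..., 0], out[..., 1]
        return torch.distributions.Normal(mean, log_scale.exp())

# Toy usage: 96-step history, 24-step horizon, trained by negative log-likelihood
model = BidirectionalTCNForecaster(past_dim=5, future_dim=3)
dist = model(torch.randn(8, 96, 5), torch.randn(8, 24, 3))
loss = -dist.log_prob(torch.randn(8, 24)).mean()
```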
Read more here.
Posted at 5:37 PM
Labels: Forecasting Forum