Dow Jones Industrial Average Signal Discrete or Continuous
Stock Market
For instance, stock market prices are commonly predicted by analysts using one of two methodologies: one attempts to infer the future performance of particular prices and price indices from their past patterns of behavior; the other focuses on the relationship between stock prices and variables that relate either to the economy as a whole or to the characteristics of the company to which the particular financial securities refer.
From: Encyclopedia of Social Measurement, 2005
Data Structures
Andrew F. Siegel, in Practical Business Statistics (Sixth Edition), 2012
2.4 Time-Series and Cross-Sectional Data
If the data values are recorded in a meaningful sequence, such as daily stock market prices, then you have time-series data. If the sequence in which the data are recorded is irrelevant, such as the first-quarter 2011 earnings of eight aerospace firms, you have cross-sectional data. Cross-sectional is just a fancy way of saying that no time sequence is involved; you simply have a cross-section, or snapshot, of how things are at one particular time.
Analysis of time-series data is generally more complex than cross-sectional data analysis because the ordering of the observations must be carefully taken into account. For this reason, in coming chapters we will initially concentrate on cross-sectional data. Time-series analysis will be covered in Chapter 14.
Example
The Stock Market
Figure 2.4.1 shows a chart of the Dow Jones Industrial Average stock market index, monthly closing values, starting in October 1928. This time-series data set indicates how the value of a portfolio of stocks has changed through time. Note how the stock market value has risen impressively through much of its history (at least until the fall of 2007), although not entirely smoothly. Note the occasional downward bumps (such as the crash of October 1987, the "dot-com bust" of 2000, and the more recent financial difficulties) that represent the risk you take by holding a portfolio of stocks that often (but not always) increases in value.
Figure 2.4.1. The Dow Jones Industrial Average stock market index, monthly since 1928, is a time-series data set that provides an overview of the history of the stock market.
Here are some additional examples of time-series data:
1. The price of wheat each year for the past 50 years, adjusted for inflation. These time trends might be useful for long-range planning, to the extent that the variation in future events follows the patterns of the past.
2. Retail sales, recorded monthly for the past 20 years. This data set has a structure showing generally increasing activity over time as well as a distinct seasonal pattern, with peaks around the December holiday season.
3. The thickness of paper as it emerges from a rolling machine, measured once each minute. This kind of data might be important to quality control. The time sequence is important because small variations in thickness may either "drift" steadily toward an unacceptable level, or "oscillate," becoming wider and narrower within fairly stable limits.
Following are some examples of cross-sectional data:
1. The number of hours of sleep last night, measured for 30 people being examined to test the effectiveness of a new over-the-counter medication.
2. Today's book values of a random sample of a bank's savings certificates.
3. The number of phone calls processed yesterday by each of a firm's order-taking employees.
Source: https://www.sciencedirect.com/science/article/pii/B978012385208300002X
Cash Flow Engineering and Forward Contracts
Salih N. Neftci, in Principles of Financial Engineering (Second Edition), 2008
Hedge Funds Still Bet the Currency's Peg Goes
HONG KONG–The stock market continued to rally last week in the belief the government is buying stocks to drive currency speculators out of the financial markets, though shares ended lower on Friday on profit-taking.
Despite the earlier rally, Hong Kong's economy still is worsening; the stock market hit a five-year low two weeks ago, and betting against the Hong Kong dollar is a cheap and easy wager for speculators.
The government maintains that big hedge funds that wager huge sums in global markets had been scooping up big profits by attacking both the Hong Kong dollar and the stock market.
Under this city's pegged-currency system, when speculators attack the Hong Kong dollar by selling it, that automatically boosts interest rates. Higher rates lure more investors to park their money in Hong Kong, boosting the currency. But they also slam the stock market because rising rates hurt companies' abilities to borrow and expand.
Speculators make money in a falling stock market by short-selling shares—selling borrowed shares in expectation that their price will fall and that the shares can be replaced more cheaply. The difference is the short-seller's profit.
"A lot of hedge funds which operate independently happen to believe that the Hong Kong dollar is overvalued" relative to the weak economy and to other Asian currencies, said Bill Kaye, managing director of hedge-fund outfit Pacific Group Ltd. Mr. Kaye points to Singapore where, because of the Singapore dollar's depreciation in the past year, office rents are now 30% cheaper than they are in Hong Kong, increasing the pressure on Hong Kong to let its currency fall so it can remain competitive.
Hedge funds, meanwhile, "are willing to take the risk they could lose money for some period," he said, while they bet Hong Kong will drop its 15-year-old policy of pegging the local currency at 7.80 Hong Kong dollars to the U.S. dollar.
These funds believe they can wager hundreds of millions of U.S. dollars with relatively little risk. Here's why: If a hedge fund bets the Hong Kong dollar will be toppled from its peg, it's a one-way bet, according to managers of such funds. That's because if the local dollar is dislodged from its peg, it is likely only to fall. And the only risk to hedge funds is that the peg remains, in which case they would lose only their initial cost of entering the trade to sell Hong Kong dollars in the future through forward contracts.
That cost can be low, permitting a hedge fund to eat a loss and make the same bet all over again. When a hedge fund enters a contract to sell Hong Kong dollars in, say, a year's time, it is committed to buying Hong Kong dollars to exchange for U.S. dollars in 12 months. If the currency peg holds, the cost of replacing the Hong Kong dollars it has sold is essentially the difference in 12-month interest rates between the U.S. and Hong Kong.
On Thursday, that difference in interbank interest rates was about 6.3 percentage points. So a fund manager making a US$1 million bet Thursday against the Hong Kong dollar would have paid 6.3%, or US$63,000.
Whether a fund manager wanted to make that trade depends on the odds he assigned to the likelihood of the Hong Kong dollar being knocked off its peg and how much he expected it then to depreciate.
If he believed the peg would depreciate about 30%, as a number of hedge-fund managers do, then it would have made sense to enter the trade if he thought there was a one-in-four chance of the peg going in a year. That's because the cost of making the trade—US$63,000—is less than one-fourth of the potential profit of a 30% depreciation, or US$300,000. For those who believe the peg might go, "it's a pretty good trade," said Mr. Kaye, the hedge-fund manager. He said that in recent months he hasn't shorted Hong Kong stocks or the currency. Wall Street Journal, August 24, 1998.
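The expected-value arithmetic in that last paragraph can be checked directly. A quick sketch in R, using only the figures quoted in the article:

```r
cost <- 0.063 * 1e6   # 6.3% carry on a US$1 million one-year forward position
gain <- 0.30  * 1e6   # payoff if the peg breaks and the Hong Kong dollar falls 30%
p    <- 1/4           # assumed probability that the peg goes within a year
p * gain              # expected payoff: US$75,000, which exceeds the US$63,000 cost
```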
Source: https://www.sciencedirect.com/science/article/pii/B9780123735744500068
Introduction
Yuliya Mishura, in Finance Mathematics, 2016
Financial markets are often identified with stock markets; sometimes the two are distinguished, with the view that only securities are traded on financial markets, whereas other assets, such as real estate, property, and currency, can also be traded on the stock market. The stock market is also known as the stock exchange. A stock exchange is a market for different kinds of securities, including stocks, bonds, and shares, as well as payment documents. As for randomness, prices in the financial market, more specifically, on the stock exchange, are affected by many external factors that cannot be predicted in advance and cannot be completely controlled. This is mainly a consequence of economic circumstances, for example, the state of the world and local economies, production levels in some sectors, and the balance between supply and demand. Weather and climate factors, for example, can affect a certain type of agricultural product. The activities of large exchange speculators can also have large consequences. Since stock prices at any given time are random, over time they accordingly form random processes. Of course, the same situation existed in the early days of exchanges, before the theory of random processes had been established. Recall that the Chicago Stock Exchange began operating on March 21, 1882.
Source: https://www.sciencedirect.com/science/article/pii/B9781785480461500065
Financial, Macro and Micro Econometrics Using R
Peter C.B. Phillips, Shuping Shi, in Handbook of Statistics, 2020
6.1 Example 1: The S&P 500 Market
The S&P 500 stock market has been a central focus of attention in global financial markets due to the size of this market and its impact on other financial markets. As an illustration of the methods discussed in this chapter, we conduct a pseudo real-time monitoring exercise for bubbles and crises in this market with the PSY strategy. The sample period runs from January 1973 to July 2018, with monthly data downloaded from Datastream International. The price-dividend ratios are computed as the inverse of the dividend yields. The first step is to import the data into R, using the following code:
```r
sp500 <- read.csv("sp500.csv")
date  <- as.Date(sp500[, 1], "%d/%m/%Y")
dy    <- sp500[, 2]   # dividend yields
pd    <- 1 / dy       # price-dividend ratios
```
In the presence of a speculative bubble, asset prices characteristically deviate in an explosive way from fundamentals, representing exuberance in the speculative behavior driving the market. In the present case, this deviation implies that the log price-dividend ratio is expected to follow an explosive process over the expansive phase of the bubble. But during crisis periods, the price-dividend ratio is expected to follow a random (downward) drift martingale process, in contrast to a small (local to zero) constant drift martingale process that typically applies under normal market conditions. According to the theory detailed in Sections 3 and 4, we expect to witness rejection of the null hypothesis in the PSY test empirical outcomes during both bubble and crisis periods.
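To fix ideas, here is a small simulation sketch (not from the chapter; the parameter values are assumptions) contrasting the explosive and martingale regimes the PSY test is designed to distinguish:

```r
# A mildly explosive AR(1) (bubble regime) versus a driftless random walk
set.seed(7)
n <- 200
bubble <- numeric(n)
normal <- numeric(n)
for (t in 2:n) {
  bubble[t] <- 1.02 * bubble[t - 1] + rnorm(1)   # autoregressive root > 1: explosive
  normal[t] <- normal[t - 1] + rnorm(1)          # unit root: martingale
}
matplot(cbind(bubble, normal), type = "l", lty = 1, ylab = "level")
```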
Fig. 2 plots the price-to-dividend ratio of the S&P 500 index. We observe a dramatic increase in the data series in the late 1990s, followed by a rapid fall in the early 2000s. The market experienced another episode of slump in late 2008. With a training period of 47 observations, we start the pseudo real-time monitoring exercise from November 1976 onwards. The PSY test statistics are compared with the 95% bootstrapped critical value. The empirical size is controlled over a 2-year period, i.e., by taking τ_b = 24. The lag order is selected by BIC with a maximum lag order of 6, applied to each subsample. The PSY statistic sequence and the corresponding bootstrap critical values can be calculated as follows in R.
Fig. 2. Bubble and crisis periods in the S&P 500 stock market. The solid line is the price-to-dividend ratio and the shaded areas are the periods where the PSY statistic exceeds its 95% bootstrapped critical value.
```r
library(psymonitor)  # provides PSY(), cvPSYwmboot(), locate(), and disp()

y        <- pd
obs      <- length(y)
r0       <- 0.01 + 1.8 / sqrt(obs)   # minimum window fraction
swindow0 <- floor(r0 * obs)          # minimum window size (47 observations)
dim      <- obs - swindow0 + 1
IC       <- 2                        # BIC lag selection
adflag   <- 6                        # maximum lag order
yr       <- 2
Tb       <- 12 * yr + swindow0 - 1   # 2-year window for empirical size control
nboot    <- 999
nCore    <- 2
bsadf          <- PSY(y, swindow0, IC, adflag)
quantilesBsadf <- cvPSYwmboot(y, swindow0, IC, adflag, Tb, nboot, nCore)
```
The identified origination and termination dates can be calculated and viewed with the following commands.
```r
date       <- date[swindow0:obs]
quantile95 <- quantilesBsadf %*% matrix(1, nrow = 1, ncol = dim)
ind95      <- (bsadf > t(quantile95[2, ])) * 1   # 1 when the statistic exceeds the 95% critical value
OT         <- locate(ind95, date)                # origination/termination observations
BCdates    <- disp(OT, obs)                      # convert to date stamps
print(BCdates)
```
where the last two commands display the dates on the screen, with the first (second) column giving the origination (termination) date. The output is:
| | Start | End |
|---|---|---|
| 1 | 1986-05-30 | 1986-06-30 |
| 2 | 1987-07-31 | 1987-08-31 |
| 3 | 1996-01-31 | 1996-01-31 |
| 4 | 1996-05-31 | 1996-05-31 |
| 5 | 1996-11-29 | 1997-02-28 |
| 6 | 1997-04-30 | 1998-07-31 |
| 7 | 1998-09-30 | 2000-10-31 |
| 8 | 2000-12-29 | 2001-01-31 |
| 9 | 2008-10-31 | 2009-02-27 |
The identified periods are shaded in green in Fig. 2. As is evident in the figure, the procedure detects two bubble episodes and one crisis episode. The first bubble episode lasts for only 3 months (1986M05–M06 and 1987M08) and occurred before the Black Monday crash of October 1987. The second bubble episode is the well-known dot-com bubble, starting in January 1996 and terminating in October 2000 (with several breaks in between). For the dot-com episode, the identified starting date for market exuberance occurs well before the December 1996 speech of the former Federal Reserve chairman Alan Greenspan, in which the now famous question "how do we know when irrational exuberance has unduly escalated asset values" was posed to the audience and the financial world. The identified subprime mortgage crisis starts in October 2008, one month after the collapse of Lehman Brothers, and terminates in February 2009.
The code for generating the plot and the shaded overlays in the figure is as follows:
```r
plot(date, y[swindow0:obs], xlim = c(min(date), max(date)), ylim = c(0.1, 1),
     xlab = "", ylab = "", type = 'l', lwd = 3)
for (i in 1:length(date)) {
  if (ind95[i] == 1) { abline(v = date[i], col = 3) }  # green shading for detected periods
}
points(date, y[swindow0:obs], type = 'l')  # redraw the series on top of the shading
box(lty = 1)
dev.off()
```
Source: https://www.sciencedirect.com/science/article/pii/S0169716118301068
Inference for two quantitative variables
Stephen C. Loftus, in Basic Statistics with R, 2022
18.6 Practice problems
Say you are interested in the relationship between stock market trading volume (Volume) and percentage return (Today). These data can be found in the Weekly dataset in the ISLR package in R [78,79]. (A minimal R sketch of the workflow appears after problems 4 and 5.)
4. Run the test for correlations with the alternative hypothesis at the level.
5. Find and interpret the 90% confidence interval for ρ.
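For problems 4 and 5, here is a minimal sketch of how such a test might be run in R; the hypotheses and significance level elided above are left to the exercise, and cor.test reports both the test and the confidence interval:

```r
# Minimal sketch, assuming the ISLR package is installed (Weekly ships with it)
library(ISLR)
# Pearson correlation test of Volume vs. Today, with a 90% CI for rho
cor.test(Weekly$Volume, Weekly$Today, conf.level = 0.90)
```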
Say a researcher wants to investigate the relationship between academic staff pay at universities (acpay) and the number of academic grants the university receives (resgr). Data to investigate this question can be found in the University dataset in the Ecdat package in R [12,80].
6. Run the test for correlations with hypotheses versus at the level.
7. Find and interpret the 99% confidence interval for ρ.
The Caschool dataset in the Ecdat library looks at school district performance on a standardized test based on various factors [12,107]. Say a researcher wants to test the correlation between test score (testscr) and expenditure per student (expnstu).
8. Run the test for correlations with the hypotheses versus at the level.
9. Find and interpret the 91% confidence interval for ρ.
Source: https://www.sciencedirect.com/science/article/pii/B9780128207888000316
Common Frailty versus Contagion in Linear Dynamic Models
Serge Darolles, Christian Gourieroux, in Contagion Phenomena with Applications in Finance, 2015
3.5.1 Joint analysis of market returns
From the 1980s until now, international stock markets have been increasingly influenced by globalization. Even if most studies argue that the United States market has a leading character, these analyses strongly depend on the period of observation, on the selected market indices, and on the type of linear dynamic model used for the analysis. To analyze the transmission between markets, the studies generally consider a model with contagion, but without common factors, and apply Granger causality measures to determine the type of contagion between markets (see section 3.7.1). For instance, Malliaris and Urrutia [MAL 92] applied this methodology to six market indices. They found no lead-lag relationship before and after the 1987 crisis, but feedback relationships during the crisis, whereas Arshanapalli and Doukas [ARS 93] claim that, since this crash, many market indices are driven by the United States. Using the same type of methodology on a more recent period, Balios and Xanthakis [BAL 03] also observed that the Dow Jones Industrial Average (DJIA) index is the leading index for the main European (and Japanese) indices, but, while the contagion is direct with the British index, the Footsie, it can pass through the Footsie for other European indices such as the French CAC 40.
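For concreteness, here is a hedged sketch of the kind of bivariate Granger-causality check these studies apply, run on illustrative simulated data rather than the cited studies' own indices:

```r
# Does one return series Granger-cause another? (lmtest::grangertest)
library(lmtest)
set.seed(3)
us <- rnorm(500)                             # hypothetical US index returns
eu <- 0.4 * c(0, head(us, -1)) + rnorm(500)  # hypothetical EU returns loading on lagged US
grangertest(eu ~ us, order = 1)              # lagged US returns should help predict EU returns
```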
However, such analyses can be misleading since they omit the possible common factors. They also disregard the stochastic volatility and covolatility effects, as well as the transmission of risks by means of risk premia.
Source: https://www.sciencedirect.com/science/article/pii/B9781785480355500033
Correlation and Regression
Andrew F. Siegel, in Practical Business Statistics (Sixth Edition), 2012
No Relationship
A bivariate data set shows no relationship if the scatterplot is just random, with no tilt (i.e., slanting neither upward nor downward as you move from left to right). The case of no relationship is a special linear relationship that is neither increasing nor decreasing. Such a scatterplot may look like a cloud that is either circular or oval-shaped (the oval points either up and down or left to right; it is not tilted). In fact, by changing the scale for one or the other of your variables, you can make a data set with no relationship have either a circular or an oval-shaped scatterplot.
Example
Short-Term "Momentum" and the Stock Market
Does the stock market have "momentum"? That is, is the market likely to keep going up this week because it went up last week? If there is a relationship between current market performance and the recent past, you would expect to find it in a scatterplot. After all, this is our best statistical tool for seeing the relationship, if any, between market behavior last week (one variable) and market behavior this week (the other variable).
The bivariate data set consists of weekly rates of return for the S&P 500 Stock Market Index, that is, the percent changes (increases or decreases) from one week to the next. 4 Although this seems to be a univariate time series, you can put essentially the same data in two columns, offsetting the columns by one row so that each value for this week's close (on the left in the table) can be found in the next row (one week later) as last week's close (on the right). This is shown in Table 11.1.7, and a minimal R sketch of the construction follows the table.
Table 11.1.7. Weekly Percent Change in the S&P 500 Stock Market Index
Date | This Week | Last Week |
---|---|---|
1/4/2010 | 2.68% | −1.01% |
1/11/2010 | −0.78 | 2.68 |
1/19/2010 | −3.90 | −0.78 |
1/25/2010 | −1.64 | −3.90 |
2/1/2010 | −0.72 | −1.64 |
2/8/2010 | 0.87 | −0.72 |
2/16/2010 | 3.13 | 0.87 |
2/22/2010 | −0.42 | 3.13 |
3/1/2010 | 3.10 | −0.42 |
3/8/2010 | 0.99 | 3.10 |
3/15/2010 | 0.86 | 0.99 |
3/22/2010 | 0.58 | 0.86 |
3/29/2010 | 0.99 | 0.58 |
4/5/2010 | 1.38 | 0.99 |
4/12/2010 | −0.19 | 1.38 |
4/19/2010 | 2.11 | −0.19 |
4/26/2010 | −2.51 | 2.11 |
5/3/2010 | −6.39 | −2.51 |
5/10/2010 | 2.23 | −6.39 |
5/17/2010 | −4.23 | 2.23 |
5/24/2010 | 0.16 | −4.23 |
6/1/2010 | −2.25 | 0.16 |
6/7/2010 | 2.51 | −2.25 |
6/14/2010 | 2.37 | 2.51 |
6/21/2010 | −3.65 | 2.37 |
6/28/2010 | −4.59 | −3.65 |
Source: Calculated from adjusted closing prices accessed at http://finance.yahoo.com on July 16, 2010.
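A minimal R sketch of this offsetting construction, using just the first few percent changes from the table as a short vector ret:

```r
# First few weekly percent changes from Table 11.1.7 (for illustration only)
ret <- c(-1.01, 2.68, -0.78, -3.90, -1.64, -0.72, 0.87, 3.13)

this_week <- ret[-1]            # drop the first value
last_week <- ret[-length(ret)]  # drop the last value: pairs are offset by one week

plot(last_week, this_week)      # scatterplot analogous to Figure 11.1.9
cor(this_week, last_week)       # the lag-1 correlation
```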
The scatterplot, in Figure 11.1.9, shows no relationship! There is a lot of random scatter but no trend either upward (which would have suggested momentum) or downward (which would have suggested that the market "overreacted" one week and then corrected itself the next) as you move from left to right in the picture. The correlation, r = 0.026, is close to 0, confirming the lack of a strong relationship. 5
Figure 11.1.9. There is essentially no relationship discernible between this week's and last week's stock market performance. The correlation, r = 0.026, is close to 0, summarizing the lack of a relationship. If last week was a "good" week, then this week's performance will look about the same as if last week had been a "bad" week.
A scatterplot such as this one is consistent with the ideas of market efficiency and random walks. Market efficiency says that all information that is available or can be anticipated is immediately reflected in market prices. Since traders anticipate future changes in market prices, there can be no systematic relationships, and only randomness (i.e., a random walk) can remain. A random walk generates a time series of data with no relationship between previous behavior and the next step or change. 6
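A quick simulation sketch (not from the chapter) makes the same point: the successive changes of a random walk show a patternless, untilted scatter:

```r
# Simulate a random walk: the changes are independent by construction
set.seed(1)
changes <- rnorm(500)        # independent weekly changes
walk    <- cumsum(changes)   # the random-walk level itself
plot(changes[-length(changes)], changes[-1],
     xlab = "Last week's change", ylab = "This week's change")
cor(changes[-length(changes)], changes[-1])   # close to 0
```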
By changing the scale of the horizontal or vertical axis, you can make the cloud of points look more like a line. However, because the line is either horizontal or vertical, with no tilt, it still indicates no relationship. These cases are shown in Figures 11.1.10 and 11.1.11.
Figure 11.1.10. There is no relationship here, even though the scatterplot looks like a distinct line, because the line is horizontal, with no tilt. This is the same data set as in Figure 11.1.9, but with the Y scale expanded to flatten the appearance of the plot.
Figure 11.1.11. No relationship is apparent here either, although the scatterplot looks like a distinct line because the line is vertical, with no tilt. In this case the X axis has been expanded from Figure 11.1.9.
Read full chapter
URL:
https://www.sciencedirect.com/science/article/pii/B9780123852083000110
Probability
Gary Smith, in Essential Statistics, Regression, and Econometrics, 2012
4.21 The random walk model of stock prices states that stock market returns are independent of the returns in other periods; for example, whether the stock market does well or poorly in the coming month does not depend on whether it has done well or poorly during the past month, the past 12 months, or the past 120 months. On average, the monthly return on U.S. stocks has been positive about 60 percent of the time and negative about 40 percent of the time. If monthly stock returns are independent with a 0.6 probability of a positive return and a 0.4 probability of a negative return, what is the probability of
a. 12 consecutive positive returns?
b. 12 consecutive negative returns?
c. A positive return, if the return the preceding month was negative?
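A quick numerical check of the independence arithmetic (a sketch, not the book's solution):

```r
p_pos <- 0.6
p_neg <- 0.4
p_pos^12   # (a) about 0.0022: twelve consecutive positive months
p_neg^12   # (b) about 0.0000168: twelve consecutive negative months
p_pos      # (c) under independence, still 0.6 regardless of last month
```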
Source: https://www.sciencedirect.com/science/article/pii/B9780123822215000040
Measuring dependence and testing for independence
Dag Tjøstheim, ... Bård Støve, in Statistical Modeling Using Local Gaussian Approximation, 2022
7.4.6 Example: Financial returns
In this example, we analyze log-returns of stock market indices for the UK (FTSE 100) and the US (S&P 500) in the period starting from January 1, 2005, to November 15, 2016. Since the financial crisis emerged on August 9, 2007 (see Støve et al., 2014), the data set is divided into three parts: the period from January 1, 2005 to August 8, 2007 is called "Before Crisis", from August 9, 2007 to August 7, 2009 "During Crisis", and from August 8, 2009 to November 15, 2016 "After Crisis". Thus this example relates to the test for financial contagion introduced in Section 6.5.2. However, with the methods developed in this chapter, we can extend this procedure by formally examining whether there are any lead-lag effects using the local Gaussian cross-correlation between different markets.
The dependence between the two series of returns, FTSE100 and S&P 500, is very strong, and, not surprisingly, tests of independence based on the local correlation lead to clear rejection of independence for the pair of indices with . It is more interesting to look at the shape of the local correlation curves with accompanying confidence intervals.
Figs. 7.4, 7.5, and 7.6 show, respectively, plots of the local Gaussian correlation along the diagonal for the FTSE 100 and the S&P 500 indices when the data set is not transformed, when it is transformed into standard normal variables using the empirical distribution function, and when the component series are GARCH( )-filtered but not normalized. This is done for the data sets before and during the financial crisis. The GARCH filter is used because it should remove marginal auto-dependence over time and volatility effects, as done in Chapter 6. In this way, it is easier to analyze causality and also to see whether, and to what degree, one of the two time series is leading the other.
Figure 7.4. Local Gaussian correlation for FTSE versus S&P with no transformation.
Figure 7.5. Local Gaussian correlation for FTSE versus S&P with marginals transformed into standard normal variables.
Figure 7.6. Local Gaussian correlation for FTSE versus S&P GARCH-filtered.
Note that using GARCH-filtered data with the ordinary bootstrap when computing, say, confidence intervals implicitly assumes that individual GARCH filtering of each component time series results in serially independent and identically distributed observations for the bivariate series. However, we see that this does not really hold here (see, e.g., Fig. 7.6c). This means that the GARCH-filtered data should also be examined using the stationary or the moving-block bootstrap; the stationary bootstrap was used in this example to obtain 95% pointwise confidence intervals. Hence, in the test for financial contagion introduced in Section 6.5.2, before using the ordinary bootstrap proposed in that test, we need to confirm that the GARCH-filtered data are indeed iid.
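A hedged sketch of this filter-then-resample workflow; the GARCH(1,1) specification and the fGarch and tseries packages are assumptions for illustration, not the authors' own code:

```r
# Sketch: GARCH(1,1) filtering with Student-t innovations, then a
# stationary bootstrap of the standardized residuals
library(fGarch)
library(tseries)
set.seed(42)
ret <- rnorm(1000)   # placeholder for a log-return series
fit <- garchFit(~ garch(1, 1), data = ret, cond.dist = "std", trace = FALSE)
z   <- residuals(fit, standardize = TRUE)   # standardized GARCH residuals
# If z is not plausibly iid, resample with the stationary bootstrap
zb  <- tsbootstrap(z, nb = 999, type = "stationary")
```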
Looking closer at the results for the FTSE and S&P time series, Figs. 7.4a and 7.6a demonstrate that both before and during the crisis, the local correlation increases both when the market is going down (negative log returns x) and up (positive log returns x). The increase is close to monotonic and slightly stronger for a falling market. Figs. 7.5a and 7.6a show a clear increase of the local correlation during the crisis, again mostly in the falling-market case.
In Figs. 7.4b & 7.6b and 7.4c & 7.6c, we try to investigate whether there are any (nonlinear) lead-lag effects for the series. In ordinary linear time series analysis, it is well known that lead-lag effects can be masked by autocorrelation, which is usually removed by prewhitening the series. With GARCH( ) filtering with a Student-t distribution on the residuals, the individual GARCH residuals pass a test of iid residuals. This means that the local cross-correlation plots of Figs. 7.6b and 7.6c should be best suited to conduct a lead-lag investigation. From these figures we see that the series is influenced by the series, not only on the same day, but also from the day before. There seems to be no such clear effect going on in the other direction, which is perhaps slightly surprising when the time difference between the FTSE and S&P markets is taken into account. In contrast to the zero-lag results, there is now a very clear difference between a market upturn and downturn, the lag-1 effect being much stronger in the downturn case (Fig. 7.6c). Contrasting the zero-lag effects, the before-crisis lag-1 effect is consistently higher than the lag-1 effect during the crisis. It appears that the crisis has led to a more concentrated zero-lag effect.
We also computed the local correlation for higher lags (2, 3, 4, 5, 10, 15). No clear effects in either direction were found. For similar studies between other market indices, we refer to Lacal and Tjøstheim (2019).
As already mentioned, Figs. 7.5a and 7.6a show a clear increase of the local correlation during the crisis, which is in line with the findings of Section 6.5 and Støve et al. (2014). Indeed, the plots in Figs. 7.4–7.6 give a nonlinear, distribution-based measure of dependence between the indices, with much more detailed information than the ordinary correlation.
In addition, Lacal and Tjøstheim (2019) also looked at the period from August 7, 2009 to November 15, 2016 for the indices when the data are transformed into standard normal variables. The corresponding plot is given in Fig. 7.7 (note that the scale is different from that of Fig. 7.5a). As can be noticed, the plot in Fig. 7.7 shows the same pattern as the plot in Fig. 7.5a; that is, also for the period 2009–2016, there is a stronger local correlation for downturns and upturns than for a stable market. Moreover, the curves are essentially at the same level as during the financial crisis, demonstrating in this respect a lasting effect of the crisis. Note that this is consistent with a study by Aastveit et al. (2017) of other economic variables using a linear VAR model, where they identified a structural break in 2008 in a data set extending to 2015.
Figure 7.7. Local Gaussian correlation for FTSE versus S&P with marginals transformed into standard normal variables for the period 2009–2016.
Source: https://www.sciencedirect.com/science/article/pii/B9780128158616000146
Volume 3
D.A. Donald, ... D. Coomans, in Comprehensive Chemometrics, 2009
3.23.1 Introduction
Signals are all around us! Electromagnetic broadcasts permeating the air, stock market ticker-tape on the floor, Voyager space probe emissions streaming into the universe, and electrical pulses through human nerves – all are different types of signals. Signals are broadly defined as a sequence of measurements that can either be continuous or discrete. Often, sequences are captured at varying intervals in time, frequency, wave numbers, or distance. (While introducing ideas about wavelets and how they can be applied in chemical settings, for simplicity and generality let us consider a signal to be a function of time.) A signal of particular interest is the absorption or reflection of electromagnetic radiation measured in intervals of wavelength, as this signal is characteristic of the chemical and physical composition of the substance being measured.
Another property of signals is that if two or more signals overlap, the resulting overlap is also a signal. This signal addition is usually an undesirable property, as it is often impossible to measure the individual signals independently of each other. A classic example is when a sample contains two molecules having similar absorbance signals – absorption signals with overlapping wavelength regions. When the absorption signal of the sample is measured, a combination of the two absorption signals is recorded simultaneously. The addition of signals poses a problem when information regarding only one of the two molecules is sought.
Sequential measurement and signal addition are essential principles for the analysis of signals. First, the sequential principle gives a signal a quintessential 'shape' and, second, signal addition distorts a sought-after signal/shape through the presence of other signals. Ideally, any method for the analysis of signals should aim at extracting the components of a signal, for example, by expressing the signal as a series of shapes. Wavelet analysis incorporates the principles of both signal addition and sequential shape representation, making wavelets a suitable method for signal analysis.
Wavelet analysis represents signals by a series of orthogonal basis functions. In Fourier analysis, the basis function is the sinusoidal function, whereas in wavelet analysis, the basis function is largely undefined except that it must be localized and orthogonal to itself under translation or dilation. This flexibility of wavelet basis functions enables a wide range of signal shapes to be investigated within the context of signal addition.
There exists a large suite of functions that fit the wavelet description, such as the derivatives of the Gaussian, and because of the popularity of wavelets in the 1990s and 2000s, there is a large variety of standard wavelet bases to choose from. Some of these basis functions have been given names, such as the Daubechies wavelets, Coiflets, the Haar wavelet, and the Mexican hat wavelet. The choice of the wavelet basis is an important issue because the basis function is typically meant to mimic localized information embedded in the signal. Chemometricians can investigate the performance of some of the 'famous' wavelet basis functions, or they can design their own. The latter allows wavelet basis functions to be designed to suit both the data set and the subsequent analysis method.
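As a concrete illustration of the simplest named basis, here is a minimal sketch of one level of the Haar transform in R; it is illustrative only, and packages such as wavelets or waveslim provide full implementations:

```r
# One level of the discrete Haar wavelet transform (signal length must be even)
haar_step <- function(x) {
  odd  <- x[seq(1, length(x), by = 2)]
  even <- x[seq(2, length(x), by = 2)]
  list(approx = (odd + even) / sqrt(2),   # smooth (scaling) coefficients
       detail = (odd - even) / sqrt(2))   # localized difference coefficients
}
haar_step(c(4, 6, 10, 12, 8, 6, 5, 5))
```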
The objective of this chapter is to demonstrate how wavelet basis functions can be computed for a range of multivariate statistical tasks such as unsupervised mapping (Section 3.23.3.1), discriminant analysis (Section 3.23.3.2), regression analysis (Section 3.23.3.3), and multivariate analysis of variance (MANOVA) (Section 3.23.3.4). Following a description of the wavelet theory in Section 3.23.2, the extension to adaptive wavelets is described in Section 3.23.2.7. Section 3.23.3 surveys the statistical methods that utilize the adaptive wavelet coefficients, and Section 3.23.4 puts all the above to practice by providing worked examples of the adaptive wavelet feature transformation procedure using near-infrared (NIR) spectra.
Source: https://www.sciencedirect.com/science/article/pii/B9780444527011000338
Source: https://www.sciencedirect.com/topics/mathematics/stock-market